Does non-zero correlation imply dependence?



We know that zero correlation does not imply independence. I am interested in whether non-zero correlation implies dependence: that is, for some random variables $X$ and $Y$, if $\operatorname{Corr}(X,Y) \neq 0$, does it follow in general that $f_{X,Y}(x,y) \neq f_X(x)\,f_Y(y)$?
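A small simulation sketch (my addition, not part of the question) of the two directions being contrasted here; the variable names, distributions and sample size are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Known direction: zero correlation does NOT imply independence.
# Y = X^2 is a deterministic function of X, yet Corr(X, Y) = 0.
x = rng.standard_normal(n)
y = x**2
print(np.corrcoef(x, y)[0, 1])   # near 0 despite obvious dependence

# Direction asked about: Z is built from X (so the pair is dependent) and the
# correlation is visibly non-zero; the question is whether a non-zero
# correlation always forces such dependence.
z = x + 0.5 * rng.standard_normal(n)
print(np.corrcoef(x, z)[0, 1])   # clearly non-zero
```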

Answers:



Yes, because

$$\operatorname{Corr}(X,Y)\neq 0 \implies \operatorname{Cov}(X,Y)\neq 0$$

$$\implies E(XY)-E(X)E(Y)\neq 0$$

$$\implies \int\!\!\int xy\,f_{X,Y}(x,y)\,dx\,dy-\int x f_X(x)\,dx\int y f_Y(y)\,dy\neq 0$$

$$\implies \int\!\!\int xy\,f_{X,Y}(x,y)\,dx\,dy-\int\!\!\int xy\,f_X(x)f_Y(y)\,dx\,dy\neq 0$$

$$\implies \int\!\!\int xy\,\big[f_{X,Y}(x,y)-f_X(x)f_Y(y)\big]\,dx\,dy\neq 0$$

This would not be possible if $f_{X,Y}(x,y)-f_X(x)f_Y(y)=0$ for all $\{x,y\}$. So

$$\operatorname{Corr}(X,Y)\neq 0 \implies \exists\,\{x,y\}: f_{X,Y}(x,y)\neq f_X(x)f_Y(y)$$
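As a quick numerical sanity check (my addition, not part of the original answer), the last integral can be evaluated on a grid for a bivariate normal with unit variances and correlation $\rho = 0.6$, where it should equal $\operatorname{Cov}(X,Y)=\rho$. A minimal sketch using NumPy/SciPy:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.6                                   # Cov(X, Y) = rho for unit variances
grid = np.linspace(-6, 6, 601)
xx, yy = np.meshgrid(grid, grid)

# Joint density f_{X,Y} versus the product of the marginals f_X * f_Y.
joint = multivariate_normal([0, 0], [[1, rho], [rho, 1]]).pdf(np.dstack([xx, yy]))
product = norm.pdf(xx) * norm.pdf(yy)

# Grid approximation of the double integral of xy * (f_{X,Y} - f_X f_Y).
step = grid[1] - grid[0]
integral = np.sum(xx * yy * (joint - product)) * step**2
print(integral)                             # ~ 0.6, i.e. Cov(X, Y)
```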

Question: what happens with random variables that do not have densities?


Alecos, I have a silly question. What does the fancy arrow mean, e.g. in the first line? I imagine it is something like "implies", but I'm not sure.
Sycorax, Reinstate Monica

@user777 You mean $\implies$? Indeed, it means "implies".
Alecos Papadopoulos

A reason to use the implication arrow only in an informal argument: is the implication arrow left- or right-associative?
kasterma

\implies, which produces $\implies$, looks better than \rightarrow, which produces $\rightarrow$.
Dilip Sarwate


Let $X$ and $Y$ denote random variables such that $E[X^2]$ and $E[Y^2]$ are finite. Then $E[XY]$, $E[X]$, and $E[Y]$ are all finite.

Restricting our attention to such random variables, let $A$ denote the statement that $X$ and $Y$ are independent random variables and $B$ the statement that $X$ and $Y$ are uncorrelated random variables, that is, $E[XY]=E[X]E[Y]$. Then we know that $A$ implies $B$, that is, independent random variables are uncorrelated random variables. (Indeed, one definition of independent random variables is that $E[g(X)h(Y)]$ equals $E[g(X)]E[h(Y)]$ for all measurable functions $g(\cdot)$ and $h(\cdot)$.) This is usually expressed as

$$A \implies B.$$
But $A \implies B$ is logically equivalent to $\lnot B \implies \lnot A$, that is,

correlated random variables are dependent random variables.
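A Monte Carlo sketch (my addition, not part of this answer) of the definition quoted above: for an independent pair the product rule $E[g(X)h(Y)]=E[g(X)]E[h(Y)]$ holds, illustrated here for two arbitrarily chosen test functions, while a dependent pair can violate it. The functions $g$, $h$ and the sample size are illustrative choices only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

def product_rule_gap(x, y, g, h):
    """Estimate E[g(X)h(Y)] - E[g(X)] * E[h(Y)] by Monte Carlo."""
    return np.mean(g(x) * h(y)) - np.mean(g(x)) * np.mean(h(y))

g = lambda t: t**2          # arbitrary test functions for illustration
h = lambda t: np.abs(t)

# Independent pair: the gap is ~0 (and would be for any other g, h as well).
x, y = rng.standard_normal(n), rng.standard_normal(n)
print(product_rule_gap(x, y, g, h))

# Dependent (and correlated) pair: the product rule visibly fails.
x = rng.standard_normal(n)
y = x + 0.3 * rng.standard_normal(n)
print(product_rule_gap(x, y, g, h))
```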

If $E[XY]$, $E[X]$, or $E[Y]$ is not finite or does not exist, then it is not possible to say whether $X$ and $Y$ are uncorrelated or not in the classical sense of uncorrelated random variables being those for which $E[XY]=E[X]E[Y]$. For example, $X$ and $Y$ could be independent Cauchy random variables (for which the mean does not exist). Are they uncorrelated random variables in the classical sense?
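To illustrate the caveat (a sketch of mine, not part of the answer): the sample mean of Cauchy draws never stabilizes, so none of the expectations entering the classical definition exist to be compared.

```python
import numpy as np

rng = np.random.default_rng(2)

# The sample mean of n i.i.d. standard Cauchy draws is itself standard Cauchy,
# so it does not settle down as n grows: E[X] does not exist, and the classical
# criterion E[XY] = E[X]E[Y] has nothing to be checked against.
x = rng.standard_cauchy(1_000_000)
for n in (100, 10_000, 1_000_000):
    print(n, x[:n].mean())
```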


The nice thing about this answer is that it applies whether or not the random variables in question admit a density function, unlike the other answers in this thread. This is because expectations can be defined via Stieltjes integrals with respect to the CDF, with no mention of a density.
ahfoss


Here is a purely logical proof. If $A \implies B$, then necessarily $\lnot B \implies \lnot A$, as the two are equivalent. Thus if $\lnot B$, then $\lnot A$. Now replace $A$ with independence and $B$ with zero correlation.

Think about the statement "if a volcano erupts, there will be damage". Now think about a case where there is no damage. Clearly the volcano did not erupt, or we would have a contradiction.

Similarly, think about the statement "if $X, Y$ are independent, then $X, Y$ are uncorrelated". Now consider the case where $X, Y$ are correlated. Clearly they cannot be independent, for if they were, they would also be uncorrelated. Thus conclude dependence.


If you read my answer carefully, you will see that I too used the argument you have made in your answer, namely that $A \implies B$ is the same as $\lnot B \implies \lnot A$.
Dilip Sarwate

@DilipSarwate Edited to reflect that.
Tony
Licensed under cc by-sa 3.0 with attribution required.