The variance of the sum of two random variables, X+Y, is covered at the beginning of every statistics book. The ratio X/Y of two independent standard normals is a standard Cauchy variable. So when I tried to find the variance of the product X*Y, I figured it would be no problem. But it is actually very difficult to find on the web, and tedious to derive. As a public service, here is the result when X and Y are jointly normally distributed:
Let V(X) and V(Y) be the variances of X and Y respectively,
and let C(X,Y) be the covariance of X and Y.
Then the variance of the product XY is

V(XY) = E(X)^2*V(Y) + E(Y)^2*V(X) + 2*E(X)*E(Y)*C(X,Y) + V(X)*V(Y) + C(X,Y)^2
note: see the derivation from an anonymous commenter below (I can't get math into HTML on my blog, so God bless him)
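A quick way to sanity-check the formula is a short numpy simulation; the sketch below draws a correlated bivariate normal pair and compares the sample variance of XY with the expression above (the particular means, variances, covariance, and sample size are arbitrary choices for the check, nothing from the post).

    import numpy as np

    # Monte Carlo check of the product-variance formula for a bivariate normal pair.
    rng = np.random.default_rng(0)

    mx, my = 1.5, -0.7            # E(X), E(Y): arbitrary values for the check
    vx, vy, cxy = 2.0, 0.5, 0.4   # V(X), V(Y), C(X,Y): arbitrary but valid (cxy^2 < vx*vy)

    samples = rng.multivariate_normal([mx, my], [[vx, cxy], [cxy, vy]], size=1_000_000)
    x, y = samples[:, 0], samples[:, 1]

    formula = mx**2 * vy + my**2 * vx + 2 * mx * my * cxy + vx * vy + cxy**2
    print("simulated V(XY):", (x * y).var())
    print("formula   V(XY):", formula)

With a million draws the two printed values should agree to about two decimal places.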
9 comments:
The result does not depend on X and Y being normally distributed. You just need finite mean and variance.
Well, there's a third moment, E(delta(x)^2*delta(y)), that's zero for normal, but not for all other distributions.
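To spell that out, write delta(X) = X - E(X) and delta(Y) = Y - E(Y). Assuming only that the needed moments exist, the fully general decomposition is

    V(XY) = E(X)^2*V(Y) + E(Y)^2*V(X) + 2*E(X)*E(Y)*C(X,Y)
            + 2*E(X)*E[delta(X)*delta(Y)^2] + 2*E(Y)*E[delta(X)^2*delta(Y)]
            + E[delta(X)^2*delta(Y)^2] - C(X,Y)^2

For a bivariate normal pair the two third-order terms vanish and E[delta(X)^2*delta(Y)^2] = V(X)*V(Y) + 2*C(X,Y)^2, which recovers the formula in the post.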
Is this calculation E([{X-E(X)}{Y-E(Y)}]^2)?
What's this info useful for?
Why, to make gobs of money via the idea that XXXXXCENSOREDBYSPONSOR XXXXXXXXX, so if you know V(XY) you can maximize the Sharpe ratio.
Another cool result is that if X and Y are independent standard normal random variables and Z=XY, then the probability density of Z is (1/Pi)*K_0(|z|), where K_0 is the modified Bessel function of the second kind of order zero.
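If you want to convince yourself of that numerically, a small sketch along these lines works (scipy.special.k0 is K_0; the sample size, window width, and evaluation points are arbitrary choices): compare the empirical density of Z = XY near a few points with (1/Pi)*K_0(|z|).

    import numpy as np
    from scipy.special import k0   # modified Bessel function of the second kind, order 0

    rng = np.random.default_rng(1)
    z = rng.standard_normal(1_000_000) * rng.standard_normal(1_000_000)  # Z = XY

    # Compare the empirical density of Z near a few points with (1/pi) * K_0(|z|).
    for z0 in (0.25, 0.5, 1.0, 2.0):
        h = 0.01  # half-width of the window used to estimate the density
        empirical = np.mean(np.abs(z - z0) < h) / (2 * h)
        print(f"z={z0}: empirical {empirical:.3f}, (1/pi)*K_0(|z|) {k0(abs(z0)) / np.pi:.3f}")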
Thank you!!!!
I've been looking everywhere for this!
I couldn't find it anywhere in any of my texts or on the internet.
Here's the proof.
Hi,
I found this article very interesting, but when reading the proof I had a problem with the Cov(X^2,Y^2) step.
Would you mind explaining how you got that result? (I understood that there is a delta method somewhere, but I've never heard of this method before.)
Thanks in advance
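For anyone stuck on the same step: under the bivariate-normal assumption no delta method is needed. Writing delta(X) = X - E(X) and delta(Y) = Y - E(Y), Isserlis' theorem gives E[delta(X)^2*delta(Y)^2] = V(X)*V(Y) + 2*C(X,Y)^2 and zero third-order central cross moments, so expanding X^2 and Y^2 around their means yields

    C(X^2, Y^2) = 4*E(X)*E(Y)*C(X,Y) + 2*C(X,Y)^2

which is exact for jointly normal X and Y.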