Calculate mean normed distance and normed variance of cone-shaped distribution in N-dimensions
I would like to calculate the mean and variance of the normed distance of a cone-shaped distribution,
$f(x) \propto \exp(-|x|)$,
where $x\in\mathbb{R}^d$ and $d$ can be any positive integer.
In two dimensions, this distribution looks like
a cone! I can calculate the normalising constant for this distribution using,
Integrate[Exp[-Norm[{x,y}]], {x, -Infinity, Infinity}, {y, -Infinity, Infinity}]
which is $2\pi$. I can then calculate the mean normed distance using,
Integrate[Norm[{x, y}] Exp[-Norm[{x, y}]], {x, -Infinity, Infinity},
 {y, -Infinity, Infinity}]
which is $4\pi$; dividing by the normalising constant $2\pi$ gives a mean of $2$. Its second moment,
Integrate[Norm[{x, y}]^2 Exp[-Norm[{x, y}]], {x, -Infinity, Infinity},
 {y, -Infinity, Infinity}]
which is $12\pi$, i.e. $\mathrm{E}(|x|^2) = 6$ after normalising, then allows me to calculate the variance $\mathrm{Var}(|x|) = \mathrm{E}(|x|^2) - \mathrm{E}(|x|)^2 = 6 - 2^2 = 2$.
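Putting these pieces together explicitly (each raw integral divided by the normalising constant $2\pi$) reproduces the numbers above; a minimal sketch, with variable names of my own choosing:
z = Integrate[Exp[-Norm[{x, y}]], {x, -Infinity, Infinity}, {y, -Infinity, Infinity}];    (* 2 π *)
m1 = Integrate[Norm[{x, y}] Exp[-Norm[{x, y}]], {x, -Infinity, Infinity}, {y, -Infinity, Infinity}]/z;   (* mean: 2 *)
m2 = Integrate[Norm[{x, y}]^2 Exp[-Norm[{x, y}]], {x, -Infinity, Infinity}, {y, -Infinity, Infinity}]/z; (* second moment: 6 *)
m2 - m1^2  (* variance: 2 *)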
Calculating the normalising constants is easy enough in higher dimensions, but I run into trouble with finding the mean and variance.
Any ideas?
I'm guessing that something can be done using polar coordinates in higher dimensions, but this isn't something I know much about!
Answer
For the normalization, we need to determine $\omega$ such that
$$\omega \, \int_{\mathbb{R}^{n}} \mathrm{e}^{-|x|} \, \mathrm{d}x = 1.$$
The first moment is given by
$$\omega \, \int_{\mathbb{R}^{n}} |x|^1 \, \mathrm{e}^{-|x|} \, \mathrm{d}x.$$
For the second, we have to compute
$$\omega \, \int_{\mathbb{R}^{n}} |x|^2 \, \mathrm{e}^{-|x|} \, \mathrm{d}x.$$
All these integrands are radially symmetric.
By introducing polar coordinates, we obtain $$\int_{\mathbb{R}^{n}} |x|^\alpha \, \mathrm{e}^{-|x|} \, \mathrm{d}x =\int_{S^{n-1}}\int_0^\infty \mathrm{e}^{-r} \, r^{\alpha+n-1} \, \mathrm{d} r \, \mathrm{d}S = \omega_n \int_0^\infty \mathrm{e}^{-r} \, r^{\alpha+n-1} \, \mathrm{d} r,$$
where $\omega_n = \frac{2 \pi^{n/2}}{\Gamma \left(\frac{n}{2}\right)}$ is the surface area of the unit sphere in $\mathbb{R}^n$.
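As a quick numerical sanity check of this reduction, take $n = 2$, $\alpha = 1$, where the exact value is $\omega_2 \, \Gamma(3) = 4\pi$ (this is also the raw first-moment integral from the question). NIntegrate over the unbounded domain may be slow, so this is only illustrative:
lhs = NIntegrate[Norm[{x, y}] Exp[-Norm[{x, y}]], {x, -∞, ∞}, {y, -∞, ∞}];   (* direct 2D integral *)
rhs = 2 π NIntegrate[Exp[-r] r^(1 + 2 - 1), {r, 0, ∞}];                      (* radial reduction *)
{lhs, rhs}  (* both ≈ 12.566, i.e. 4 π *)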
Such integrals can be computed symbolically by Mathematica:
v[n_, α_] = 2 π^(n/2)/Gamma[n/2] Integrate[r^α Exp[-r] r^(n - 1), {r, 0, ∞},
Assumptions -> α + n > 0]
$$\frac{2 \pi ^{n/2} \Gamma (n+\alpha )}{\Gamma \left(\frac{n}{2}\right)}$$
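As a consistency check, this reproduces the raw two-dimensional integrals from the question:
v[2, 0]  (* 2 π, the normalising integral *)
v[2, 1]  (* 4 π, the unnormalised first moment *)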
So, the $k$-th moment should equal
moment[n_, k_] = FullSimplify[ v[n, k]/v[n, 0], n ∈ Integers && n > 0]
$$\frac{\Gamma (k+n)}{\Gamma (n)}$$
which, for positive integer $n$, equals
moment[n_, k_] = (n + k - 1)!/(n - 1)!
$$\frac{(n+k-1)!}{(n-1)!} $$
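Plugging in $n = 2$ recovers the two-dimensional values from the question:
moment[2, 1]  (* 2, the mean normed distance *)
moment[2, 2]  (* 6, the second moment *)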
So the variance of the distance is given by
var[n_] = FullSimplify[moment[n, 2] - moment[n, 1]^2]
$$n$$
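In particular, the mean normed distance is moment[n, 1] $= n$ and the variance is also $n$; for $n = 2$ this agrees with the variance of $2$ computed in the question:
var[2]        (* 2, as in the two-dimensional example *)
moment[3, 1]  (* mean distance in three dimensions: 3 *)
var[3]        (* 3 *)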