
A Brief Note About $\log$ of Gamma

On the Distribution of Log of Gamma - Some Facts

David Dobor

A pdf version of this note can be found here.

A random variable $X$ is said to have a gamma distribution with parameters $(\alpha, \beta)$, denoted by $\Gamma(\alpha, \beta)$, if its pdf is given by: $$ f(x) = \frac{1}{\Gamma({\alpha})} \cdot \frac{1}{\beta} \cdot \left( \frac{x}{\beta}\right)^{\alpha - 1} \cdot \mathrm{e}^{-\frac{x}{\beta}} \cdot\, 1_{(0,\infty)}(x) \,\,\, \text{where} \;\; \alpha, \beta > 0 $$

Here $\Gamma(\alpha)$ is the gamma function computed as $\int_0^{\infty} \mathrm{e}^{-y} y^{\alpha - 1} dy$ and $1_{(0,\infty)}(x)$ stands for the indicator function (i.e. its value is $1$ for positive $x$ and $0$ elsewhere).
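
As a quick sanity check (not part of the original note), the pdf above can be compared numerically against scipy.stats.gamma, which uses the same shape/scale parameterization; the parameter values below are arbitrary.

```python
import numpy as np
from scipy.stats import gamma
from scipy.special import gamma as gamma_fn

alpha, beta = 2.5, 1.7                     # arbitrary shape and scale
x = np.linspace(0.01, 20, 500)

# the pdf exactly as written above
f = (1 / gamma_fn(alpha)) * (1 / beta) * (x / beta) ** (alpha - 1) * np.exp(-x / beta)

# scipy's Gamma pdf with the same shape/scale parameterization
f_scipy = gamma.pdf(x, a=alpha, scale=beta)

print(np.max(np.abs(f - f_scipy)))         # ~1e-16, i.e. the two agree
```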


Distribution of Log of Gamma

We are first interested in the distribution of the natural logarithm of a Gamma random variable. Let $Y = \log (X)$, where $X\sim\Gamma(\alpha,\beta)$; we want the distribution of $Y$.

We'll use the following well-known theorem for computing the density of a transformation, $g$, of a random variable $X$. Loosely put, if $Y = g(X)$ and $g$ has a differentiable inverse $h$, so that $X = h(Y)$, then the pdf of $Y$ is: $$ f_Y(y) = f_X(h(y)) \, |h'(y)| $$

So here, denoting the inverse of $Y = \log (X)$ by $h$, we have $ X=h(Y)=\mathrm{e}^Y$ and $h'(y) = \mathrm{e}^{y}$. Therefore: \begin{align*} f_Y(y) &= f_X(h(y))\, |h'(y)| \\ &= \frac{1}{\beta^\alpha \Gamma(\alpha)} \, \mathrm{e}^{(\alpha - 1)y} \, \mathrm{e}^{-\mathrm{e}^{y}/\beta}\,|\mathrm{e}^{y}| \,\,\,1_{(-\infty,\infty)}(y) \\ &= \frac{1}{\beta^\alpha \Gamma(\alpha)} \;\; \mathrm{e}^{\left(\alpha y - \mathrm{e}^{y}/\beta\right)}\,1_{(-\infty,\infty)}(y) \\ \end{align*}

This expression is a valid pdf (it integrates to 1), and a random variable with this density is said to have a Log-of-Gamma distribution with parameters $(\alpha, \beta)$.
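
If you want to verify this numerically, here is a minimal sketch (with arbitrary example values of $\alpha$ and $\beta$) that integrates the derived density over the real line and checks that the result is 1.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

alpha, beta = 2.5, 1.7                     # arbitrary example values

def log_gamma_pdf(y, alpha, beta):
    """Density of Y = log(X), X ~ Gamma(alpha, beta), as derived above."""
    return np.exp(alpha * y - np.exp(y) / beta) / (beta ** alpha * gamma_fn(alpha))

total, _ = quad(log_gamma_pdf, -np.inf, np.inf, args=(alpha, beta))
print(total)                               # ~1.0
```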


Moments of Log of Gamma

We now find the Moment Generating Function (MGF) of Log-of-Gamma $(\alpha, \beta)$. By definition: $$ M_Y(t) = \int_{-\infty}^{+\infty} \mathrm{e}^{ty} \frac{1}{\beta^\alpha \Gamma(\alpha)} \;\; \mathrm{e}^{\left(\alpha y - \mathrm{e}^{y}/\beta\right)} dy $$ To integrate this we use an old trick: we leave inside the integral sign the part of the function that integrates to 1 (often an easily recognizable pdf) and pull the constants that get in the way of making this integral equal to 1 out of the integral sign. Luckily, this can be done here: \begin{align*} M_Y(t) &= \frac{1}{\beta^\alpha \; \Gamma(\alpha)} \int_{-\infty}^{+\infty} \mathrm{e}^{ty} \;\; \mathrm{e}^{\left(\alpha y - \mathrm{e}^{y}/\beta\right)} dy\\ &= \frac{1}{\beta^\alpha \; \Gamma(\alpha)} \int_{-\infty}^{+\infty} \mathrm{e}^{\left(y(\alpha + t) - \mathrm{e}^{y}/\beta\right)} dy\\ &= \frac{1}{\beta^\alpha \; \Gamma(\alpha)} \cdot \frac{\beta^{\alpha + t} \; \Gamma(\alpha + t)}{\beta^{\alpha + t} \; \Gamma(\alpha + t)} \cdot \int_{-\infty}^{+\infty} \mathrm{e}^{\left(y(\alpha + t) - \mathrm{e}^{y}/\beta\right)} dy\\ &= \frac{ \beta^{\alpha + t} \; \Gamma(\alpha + t) }{\beta^\alpha \; \Gamma(\alpha)} \cdot \int_{-\infty}^{+\infty} \frac{1}{\beta^{\alpha + t} \; \Gamma(\alpha + t)} \mathrm{e}^{\left(y(\alpha + t) - \mathrm{e}^{y}/\beta\right)} dy\\ &= \frac{\beta^t \; \Gamma(\alpha + t)}{\Gamma(\alpha)} \end{align*} where the last equality follows because the integrand in the previous line is the Log-of-Gamma $(\alpha + t, \beta)$ pdf derived above and therefore integrates to 1. (This requires $\alpha + t > 0$, so the MGF exists for $t > -\alpha$.)
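
Since $\mathrm{e}^{tY} = X^t$, the MGF can also be checked by Monte Carlo directly from Gamma samples. The sketch below is illustrative only (the parameter values are arbitrary); it compares the sample average of $X^t$ with the closed form just derived.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

alpha, beta, t = 2.5, 1.7, 0.8             # arbitrary values
rng = np.random.default_rng(0)
x = rng.gamma(shape=alpha, scale=beta, size=500_000)

mgf_mc = np.mean(x ** t)                   # Monte Carlo estimate of E[X^t] = E[e^{tY}]
mgf_formula = beta ** t * gamma_fn(alpha + t) / gamma_fn(alpha)

print(mgf_mc, mgf_formula)                 # the two should be close
```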


Cumulant Generating Function

The Cumulant Generating Function (CGF) of a random variable $Y$ is defined as: $$ S_Y(t) = \log (M_Y(t)) $$ It can be verified that: \begin{align*} \frac{d}{dt} S_Y(t) \Big|_{t=0} = \mathrm{E}\, Y \;\;\; \text{ and} \;\;\; \frac{d^2}{dt^2} S_Y(t) \Big|_{t=0} = Var(Y) \end{align*}
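
As an illustration (not part of the original argument), these two identities can be checked numerically: approximate $S_Y'(0)$ and $S_Y''(0)$ by finite differences of the CGF built from the MGF above, and compare them with the sample mean and variance of simulated log-of-Gamma values.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

alpha, beta = 2.5, 1.7                     # arbitrary values

def cgf(t):
    # CGF of Y, built from the MGF derived in the previous section
    return t * np.log(beta) + np.log(gamma_fn(alpha + t) / gamma_fn(alpha))

h = 1e-4
mean_fd = (cgf(h) - cgf(-h)) / (2 * h)               # central difference for S'(0)
var_fd = (cgf(h) - 2 * cgf(0.0) + cgf(-h)) / h ** 2  # central difference for S''(0)

rng = np.random.default_rng(0)
y = np.log(rng.gamma(shape=alpha, scale=beta, size=500_000))
print(mean_fd, y.mean())                   # both approximate E[Y]
print(var_fd, y.var())                     # both approximate Var(Y)
```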


Expectation and Variance of Log of Gamma

To find the expectation of Log-of-Gamma, we differentiate $S_Y(t) = \log (M_Y(t))$ and evaluate the result at $t = 0$. \begin{align*} (S_Y(t))^{'} &= (\log (M_Y(t)))^{'} = \left(\log \left(\frac{\beta^t \; \Gamma(\alpha + t)}{\Gamma(\alpha)} \right)\right)^{'} \\ &= \frac{1}{\frac{\beta^t \; \Gamma(\alpha + t)} {\Gamma(\alpha)}} \cdot \frac{1}{\Gamma(\alpha)} \cdot \left( \beta^t \: \Gamma(\alpha + t) \right)^{'}\\ &= \frac{\Gamma(\alpha)}{\beta^t \; \Gamma(\alpha + t)} \cdot \frac{1}{\Gamma(\alpha)} \cdot \left( \beta^t \log(\beta) \; \Gamma(\alpha + t) + \beta^t \; \Gamma^{'} (\alpha +t) \right)\\ &= \log(\beta) + \frac{\Gamma^{'} (\alpha +t) }{\Gamma(\alpha + t)}\\ \end{align*} The last term is the logarithmic derivative of the Gamma function evaluated at $\alpha + t$. This logarithmic derivative is called the digamma function and is denoted by $\psi(\cdot)$. Thus: $$ \boxed{(S_Y(t))^{'} = \log(\beta) +\psi(\alpha +t)} $$ Setting $t = 0$, we conclude that if $Y$ is Log-of-Gamma$(\alpha, \beta)$ distributed, then its expectation is given by: $$ \boxed{\mathrm{E}\, Y = \log(\beta) +\psi(\alpha)} $$ To find the variance of $Y$, we look at the second derivative, $(S_Y(t))^{''}$, and evaluate it at $t=0$. \begin{align*} (S_Y(t))^{''} &= (\log(\beta) +\psi(\alpha +t))^{'}\\ &= \psi^{'}(\alpha +t) \end{align*} The derivative of the digamma function is the so-called trigamma function; evaluating at $t = 0$ gives: $$ \boxed{Var(Y) = \psi^{'}(\alpha)} $$
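
Finally, a short simulation check of the two boxed results (again with arbitrary parameter values), using scipy.special.digamma and polygamma for $\psi$ and $\psi^{'}$:

```python
import numpy as np
from scipy.special import digamma, polygamma

alpha, beta = 2.5, 1.7                     # arbitrary values
rng = np.random.default_rng(0)
y = np.log(rng.gamma(shape=alpha, scale=beta, size=1_000_000))

print(y.mean(), np.log(beta) + digamma(alpha))   # expectation: simulation vs formula
print(y.var(), polygamma(1, alpha))              # variance: simulation vs formula
```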