4.4 Normal!

For a univariate Normal distribution \(X \sim N(\theta,\sigma^2)\),

\[\begin{align} f_X(x) & = \frac{1}{\sqrt{2\pi \sigma^2}}\, e^{-\frac{(x-\theta)^2}{2\sigma^2}} \\ \log f(x;\theta) & = -\frac{1}{2} \log(2\pi \sigma^2) - \frac{(x-\theta)^2}{2\sigma^2}\end{align}\]

Then, we take the first and second derivatives with respect to \(\theta\):

\[\begin{align} \frac{\partial}{\partial \theta} \log f(x;\theta) & = \frac{\partial}{\partial \theta}\left[-\frac{x^2-2x\theta + \theta^2}{2\sigma^2}\right] = \frac{x}{\sigma^2} - \frac{\theta}{\sigma^2} \\ \frac{\partial^2}{\partial \theta^2} \log f(x;\theta) & = -\frac{1}{\sigma^2} \end{align}\]

From there, it’s just smooth sailing:

\[I(\theta) = -E\!\left(-\frac{1}{\sigma^2}\right) = \frac{1}{\sigma^2}\]
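As a quick numerical sanity check (a sketch, not part of the derivation): since \(\log f\) is quadratic in \(\theta\), a central second difference recovers \(-1/\sigma^2\) for any choice of \(x\), so \(I(\theta) = -E[\partial^2_\theta \log f] = 1/\sigma^2\) regardless of the data's distribution over \(x\).

```python
import math

def log_f(x, theta, sigma2):
    """Log-density of a Normal(theta, sigma2) distribution evaluated at x."""
    return -0.5 * math.log(2 * math.pi * sigma2) - (x - theta) ** 2 / (2 * sigma2)

def second_diff(x, theta, sigma2, h=1e-3):
    """Central second difference of log f with respect to theta."""
    return (log_f(x, theta + h, sigma2) - 2 * log_f(x, theta, sigma2)
            + log_f(x, theta - h, sigma2)) / h ** 2

sigma2 = 4.0
# The second derivative is -1/sigma^2 no matter which x (or theta) we plug in,
# so the expectation is trivial: I(theta) = -E(-1/sigma^2) = 1/sigma^2.
for x in (-3.0, 0.0, 7.5):
    print(second_diff(x, theta=1.0, sigma2=sigma2))  # ≈ -0.25 each time
```

Because the second derivative is constant in \(x\), taking the expectation just flips the sign.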

Take a quick second (or more like 15 seconds) and calculate what the Jeffreys Prior is here.

\[I(\theta) = -E\!\left(-\frac{1}{\sigma^2}\right) = \frac{1}{\sigma^2} \;\;\; \rightarrow \;\;\; \pi_J \propto \sqrt{I(\theta)} = (\sigma^2)^{-\frac{1}{2}} = \frac{1}{\sigma}\]

Notice that this doesn't involve \(\theta\) at all: as a function of \(\theta\), the Jeffreys prior for the mean is flat.

4.4.1 A little change

What if we have multiple (say, \(n\)) observations? How would the Jeffreys Prior change?

 

Recall that Fisher information is additive over independent observations: \(I_n(\theta) = n \cdot I_1(\theta)\).

\[I_n(\theta) = -E\!\left(-\frac{n}{\sigma^2}\right) = \frac{n}{\sigma^2} \;\;\; \rightarrow \;\;\; \pi_J \propto \left(\frac{n}{\sigma^2}\right)^{1/2} = \frac{\sqrt{n}}{\sigma}\]
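A small sketch to confirm the scaling (same second-difference idea as before, on simulated data): the joint log-likelihood of \(n\) i.i.d. observations is a sum of \(n\) single-observation terms, so its second derivative in \(\theta\) is \(-n/\sigma^2\).

```python
import math
import random

def loglik(data, theta, sigma2):
    """Joint log-likelihood of i.i.d. Normal(theta, sigma2) observations."""
    return sum(-0.5 * math.log(2 * math.pi * sigma2)
               - (x - theta) ** 2 / (2 * sigma2) for x in data)

def second_diff(data, theta, sigma2, h=1e-3):
    """Central second difference of the joint log-likelihood in theta."""
    return (loglik(data, theta + h, sigma2) - 2 * loglik(data, theta, sigma2)
            + loglik(data, theta - h, sigma2)) / h ** 2

random.seed(0)
sigma2, n = 4.0, 50
data = [random.gauss(0.0, math.sqrt(sigma2)) for _ in range(n)]

# The second derivative is -n/sigma^2, so I_n(theta) = n/sigma^2 = n * I_1(theta),
# and the Jeffreys prior picks up a factor of sqrt(n).
print(second_diff(data, theta=0.0, sigma2=sigma2))  # ≈ -n/sigma^2 = -12.5
```

The extra \(\sqrt{n}\) is a constant with respect to \(\theta\), so it washes out under proportionality: the prior on the mean is still flat.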