Exercise 3.14 Solution Example - Hoff, A First Course in Bayesian Statistical Methods
標準ベイズ統計学 (Japanese edition): Exercise 3.14 Solution Example
a)
answer
The first-order condition is
\begin{align*} & \frac{\sum_{i=1}^n y_i}{\theta} = \frac{n - \sum_{i=1}^n y_i}{1 - \theta} \\ \Leftrightarrow \quad & (1 - \theta) \sum_{i=1}^n y_i = \theta (n - \sum_{i=1}^n y_i) \\ \therefore \quad & \hat{\theta} = \frac{\sum_{i=1}^n y_i}{n} \\ \end{align*}and
\begin{align*} J(\theta) &= - \frac{\partial^2 l(\theta \mid \boldsymbol{y})}{\partial \theta^2} \\ &= - \frac{\partial^2}{\partial \theta^2} \left( \left( \sum_{i=1}^n y_i \right) \log \theta + \left( n - \sum_{i=1}^n y_i \right) \log (1 - \theta) \right) \\ &= - \frac{\partial}{\partial \theta} \left( \left( \sum_{i=1}^n y_i \right) \theta^{-1} - \left( n - \sum_{i=1}^n y_i \right) (1 - \theta)^{-1} \right) \\ &= \left( \sum_{i=1}^n y_i \right) \theta^{-2} + \left( n - \sum_{i=1}^n y_i \right) (1 - \theta)^{-2} \\ \end{align*}Evaluating this at \(\hat{\theta}\),
\begin{align*} J(\hat{\theta}) &= \left( \sum_{i=1}^n y_i \right) \hat{\theta}^{-2} + \left( n - \sum_{i=1}^n y_i \right) (1 - \hat{\theta})^{-2} \\ &= \left( \sum_{i=1}^n y_i \right) \left( \frac{\sum_{i=1}^n y_i}{n} \right)^{-2} + \left( n - \sum_{i=1}^n y_i \right) \left(1 - \frac{\sum_{i=1}^n y_i}{n} \right)^{-2} \\ &= \frac{n^2}{\sum_{i=1}^n y_i} + \frac{n^2}{n - \sum_{i=1}^n y_i} \\ \end{align*}therefore the following holds.
\begin{align*} \frac{J(\hat{\theta})}{n} &= \frac{n}{\sum_{i=1}^n y_i} + \frac{n}{n - \sum_{i=1}^n y_i} \\ &= \frac{n^2}{\left( \sum_{i=1}^n y_i \right)\left( n - \sum_{i=1}^n y_i \right)} \\ &= \frac{1}{\hat{\theta}(1 - \hat{\theta})} \\ \end{align*}
b)
answer
Since \(p_U(\theta)\) is a probability density,
\begin{align*} & \int_0^1 p_U (\theta) d\theta = 1 \\ \Leftrightarrow \quad & \int_0^1 \theta^{\frac{\sum_{i=1}^n y_i}{n}} (1 - \theta)^{\frac{n - \sum_{i=1}^n y_i}{n}} e^c d\theta = 1 \\ \Leftrightarrow \quad & e^c = \left( \int_0^1 \theta^{\frac{\sum_{i=1}^n y_i}{n}} (1 - \theta)^{\frac{n - \sum_{i=1}^n y_i}{n}} d\theta \right)^{-1} \\ & = B\left(\frac{\sum_{i=1}^n y_i}{n} +1, 2 - \frac{\sum_{i=1}^n y_i}{n} \right)^{-1} \\ \therefore \quad & c = - \log B\left(\frac{\sum_{i=1}^n y_i}{n} +1, 2 - \frac{\sum_{i=1}^n y_i}{n} \right) \\ \end{align*}and
\begin{align*} - \frac{\partial^2 \log p_U (\theta)}{\partial \theta^2} &= n^{-1} \times \left( - \frac{\partial^2 l(\theta \mid \boldsymbol{y})}{\partial \theta^2 }\right) \\ &= n^{-1} \times J(\theta) \\ &= \left( \frac{\sum_{i=1}^n y_i}{n} \right) \theta^{-2} + \left( 1 - \frac{\sum_{i=1}^n y_i}{n} \right) (1 - \theta)^{-2} \\ \end{align*}
c)
answer
From b), \(p_U(\theta)\) is the \(\mathrm{Beta}\left( \frac{\sum_{i=1}^n y_i}{n} + 1,\ 2 - \frac{\sum_{i=1}^n y_i}{n} \right)\) density, i.e., the posterior obtained by updating a uniform \(\mathrm{Beta}(1, 1)\) prior with \(\frac{\sum_{i=1}^n y_i}{n}\) successes and \(\frac{n - \sum_{i=1}^n y_i}{n}\) failures, which together amount to a single observation. We can therefore consider this as a posterior distribution for \(\theta\) based on one unit of information.
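As a sanity check, the binomial results above can be verified numerically. The sketch below uses an illustrative data vector `y` (an assumption, not data from the text): it compares a finite-difference second derivative of the log-likelihood at \(\hat{\theta}\) with the closed form for \(J(\hat{\theta})\), and confirms that the normalizing constant \(e^c = B\left(\frac{\sum y_i}{n} + 1,\ 2 - \frac{\sum y_i}{n}\right)^{-1}\) makes \(p_U\) integrate to one.

```python
import math

# Illustrative Bernoulli data (assumed for this check; not from the text).
y = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
n, s = len(y), sum(y)
theta_hat = s / n                         # MLE from a)

def loglik(t):
    """Bernoulli log-likelihood l(theta | y)."""
    return s * math.log(t) + (n - s) * math.log(1 - t)

# a) observed information at the MLE via central finite differences
h = 1e-5
J_num = -(loglik(theta_hat + h) - 2 * loglik(theta_hat) + loglik(theta_hat - h)) / h ** 2
J_closed = n ** 2 / s + n ** 2 / (n - s)  # closed form for J(theta_hat)
print(J_num / n, J_closed / n, 1 / (theta_hat * (1 - theta_hat)))

# b) e^c = 1 / B(s/n + 1, 2 - s/n) should make p_U integrate to one
a, b = s / n + 1, 2 - s / n
B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)   # Beta function

def p_U(t):
    """Unit information prior density for the binomial model."""
    return t ** (s / n) * (1 - t) ** ((n - s) / n) / B

m = 100_000                               # midpoint rule on (0, 1)
total = sum(p_U((k + 0.5) / m) for k in range(m)) / m
print(total)                              # close to 1
```

All three printed values of \(J(\hat{\theta})/n\) agree, and the integral of \(p_U\) is numerically one, matching the derivation.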
d)
answer
a’)
The first-order condition is
\begin{align*} & \theta^{-1} \sum_{i=1}^n y_i - n = 0 \\ \therefore \quad & \hat{\theta} = \frac{\sum_{i=1}^n y_i}{n} \\ \end{align*}and
\begin{align*} J(\theta) &= - \frac{\partial^2 l(\theta \mid \boldsymbol{y})}{\partial \theta^2} \\ &= - \frac{\partial^2}{\partial \theta^2} \left( \log \theta \sum_{i=1}^n y_i - n \theta - \sum_{i=1}^n \log y_i! \right) \\ &= - \frac{\partial}{\partial \theta} \left( \left( \sum_{i=1}^n y_i \right) \theta^{-1} - n \right) \\ &= \left( \sum_{i=1}^n y_i \right) \theta^{-2} \\ \end{align*}Evaluating this at \(\hat{\theta}\),
\begin{align*} J(\hat{\theta}) &= \left( \sum_{i=1}^n y_i \right) \hat{\theta}^{-2} \\ &= \left( \sum_{i=1}^n y_i \right) \left( \frac{\sum_{i=1}^n y_i}{n} \right)^{-2} \\ &= \frac{n^2}{\sum_{i=1}^n y_i} \\ \end{align*}therefore the following holds.
\begin{align*} \frac{J(\hat{\theta})}{n} &= \frac{n}{\sum_{i=1}^n y_i} = \frac{1}{\hat{\theta}} \\ \end{align*}
b')
Since \(p_U(\theta)\) is a probability density,
\begin{align*} & \int_0^{\infty} p_U (\theta) d \theta = 1 \\ \Leftrightarrow \quad & \int_0^{\infty} \theta^{\frac{\sum_{i=1}^n y_i }{n} } e^{c - \theta} \left( \prod_{i=1}^n y_i! \right)^{-\frac{1}{n}} d \theta = 1 \\ \Leftrightarrow \quad & e^c = \left( \prod_{i=1}^n y_i! \right)^{\frac{1}{n}} \left( \int_0^{\infty} \theta^{\frac{\sum_{i=1}^n y_i}{n} } e^{- \theta} d \theta \right)^{-1} \\ & = \left( \prod_{i=1}^n y_i! \right)^{\frac{1}{n}} \Gamma \left( \frac{\sum_{i=1}^n y_i}{n} + 1 \right)^{-1} \\ \therefore \quad & c = \frac{1}{n} \sum_{i=1}^n \log y_i! - \log \Gamma \left( \frac{\sum_{i=1}^n y_i}{n} + 1 \right) \\ \end{align*}and
\begin{align*} - \frac{\partial^2 \log p_U (\theta)}{\partial \theta^2} &= n^{-1} \times \left( - \frac{\partial^2 l(\theta \mid \boldsymbol{y})}{\partial \theta^2 }\right) \\ &= n^{-1} \times J(\theta) \\ &= \left( \frac{\sum_{i=1}^n y_i}{n} \right) \theta^{-2} \end{align*}
c')
From b'), \(p_U(\theta)\) is the \(\mathrm{Gamma}\left( \frac{\sum_{i=1}^n y_i}{n} + 1,\ 1 \right)\) density, i.e., the posterior obtained by combining a flat prior with a single pseudo-observation equal to the sample mean \(\bar{y} = \frac{\sum_{i=1}^n y_i}{n}\). We can therefore consider this as a posterior distribution for \(\theta\) based on one unit of information.
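The Poisson results can be checked the same way. The sketch below again uses an assumed illustrative data vector `y`: it compares a finite-difference \(J(\hat{\theta})/n\) with \(1/\hat{\theta}\), and confirms that with \(e^c = \left(\prod y_i!\right)^{1/n} \Gamma\left(\frac{\sum y_i}{n} + 1\right)^{-1}\) the factorial factors cancel and \(p_U\) integrates to one.

```python
import math

# Illustrative Poisson counts (assumed for this check; not from the text).
y = [2, 0, 3, 1, 2, 4, 1, 2]
n, s = len(y), sum(y)
theta_hat = s / n                          # MLE from a')
log_fact = sum(math.lgamma(v + 1) for v in y)   # sum_i log y_i!

def loglik(t):
    """Poisson log-likelihood l(theta | y)."""
    return s * math.log(t) - n * t - log_fact

# a') observed information at the MLE via central finite differences
h = 1e-5
J_num = -(loglik(theta_hat + h) - 2 * loglik(theta_hat) + loglik(theta_hat - h)) / h ** 2
J_closed = n ** 2 / s                      # closed form for J(theta_hat)
print(J_num / n, J_closed / n, 1 / theta_hat)

# b') the (prod y_i!)^(±1/n) factors cancel, leaving a Gamma(s/n + 1, 1) density
def p_U(t):
    """Unit information prior density for the Poisson model."""
    return t ** (s / n) * math.exp(-t) / math.gamma(s / n + 1)

m, upper = 200_000, 50.0                   # midpoint rule on (0, 50); tail is negligible
total = sum(p_U((k + 0.5) * upper / m) for k in range(m)) * upper / m
print(total)                               # close to 1
```

As in the binomial case, the numerical and closed-form values of \(J(\hat{\theta})/n\) agree, and \(p_U\) integrates to one.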