UA MATH566 Statistical Theory QE Practice 1

Theory problems 4–6 from the January 2014 qualifying exam.

Problem 4

Part (a)
The joint likelihood of the random sample is

$$L(\theta) = \prod_{i=1}^n \theta^{-c} c X_i^{c-1} e^{-(X_i/\theta)^c} = \theta^{-nc} c^n \left( \prod_{i=1}^n X_i \right)^{c-1} e^{-\frac{1}{\theta^c}\sum_{i=1}^n X_i^c}$$

$$l(\theta) = \log L(\theta) = -nc \log \theta + n \log c + (c-1)\sum_{i=1}^n \log X_i - \frac{1}{\theta^c} \sum_{i=1}^n X_i^c$$

Take the derivative of the log-likelihood and set it to zero:

$$l'(\theta) = -\frac{nc}{\theta} + \frac{c}{\theta^{c+1}} \sum_{i=1}^n X_i^c = 0$$

Solving this equation gives the MLE of $\theta$:

$$\hat{\theta} = \left( \frac{1}{n}\sum_{i=1}^n X_i^c \right)^{1/c}$$
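As a sanity check, here is a quick Monte Carlo sketch (not part of the original solution; the values of `theta`, `c`, and `n` are illustrative). It draws a Weibull sample with known shape $c$ and scale $\theta$ and confirms the MLE recovers the scale:

```python
import numpy as np

# Monte Carlo check of the MLE theta_hat = (mean(X_i^c))^{1/c}.
# X has density theta^{-c} c x^{c-1} exp(-(x/theta)^c), i.e. Weibull
# with shape c and scale theta. All parameter values are illustrative.
rng = np.random.default_rng(0)
theta, c, n = 2.0, 1.5, 100_000

x = theta * rng.weibull(c, size=n)       # Weibull(shape=c, scale=theta)
theta_hat = np.mean(x**c) ** (1 / c)     # MLE from the derivation above

print(theta_hat)  # should be close to theta = 2.0
```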

Part (b)
Compute the density of $X_1^c$:

$$P(X_1^c \le y) = P(X_1 \le y^{1/c}) = F_X(y^{1/c})$$

$$f_{X_1^c}(y) = \frac{1}{c} y^{1/c-1} \cdot \theta^{-c} c\, y^{(c-1)/c} e^{-(y^{1/c}/\theta)^c} = \theta^{-c} e^{-\theta^{-c} y}, \quad y > 0$$

Obviously $X_1^c \sim \mathrm{EXP}(\theta^c)$, equivalently $\Gamma(1,\theta^{-c})$ in the rate parameterization. By the additivity of the Gamma distribution, $\sum_{i=1}^n X_i^c \sim \Gamma(n,\theta^{-c})$, and by scaling, $Y = \frac{1}{n}\sum_{i=1}^n X_i^c \sim \Gamma(n,n\theta^{-c})$. Then
$$EY^{1/c} = \int_0^\infty y^{1/c}\, \frac{(n\theta^{-c})^n y^{n-1}}{\Gamma(n)}\, e^{-n\theta^{-c} y}\, dy = \frac{(n\theta^{-c})^n}{\Gamma(n)} \int_0^\infty y^{1/c+n-1} e^{-n\theta^{-c} y}\, dy$$

$$= \frac{(n\theta^{-c})^{-1/c}}{\Gamma(n)} \int_0^\infty (n\theta^{-c} y)^{1/c+n-1} e^{-n\theta^{-c} y}\, d(n\theta^{-c} y) = \frac{(n\theta^{-c})^{-1/c}\, \Gamma(1/c+n)}{\Gamma(n)} = \theta\, \frac{n^{-1/c}\, \Gamma(1/c+n)}{\Gamma(n)}$$

$$\Rightarrow E\left( \frac{n^{1/c}\, \Gamma(n)}{\Gamma(1/c+n)}\, Y^{1/c} \right) = \theta$$

So we get an unbiased estimator of $\theta$. From the joint likelihood $L(\theta)$, the Neyman–Fisher factorization theorem shows that $\sum_{i=1}^n X_i^c$ is a sufficient statistic. Moreover,

$$L(\theta) = c^n \left( \prod_{i=1}^n X_i \right)^{c-1} \exp\left( -\frac{1}{\theta^c}\sum_{i=1}^n X_i^c - nc \log \theta \right)$$

indicating that the model belongs to the exponential family, so $\sum_{i=1}^n X_i^c$ is also complete. Since the estimator above is a function of $\sum_{i=1}^n X_i^c$, by the Lehmann–Scheffé theorem, $\frac{n^{1/c}\Gamma(n)}{\Gamma(1/c+n)} Y^{1/c}$ is the UMVUE of $\theta$.
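The unbiasedness of the bias-corrected estimator can also be checked numerically. A sketch (all parameter values illustrative; the correction factor is computed via log-gamma for numerical stability):

```python
import numpy as np
from math import lgamma, exp

# Check E[ n^{1/c} * Gamma(n)/Gamma(n + 1/c) * Y^{1/c} ] = theta,
# where Y = mean(X_i^c) for a Weibull(shape=c, scale=theta) sample.
rng = np.random.default_rng(1)
theta, c, n, reps = 2.0, 1.5, 20, 20_000

# correction factor n^{1/c} * Gamma(n) / Gamma(n + 1/c)
k = n ** (1 / c) * exp(lgamma(n) - lgamma(n + 1 / c))

x = theta * rng.weibull(c, size=(reps, n))
y = np.mean(x**c, axis=1)
est = k * y ** (1 / c)
mean_est = est.mean()
print(mean_est)  # should be close to theta = 2.0
```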

Part (c)
For arbitrary $\theta_1 \le \theta_0 \le \theta_2$, compute the likelihood ratio:

$$\frac{L(\theta_1)}{L(\theta_2)} = \frac{\theta_1^{-nc} c^n (\prod_{i=1}^n X_i)^{c-1} e^{-\frac{1}{\theta_1^c}\sum_{i=1}^n X_i^c}}{\theta_2^{-nc} c^n (\prod_{i=1}^n X_i)^{c-1} e^{-\frac{1}{\theta_2^c}\sum_{i=1}^n X_i^c}} = \left( \frac{\theta_2}{\theta_1} \right)^{nc} \exp\left( \left( 1/\theta_2^c - 1/\theta_1^c \right) \sum_{i=1}^n X_i^c \right)$$

where $1/\theta_2^c - 1/\theta_1^c < 0$. To make this likelihood ratio small, $\sum_{i=1}^n X_i^c$ should exceed a threshold $k_\alpha$ such that

$$P\left( \sum_{i=1}^n X_i^c > k_\alpha \right) = \alpha$$

We showed above that $\sum_{i=1}^n X_i^c \sim \Gamma(n,\theta^{-c})$, so under $H_0$, $2\sum_{i=1}^n X_i^c/\theta_0^c \sim \chi^2_{2n}$. Let $k_\alpha$ be the upper $\alpha$-quantile of $\chi^2_{2n}$; the rejection region is $\{X_1,X_2,\cdots,X_n : 2\sum_{i=1}^n X_i^c/\theta_0^c > k_\alpha\}$.
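The size of this test can be verified by simulation. A sketch (parameter values illustrative; the $\chi^2_{2n}$ quantile is estimated empirically rather than from a table):

```python
import numpy as np

# Check that the test 2*sum(X_i^c)/theta0^c > k_alpha has size alpha
# when theta = theta0. All parameter values are illustrative.
rng = np.random.default_rng(2)
theta0, c, n, alpha, reps = 2.0, 1.5, 15, 0.05, 20_000

# upper-alpha quantile of chi^2_{2n}, estimated from chi-square draws
k_alpha = np.quantile(rng.chisquare(2 * n, size=200_000), 1 - alpha)

# simulate data under H0 and compute the test statistic
x = theta0 * rng.weibull(c, size=(reps, n))
stat = 2 * np.sum(x**c, axis=1) / theta0**c   # ~ chi^2_{2n} under H0
size = np.mean(stat > k_alpha)
print(size)  # should be close to alpha = 0.05
```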

Problem 5

Part (a)
The joint likelihood of the random sample is

$$L(\theta) = \prod_{i=1}^n \theta^{-1} X_i^{(1-\theta)/\theta} = \theta^{-n} \exp\left( \frac{\theta-1}{2\theta} \left[ -2\sum_{i=1}^n \log X_i \right] \right)$$

By the Neyman–Fisher factorization theorem, $T(\textbf{X}) = -2\sum_{i=1}^n \log X_i$ is a sufficient statistic. For two realizations $\textbf{X}$ and $\textbf{Y}$ of the random sample,

$$\frac{L(\theta|\textbf{X})}{L(\theta|\textbf{Y})} = \frac{\theta^{-n} \exp\left( \frac{\theta-1}{2\theta} \left[ -2\sum_{i=1}^n \log X_i \right] \right)}{\theta^{-n} \exp\left( \frac{\theta-1}{2\theta} \left[ -2\sum_{i=1}^n \log Y_i \right] \right)} = \exp\left( \frac{\theta-1}{2\theta} \left[ 2\sum_{i=1}^n \log Y_i - 2\sum_{i=1}^n \log X_i \right] \right)$$

This ratio is free of $\theta$ if and only if $T(\textbf{X}) = T(\textbf{Y})$, so $T(\textbf{X})$ is a minimal sufficient statistic.

Part (b)
Let $Y = -2\log X_1$ and compute

$$P(Y \le y) = P(-2\log X_1 \le y) = P(X_1 \ge e^{-y/2}) = 1 - F_X(e^{-y/2})$$

$$f_Y(y) = \frac{1}{2} e^{-y/2} f_X(e^{-y/2}) = \frac{1}{2} e^{-y/2}\, \theta^{-1} e^{-y(1-\theta)/(2\theta)} = \frac{1}{2\theta} e^{-y/(2\theta)}, \quad y > 0$$

So $Y \sim \mathrm{EXP}(2\theta)$, equivalently $\Gamma(1, 1/(2\theta))$.
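This distributional claim is easy to check by simulation: the cdf of $X$ is $F(x) = x^{1/\theta}$ on $(0,1)$, so $X = U^\theta$ by inverse-cdf sampling from a uniform $U$. A sketch (the value of `theta` is illustrative):

```python
import numpy as np

# X has cdf F(x) = x^{1/theta} on (0,1), so X = U^theta by inversion.
# Then Y = -2*log(X) should follow EXP(2*theta): mean 2*theta,
# variance (2*theta)^2. The value of theta is illustrative.
rng = np.random.default_rng(3)
theta, n = 1.7, 200_000

u = rng.uniform(size=n)
ysamp = -2 * np.log(u**theta)
mean_y = ysamp.mean()
var_y = ysamp.var()
print(mean_y, var_y)  # close to 2*theta = 3.4 and (2*theta)^2 = 11.56
```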

Part (c)
By the additivity of the Gamma distribution, $T(\textbf{X}) \sim \Gamma(n, 1/(2\theta))$, and by scaling, $T(\textbf{X})/(2\theta) \sim \Gamma(n,1)$. Let $L$ be the 2.5% quantile of $\Gamma(n,1)$ and $U$ the 97.5% quantile of $\Gamma(n,1)$; then the 95% confidence interval is

$$L \le \frac{T(\textbf{X})}{2\theta} \le U \;\Rightarrow\; \frac{T(\textbf{X})}{2U} \le \theta \le \frac{T(\textbf{X})}{2L}$$
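The nominal 95% coverage of this interval can be confirmed empirically. A sketch (parameter values illustrative; the $\Gamma(n,1)$ quantiles are estimated by simulation rather than looked up):

```python
import numpy as np

# Empirical coverage of the interval [T/(2U), T/(2L)], where
# L, U are the 2.5% and 97.5% quantiles of Gamma(n, 1).
# All parameter values are illustrative.
rng = np.random.default_rng(4)
theta, n, reps = 1.7, 10, 20_000

L = np.quantile(rng.gamma(n, size=400_000), 0.025)
U = np.quantile(rng.gamma(n, size=400_000), 0.975)

# sample X_i = U_i^theta (inverse cdf) and form T(X) = -2*sum(log X_i)
u = rng.uniform(size=(reps, n))
t = np.sum(-2 * np.log(u**theta), axis=1)
cover = np.mean((t / (2 * U) <= theta) & (theta <= t / (2 * L)))
print(cover)  # should be close to 0.95
```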

Part (d)
Since $E[T(\textbf{X})] = 2n\theta$, the expected length of the confidence interval is

$$E\left[ T(\textbf{X}) \left( \frac{1}{2L} - \frac{1}{2U} \right) \right] = 2n\theta \cdot \frac{U-L}{2LU} = \frac{n\theta(U-L)}{LU}$$

(A different approach is possible; see the reference below.)
(See UA MATH564 Probability Theory VI, Foundations of Mathematical Statistics 3: the normal approximation to the chi-square distribution.)

Problem 6

Part (a)
The joint likelihood of the random sample is

$$L(\beta) = \prod_{i=1}^n \frac{(x_i\beta)^{Y_i} e^{-x_i\beta}}{Y_i!} = \left( \prod_{i=1}^n \frac{x_i^{Y_i}}{Y_i!} \right) \beta^{n\bar{Y}} e^{-n\beta\bar{x}}$$

$$l(\beta) = \log\left( \prod_{i=1}^n \frac{x_i^{Y_i}}{Y_i!} \right) + n\bar{Y}\log\beta - n\bar{x}\beta$$

The MLE of $\beta$ is given by $\arg\max l(\beta)$:

$$\frac{n\bar{Y}}{\beta} - n\bar{x} = 0 \;\Rightarrow\; \hat{\beta} = \frac{\bar{Y}}{\bar{x}}$$

Part (b)
$$E\hat{\beta} = \frac{1}{\bar{x}} E\bar{Y} = \frac{1}{n\bar{x}} \sum_{i=1}^n EY_i = \frac{\beta \sum_{i=1}^n x_i}{n\bar{x}} = \beta$$

$$Var(\hat{\beta}) = \frac{1}{n^2\bar{x}^2} \sum_{i=1}^n Var\,Y_i = \frac{\beta \sum_{i=1}^n x_i}{n^2\bar{x}^2} = \frac{\beta}{n\bar{x}}$$
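Both moments can be verified by simulation. A sketch (the covariate values `x` and the value of `beta` are illustrative):

```python
import numpy as np

# Check E(beta_hat) = beta and Var(beta_hat) = beta/(n*xbar)
# for Y_i ~ Poisson(x_i * beta) with fixed covariates x_i.
# All values below are illustrative.
rng = np.random.default_rng(5)
beta, reps = 0.8, 50_000
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
n, xbar = len(x), x.mean()

y = rng.poisson(x * beta, size=(reps, n))
beta_hat = y.mean(axis=1) / xbar            # MLE = Ybar / xbar
emp_mean = beta_hat.mean()
emp_var = beta_hat.var()
print(emp_mean, emp_var)  # close to beta = 0.8 and beta/(n*xbar) ~ 0.0762
```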

Part (c)
The posterior kernel of $\beta$ is

$$\pi(\beta|\textbf{Y}) \propto L(\beta)\, \pi(\beta|w,b_0) \propto \beta^{n\bar{Y}} e^{-n\beta\bar{x}} \cdot \beta^{wb_0-1} e^{-w\beta} = \beta^{n\bar{Y}+wb_0-1} e^{-\beta(w+n\bar{x})}$$

This is the kernel of $\Gamma(n\bar{Y}+wb_0, \frac{1}{w+n\bar{x}})$, so the posterior density of $\beta$ is

$$\pi(\beta|\textbf{Y}) = \frac{(w+n\bar{x})^{n\bar{Y}+wb_0}}{\Gamma(n\bar{Y}+wb_0)}\, \beta^{n\bar{Y}+wb_0-1} e^{-\beta(w+n\bar{x})}, \quad \beta > 0$$

Part (d)
The posterior mean of $\beta$ is

$$E[\beta|\textbf{Y}] = \frac{n\bar{Y}+wb_0}{w+n\bar{x}} = \frac{n\bar{x}\hat{\beta}}{w+n\bar{x}} + \frac{wb_0}{w+n\bar{x}}$$

a weighted average of $\hat{\beta}$ and $b_0$. As $w \to 0$, $E[\beta|\textbf{Y}] \to \hat{\beta}$.
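The weighted-average identity is easy to see with plugged-in numbers. A sketch (all values below are assumed purely for illustration):

```python
# Posterior mean as a weighted average of the MLE and the prior mean b0.
# All numbers are illustrative.
n, ybar, xbar = 25, 2.4, 1.6
w, b0 = 5.0, 1.0

beta_hat = ybar / xbar                                   # MLE = Ybar/xbar
post_mean = (n * ybar + w * b0) / (w + n * xbar)         # posterior mean
weighted = (n * xbar * beta_hat + w * b0) / (w + n * xbar)
print(post_mean, weighted)  # the two expressions agree
```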
