##### (Answered) 8.1 Components of variance: Consider the hierarchical model where θ1, . . . , θm | µ, τ² ~ i.i.d. ...


**Question**

8.1 Components of variance: Consider the hierarchical model where

$$\theta_1, \ldots, \theta_m \mid \mu, \tau^2 \sim \text{i.i.d. normal}(\mu, \tau^2)$$
$$y_{1,j}, \ldots, y_{n_j,j} \mid \theta_j, \sigma^2 \sim \text{i.i.d. normal}(\theta_j, \sigma^2).$$

For this problem, we will eventually compute the following:

$$\mathrm{Var}[y_{i,j} \mid \theta_j, \sigma^2], \quad \mathrm{Var}[\bar y_{\cdot,j} \mid \theta_j, \sigma^2], \quad \mathrm{Cov}[y_{i_1,j}, y_{i_2,j} \mid \theta_j, \sigma^2]$$
$$\mathrm{Var}[y_{i,j} \mid \mu, \tau^2], \quad \mathrm{Var}[\bar y_{\cdot,j} \mid \mu, \tau^2], \quad \mathrm{Cov}[y_{i_1,j}, y_{i_2,j} \mid \mu, \tau^2]$$

First, let's use our intuition to guess at the answers:

a) Which do you think is bigger, $\mathrm{Var}[y_{i,j} \mid \theta_j, \sigma^2]$ or $\mathrm{Var}[y_{i,j} \mid \mu, \tau^2]$? To guide your intuition, you can interpret the first as the variability of the $Y$'s when sampling from a fixed group, and the second as the variability in first sampling a group, then sampling a unit from within the group.

b) Do you think $\mathrm{Cov}[y_{i_1,j}, y_{i_2,j} \mid \theta_j, \sigma^2]$ is negative, positive, or zero? Answer the same for $\mathrm{Cov}[y_{i_1,j}, y_{i_2,j} \mid \mu, \tau^2]$. You may want to think about what $y_{i_2,j}$ tells you about $y_{i_1,j}$ if $\theta_j$ is known, and what it tells you when $\theta_j$ is unknown.

c) Now compute each of the six quantities above and compare to your answers in a) and b).

d) Now assume we have a prior $p(\mu)$ for $\mu$. Using Bayes' rule, show that
$$p(\mu \mid \theta_1, \ldots, \theta_m, \sigma^2, \tau^2, y_1, \ldots, y_m) = p(\mu \mid \theta_1, \ldots, \theta_m, \tau^2).$$
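For part (c), the standard answers follow from the laws of total variance and covariance: conditional on a known group mean $\theta_j$, the quantities are $\sigma^2$, $\sigma^2/n_j$, and $0$; marginally (conditional only on $\mu, \tau^2$), they are $\sigma^2 + \tau^2$, $\sigma^2/n_j + \tau^2$, and $\tau^2$. The following Monte Carlo sketch checks three of these identities by simulation; the parameter values ($\mu = 0$, $\tau^2 = 4$, $\sigma^2 = 1$, $n_j = 5$) are illustrative choices of mine, not part of the original problem.

```python
# Monte Carlo check of the variance-components identities.
# Illustrative parameter choices (not from the problem): mu=0, tau^2=4, sigma^2=1, n_j=5.
import random
import statistics

random.seed(0)
mu, tau, sigma, nj = 0.0, 2.0, 1.0, 5
S = 200_000  # number of simulated groups

# Sample S groups: theta_j | mu, tau^2, then n_j observations per group.
groups = []
for _ in range(S):
    theta = random.gauss(mu, tau)
    groups.append([random.gauss(theta, sigma) for _ in range(nj)])

# Within a single fixed group: Var[y_ij | theta_j, sigma^2] = sigma^2.
theta_fixed = 1.7  # any fixed value works; the variance does not depend on it
within = [random.gauss(theta_fixed, sigma) for _ in range(S)]
var_within = statistics.pvariance(within)  # should be close to sigma^2 = 1

# Marginal over groups: Var[y_ij | mu, tau^2] = sigma^2 + tau^2.
first_obs = [g[0] for g in groups]
var_marginal = statistics.pvariance(first_obs)  # should be close to 1 + 4 = 5

# Two units from the same group share theta_j, inducing positive covariance:
# Cov[y_{i1,j}, y_{i2,j} | mu, tau^2] = tau^2.
mean1 = statistics.fmean(g[0] for g in groups)
mean2 = statistics.fmean(g[1] for g in groups)
cov_marginal = statistics.fmean(
    (g[0] - mean1) * (g[1] - mean2) for g in groups)  # should be close to tau^2 = 4
```

This also confirms the intuition in (a) and (b): the marginal variance exceeds the within-group variance because it includes the extra between-group spread $\tau^2$, and the marginal covariance is positive because two observations from the same group carry shared information about the unknown $\theta_j$.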

Solution ID:10137771 | Question answered on 16-Oct-2016

Price: *$14.65*