BAYESIAN STATISTICS
Student Name
Class name
Date
Assignment – Bayesian statistics
Assume a negative binomial regression model as follows:
1. Explain an algorithm for extracting β = (β_1, β_2, β_3) and κ.

To obtain an algorithm for extracting β = (β_1, β_2, β_3) and κ, we first write down the probability distribution of a zero-inflated negative binomial model,

Pr(y_i = j) = π + (1 − π) g(y_i = 0)   if j = 0
Pr(y_i = j) = (1 − π) g(y_i)           if j > 0

where

g(y_i) = Pr(Y = y_i | μ_i, α) = [Γ(y_i + α^{-1}) / (Γ(α^{-1}) Γ(y_i + 1))] · (1 / (1 + α μ_i))^{α^{-1}} · (α μ_i / (1 + α μ_i))^{y_i}.

The negative binomial component includes an exposure time t and k regressor values x_i.
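As a quick numerical sanity check on the negative binomial pmf g(y_i) above, the sketch below codes it directly with log-gamma terms and confirms that it sums to one and has mean μ. The values μ = 2.5 and α = 0.7 are made-up illustrative choices, not taken from the assignment.

```python
import math

def nb_pmf(y, mu, alpha):
    """Negative binomial pmf g(y | mu, alpha) in the mean/overdispersion
    parameterization above: E[Y] = mu, Var[Y] = mu + alpha * mu**2."""
    inv_a = 1.0 / alpha
    log_p = (math.lgamma(y + inv_a) - math.lgamma(inv_a) - math.lgamma(y + 1)
             + inv_a * math.log(1.0 / (1.0 + alpha * mu))
             + y * math.log(alpha * mu / (1.0 + alpha * mu)))
    return math.exp(log_p)

mu, alpha = 2.5, 0.7                       # made-up illustrative values
probs = [nb_pmf(y, mu, alpha) for y in range(200)]
total = sum(probs)                         # tail beyond 200 is negligible here
mean = sum(y * p for y, p in enumerate(probs))
print(round(total, 6), round(mean, 6))     # → 1.0 2.5
```

Working on the log scale avoids overflow in the gamma ratios for large y.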
Thus,

μ_i = exp{ ln(t_i) + β_1 x_{1i} + β_2 x_{2i} + ⋯ + β_k x_{ki} }.

Then calculate the MLE of μ_i. Taking the natural log of both sides,

ln(μ_i) = ln(t_i) + β_1 x_{1i} + β_2 x_{2i} + ⋯ + β_k x_{ki}.

But ln(t_i) is a known constant (the exposure offset), so

ln(μ_i) − ln(t_i) = β_1 x_{1i} + β_2 x_{2i} + ⋯ + β_k x_{ki},

and maximizing the likelihood over this linear predictor yields (β_1, β_2, β_3).
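To make the offset step concrete, here is a minimal sketch with made-up coefficients β = (0.2, −0.5, 1.1), made-up regressor values, and exposure t = 3.0: subtracting ln(t_i) from ln(μ_i) recovers the linear predictor exactly.

```python
import math

# Made-up illustrative values (not from the assignment).
beta = [0.2, -0.5, 1.1]
x = [1.0, 2.0, 0.5]          # regressor values x_1i, x_2i, x_3i
t = 3.0                      # exposure time t_i

linpred = sum(b * xj for b, xj in zip(beta, x))   # beta' x
mu = math.exp(math.log(t) + linpred)              # mu_i = exp{ln(t_i) + beta' x}

# Subtracting the known offset recovers the linear predictor exactly:
print(math.isclose(math.log(mu) - math.log(t), linpred))  # True
```

In standard GLM software this is exactly what an "offset" term does: ln(t_i) enters the linear predictor with its coefficient fixed at 1, so only β is estimated.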
2. Proof

y_i | λ_i ~ Poisson(λ_i),   λ_i | x_i ~ Gamma(κ, κμ_i),   log(μ_i) = β_1 + β_2 x_i + β_3 x_i^2.
The Poisson cdf (a partial sum of the pmf) is

P(X ≤ n) = e^{-λ} ∑_{k=0}^{n} λ^k / k!.

Define the upper incomplete gamma function

Γ(a, b) = ∫_b^∞ t^{a-1} e^{-t} dt.

For integer n > 0 this has the closed form

Γ(n, b) = (n − 1)! · e^{-b} ∑_{k=0}^{n-1} b^k / k!.
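The closed form Γ(n, b) = (n − 1)! e^{−b} ∑ b^k/k! can be checked against a direct numerical integration of the defining integral; n = 4 and b = 2.0 below are arbitrary test values, not from the assignment.

```python
import math

def upper_inc_gamma(a, b, width=60.0, steps=200_000):
    """Trapezoidal approximation of Γ(a, b) = ∫_b^∞ t^(a-1) e^(-t) dt,
    truncating the tail at t = b + width (negligible for moderate a, b)."""
    f = lambda t: t ** (a - 1) * math.exp(-t)
    h = width / steps
    s = 0.5 * (f(b) + f(b + width))
    s += sum(f(b + i * h) for i in range(1, steps))
    return s * h

n, b = 4, 2.0                              # arbitrary test values
closed_form = (math.factorial(n - 1) * math.exp(-b)
               * sum(b ** k / math.factorial(k) for k in range(n)))
numeric = upper_inc_gamma(n, b)
print(abs(closed_form - numeric) < 1e-5)   # True
```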
Equivalently, in terms of the regularized lower incomplete gamma function,

P(X ≤ x) = (1 / Γ(α)) ∫_0^λ t^{α-1} e^{-t} dt.

But

μ_i = exp{ ln(t_i) + β_1 x_{1i} + β_2 x_{2i} + ⋯ + β_k x_{ki} }.

Then calculate the MLE of μ_i. Taking the natural log of both sides,

ln(μ_i) = ln(t_i) + β_1 x_{1i} + β_2 x_{2i} + ⋯ + β_k x_{ki}.

But ln(t_i) is a known constant (the exposure offset), so

ln(μ_i) − ln(t_i) = β_1 x_{1i} + β_2 x_{2i} + ⋯ + β_k x_{ki}.
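The Poisson–Gamma mixture claim in part 2 can be verified numerically: integrating the Poisson pmf against a Gamma density recovers a negative binomial pmf in closed form. Note the assignment writes λ_i | x_i ~ Gamma(κ, κμ_i); the sketch below treats the second Gamma argument generically as a rate parameter and uses made-up values κ = 1.8, rate = 0.9.

```python
import math

def gamma_pdf(lam, shape, rate):
    # Gamma density in the shape/rate parameterization
    return rate ** shape * lam ** (shape - 1) * math.exp(-rate * lam) / math.gamma(shape)

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def mixture_pmf(y, shape, rate, upper=60.0, steps=100_000):
    # Numerically integrate ∫ Poisson(y | lam) * Gamma(lam; shape, rate) dlam
    h = upper / steps
    f = lambda lam: poisson_pmf(y, lam) * gamma_pdf(lam, shape, rate)
    s = 0.5 * (f(1e-12) + f(upper))      # 1e-12 avoids 0**(shape-1) for shape < 1
    s += sum(f(i * h) for i in range(1, steps))
    return s * h

def nb_pmf(y, shape, rate):
    # Closed-form marginal: negative binomial with r = shape, p = rate / (1 + rate)
    p = rate / (1.0 + rate)
    return (math.gamma(y + shape) / (math.gamma(shape) * math.factorial(y))
            * p ** shape * (1.0 - p) ** y)

kappa, rate = 1.8, 0.9                   # made-up illustrative values
for y in range(6):
    assert abs(mixture_pmf(y, kappa, rate) - nb_pmf(y, kappa, rate)) < 1e-5
print("Poisson-Gamma mixture matches the negative binomial pmf")
```

This is exactly the step the proof needs: mixing Poisson(λ) over a Gamma prior on λ yields the negative binomial marginal, which is why the regression model in part 2 is a negative binomial regression.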