Mathematics for Machine Learning, by M. P. Deisenroth, A. A. Faisal, and C. S. Ong (excerpt)

Bayesian inference is about learning the distribution of random variables. A point estimate $\theta^*$ of the parameters yields a single prediction $p(x \mid \theta^*)$, which a downstream system uses to make decisions. These decision-making systems typically have different objective functions than the likelihood, e.g., a squared-error loss or a misclassification error. Therefore, having the full posterior distribution around can be extremely useful and leads to more robust decisions. Bayesian inference is about finding this posterior distribution (Gelman et al., 2004). For a dataset $\mathcal{X}$, a parameter prior $p(\theta)$, and a likelihood function, the posterior

$$p(\theta \mid \mathcal{X}) = \frac{p(\mathcal{X} \mid \theta)\, p(\theta)}{p(\mathcal{X})}\,, \qquad p(\mathcal{X}) = \int p(\mathcal{X} \mid \theta)\, p(\theta)\, \mathrm{d}\theta\,, \tag{8.22}$$

is obtained by applying Bayes' theorem. The key idea is to exploit Bayes' theorem to invert the relationship between the parameters $\theta$ and the data $\mathcal{X}$ (given by the likelihood) to obtain the posterior distribution $p(\theta \mid \mathcal{X})$. The implication of having a posterior distribution on the parameters is that it can be used to propagate uncertainty from the parameters to the data. More specifically, with a distribution $p(\theta)$ on the parameters, our predictions will be

$$p(x) = \int p(x \mid \theta)\, p(\theta)\, \mathrm{d}\theta = \mathbb{E}_\theta[p(x \mid \theta)]\,, \tag{8.23}$$

and they no longer depend on the model parameters $\theta$, which have been marginalized/integrated out. Equation (8.23) reveals that the prediction is an average over all plausible parameter values $\theta$, where the plausibility ...
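To make (8.22) and (8.23) concrete, here is a minimal numerical sketch (not from the book): Bayesian inference for the heads probability $\theta$ of a coin, on a discrete grid so that the integrals become sums. The dataset (7 heads in 10 flips), the grid resolution, and the uniform prior are all illustrative assumptions.

```python
import numpy as np

# Grid of candidate parameter values theta = probability of heads.
# Discretizing turns the integrals in (8.22) and (8.23) into sums.
theta = np.linspace(0.0, 1.0, 201)

# Prior p(theta): uniform over the grid (an illustrative choice).
prior = np.full_like(theta, 1.0 / theta.size)

# Dataset X: 7 heads out of 10 Bernoulli trials (made-up numbers).
heads, trials = 7, 10

# Likelihood p(X | theta), evaluated at every grid point.
likelihood = theta**heads * (1.0 - theta)**(trials - heads)

# Bayes' theorem, Eq. (8.22): posterior = likelihood * prior / evidence,
# where the evidence p(X) is the sum (integral) over all theta.
evidence = np.sum(likelihood * prior)
posterior = likelihood * prior / evidence

# Point-estimate route: predict with p(x | theta*) at the MAP value.
theta_map = theta[np.argmax(posterior)]

# Bayesian route, Eq. (8.23): p(x) = E_theta[p(x | theta)], averaging
# the prediction over the posterior; theta is integrated out.
# For a coin, p(next flip = heads | theta) is just theta itself.
p_heads_bayes = np.sum(theta * posterior)

print(f"plug-in prediction  p(heads | theta*) = {theta_map:.3f}")
print(f"Bayesian prediction E[p(heads|theta)] = {p_heads_bayes:.3f}")
```

With these numbers the plug-in prediction is 0.700, while the marginalized prediction comes out near 0.667 (the mean of the Beta(8, 4) posterior that this grid approximates): averaging over all plausible values of $\theta$ tempers the point estimate, which is exactly the robustness argument the passage makes.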