# Machine Learning and Naive Bayes discussion questions

Posted by Yla_fha in Computer Science (user-generated)

## Description

See the attachments below for more details.

### Attachment Preview

1. Suppose that we have two bags, each containing black and white balls. One bag contains three times as many white balls as black. The other bag contains three times as many black balls as white. Suppose we choose one of these bags at random. From this bag we select five balls at random, replacing each ball after it has been selected. The result is that we find 4 white balls and one black. What is the probability that we were using the bag with mainly white balls?

### Unformatted Attachment Preview

Question 4 - MLE and Naive Bayes - 10 points. These questions are based on the maximum likelihood estimator and Naive Bayes.

1. Suppose that we have two bags, each containing black and white balls. One bag contains three times as many white balls as black. The other bag contains three times as many black balls as white. Suppose we choose one of these bags at random. From this bag we select five balls at random, replacing each ball after it has been selected. The result is that we find 4 white balls and one black. What is the probability that we were using the bag with mainly white balls?
2. Given n observations x₁, x₂, ..., xₙ, show that the maximum likelihood estimate of a normal distribution has mean μ = (Σᵢ xᵢ)/n and variance σ² = (Σᵢ (xᵢ − μ)²)/n. Hint: consider the formula for the normal distribution and find the partial derivatives with respect to μ and σ.
3. Describe how random, max, and Bayesian classifiers are used for prediction, and provide at least one scenario where each classifier can perform as well as the other two. If this is not possible, state why.
4. Suppose that the probability distribution of a random variable representing categories is given by P(X = k) = 1/2^k, where k is the category. Only the categories with probability at least 0.03 are included in the computation of the naive Bayes estimator. The probability of feature i in category j is given by 1/2^(i+j). Compute the probability that a given observation with 3 features is in category 2.
5. Using the same assumptions as the previous part, deduce a general formula for the probability that a given observation with n features is in the category with index n/2. No need to simplify.
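Question 2 above asks for the analytic MLE derivation; as a hedged numerical sanity check (a minimal sketch using only the standard library, not the requested derivation), one can verify that the sample mean and the biased sample variance do maximize the Gaussian log-likelihood, since perturbing either parameter lowers it:

```python
import math
import random

# Synthetic data (illustrative only; any sample works)
random.seed(0)
xs = [random.gauss(2.0, 1.5) for _ in range(200)]
n = len(xs)

mu_hat = sum(xs) / n                              # candidate MLE mean
var_hat = sum((x - mu_hat) ** 2 for x in xs) / n  # candidate MLE (biased) variance

def log_lik(mu, var):
    """Gaussian log-likelihood of the sample under N(mu, var)."""
    return sum(-0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
               for x in xs)

best = log_lik(mu_hat, var_hat)
# Any perturbation of the parameters gives a lower log-likelihood
assert best >= log_lik(mu_hat + 0.1, var_hat)
assert best >= log_lik(mu_hat - 0.1, var_hat)
assert best >= log_lik(mu_hat, var_hat * 1.2)
```

This checks the claim numerically; the assignment still requires setting the partial derivatives of the log-likelihood with respect to μ and σ to zero.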

Here is your assignment :). Let me know if you need more help :)


Question 4 - MLE and Naive Bayes - 10 points
1. Solution:

By Bayes' rule,

P(a₁|B) = P(B|a₁)·P(a₁) / (P(B|a₁)·P(a₁) + P(B|a₂)·P(a₂))

The probability of B (4 white balls and 1 black in 5 draws with replacement) for a₁, the bag with mostly white balls, is:

P(B|a₁) = C(5, 4)·(3/4)⁴·(1/4) = 405/1024

Similarly, for a₂, the bag with mostly black balls:

P(B|a₂) = C(5, 4)·(1/4)⁴·(3/4) = 15/1024

Since each bag is chosen with equal probability, P(a₁) = P(a₂) = 1/2 cancels in the ratio, and we have:

P(a₁|B) = (405/1024) / (405/1024 + 15/1024) = 405/420 = 27/28 ≈ 0.964
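The arithmetic above can be double-checked with a short script (a minimal sketch using only the standard library; the bag labels a₁/a₂ follow the solution's notation):

```python
from math import comb

def binom_pmf(k, n, p):
    """Probability of k successes in n independent draws with success probability p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Bag a1 is 3/4 white; bag a2 is 1/4 white; each bag has prior 1/2
like_a1 = binom_pmf(4, 5, 3 / 4)  # P(B|a1) = 405/1024
like_a2 = binom_pmf(4, 5, 1 / 4)  # P(B|a2) = 15/1024

# Equal priors cancel, so the posterior is the normalized likelihood
posterior_a1 = like_a1 / (like_a1 + like_a2)  # 405/420 = 27/28
print(posterior_a1)
```

Running this reproduces the likelihoods 405/1024 and 15/1024 and the posterior 27/28 ≈ 0.964.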
