
All Questions

2 votes
2 answers
109 views

If I engineer a new feature such that feature C = feature A/feature B, must I drop features A and B from a Gaussian Naive Bayes model?

As the title asks, is it bad practice to keep the dividend (A) and divisor (B) features when adding a new feature that is their quotient to a Naive Bayes model? My understanding of ...
NaiveBae • 257
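A quotient feature typically remains strongly correlated with its inputs, which is why keeping A, B, and C = A/B together strains the conditional-independence assumption. A minimal pure-Python sketch with made-up Gaussian data (the names `a`, `b`, `c` and their distributions are illustrative assumptions, not the question's data):

```python
import random
import statistics

random.seed(0)
a = [random.gauss(10, 2) for _ in range(5000)]  # hypothetical feature A
b = [random.gauss(5, 1) for _ in range(5000)]   # hypothetical feature B
c = [ai / bi for ai, bi in zip(a, b)]           # engineered feature C = A / B

def pearson(x, y):
    """Sample Pearson correlation of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = statistics.fmean((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return cov / (statistics.pstdev(x) * statistics.pstdev(y))

# A and B are independent by construction, but C inherits much of A's
# information, so a model treating all three as independent double-counts it.
print(pearson(a, b))  # near zero
print(pearson(a, c))  # strongly positive
```

Whether this correlation hurts enough to justify dropping A and B is an empirical question; Naive Bayes is often robust to moderate violations of its assumption.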
1 vote
0 answers
227 views

Improving the Naive Bayes classifier performance through decorrelation?

I was wondering whether it is possible to improve the performance of the Naïve Bayes classifier by decorrelating the data. Naïve Bayes assumes conditional independence of the features given some class $...
ibayramli
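One simple decorrelation scheme is residualizing: subtract from one feature its best linear predictor from another, which drives their sample covariance to zero by construction. A sketch on synthetic data (the variables and coefficients here are assumptions for illustration; note that Naive Bayes needs independence *given the class*, so in practice this would be applied within each class):

```python
import random
import statistics

random.seed(1)
x1 = [random.gauss(0, 1) for _ in range(4000)]
x2 = [0.8 * v + random.gauss(0, 0.6) for v in x1]  # correlated with x1

def cov(x, y):
    """Population covariance of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return statistics.fmean((a - mx) * (b - my) for a, b in zip(x, y))

# Residualize x2 on x1: subtract the least-squares linear fit of x2 from x1.
beta = cov(x1, x2) / cov(x1, x1)
x2_dec = [b - beta * a for a, b in zip(x1, x2)]

print(cov(x1, x2))      # clearly positive before decorrelation
print(cov(x1, x2_dec))  # zero (up to floating-point error) after
```

For more than two features, the analogous step is a full whitening transform (e.g. via an eigen-decomposition of the per-class covariance matrix).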
1 vote
2 answers
395 views

Derivation of the formula for the probability of a class, given conditionally independent attributes

The following is a formula that finds the posterior probability of a class (i.e. yes or no) given four conditionally independent attributes: $$P(c|X) = P(x_1|c)\cdot P(x_2|c)\cdot P(x_3|c)\cdot P(x_4|...
spacedustpi
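For context, the excerpt's right-hand side shows only the product of likelihoods; the full Bayes-rule statement also carries the class prior $P(c)$ and the evidence $P(X)$. Assuming the four attributes are conditionally independent given $c$, the standard derivation is:

$$P(c \mid X) = \frac{P(X \mid c)\,P(c)}{P(X)} = \frac{P(c)\,\prod_{i=1}^{4} P(x_i \mid c)}{P(X)} \propto P(c)\prod_{i=1}^{4} P(x_i \mid c),$$

where the middle step uses $P(X \mid c) = \prod_i P(x_i \mid c)$, and $P(X)$ can be dropped for classification because it is constant across classes.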
0 votes
0 answers
833 views

Naive Bayes example by hand

Given the following data ...
baxx • 946
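A by-hand Naive Bayes computation amounts to counting: estimate the class priors and per-class likelihoods from frequencies, multiply, and normalize. A sketch on a hypothetical toy dataset (this is *not* the data from the question, which is elided in the excerpt):

```python
from collections import Counter

# Hypothetical (weather, play) observations for illustration only.
data = [("sunny", "yes"), ("sunny", "no"), ("rain", "yes"),
        ("rain", "yes"), ("sunny", "no"), ("rain", "yes")]

def nb_posterior(weather):
    """Posterior P(class | weather) from raw counts, one categorical feature."""
    class_counts = Counter(play for _, play in data)
    scores = {}
    for c, n_c in class_counts.items():
        prior = n_c / len(data)                                   # P(c)
        likelihood = sum(1 for w, p in data
                         if p == c and w == weather) / n_c        # P(weather | c)
        scores[c] = prior * likelihood
    z = sum(scores.values())  # normalize so posteriors sum to 1
    return {c: s / z for c, s in scores.items()}

# P(yes) = 4/6, P(sunny|yes) = 1/4; P(no) = 2/6, P(sunny|no) = 2/2.
# Unnormalized scores 1/6 and 1/3 normalize to 1/3 and 2/3.
print(nb_posterior("sunny"))
```

With several attributes, the likelihood becomes a product of one such count ratio per attribute, exactly as in the posterior formula above.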
3 votes
1 answer
652 views

Conditional Independence Example

Is there a canonical example of data which are conditionally independent? In other words, $X_1,\ldots,X_p$ are mutually independent given $Y$. This is the foundational assumption of the naive Bayes ...
jjet • 1,287