  • I was just thinking: is there a mathematically precise way to define "overfitting"? If there is, you could likely build features into a likelihood function or a prior to prevent it from happening. My thinking is that this notion sounds similar to "outliers". Commented May 5, 2019 at 13:49
  • Your example of a posterior "reverting" to the prior is known as posterior collapse. It's not a bug in the Bayesian method but rather an indication that your sample contains insufficient information (at least for parts of your model). In a non-Bayesian setting, your model would simply be unidentified.
    – Durden
    Commented Mar 5 at 17:17
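
A minimal sketch of the identification point above, under assumptions not stated in the thread: a toy Gaussian model where each observation only informs the sum of two parameters. With independent N(0, 1) priors on a and b and observations y ~ N(a + b, 1), the conjugate posterior for the identified direction a + b concentrates as n grows, while the uninformed direction a - b keeps its prior variance exactly. The model and variable names here are illustrative, not from the original post.

```python
import numpy as np

# Toy unidentified model: y_i = a + b + noise, noise ~ N(0, 1),
# with independent priors a, b ~ N(0, 1). Only the sum a + b is identified.
n = 1000  # number of observations

# Conjugate Gaussian update: posterior precision = prior precision + data precision.
prior_prec = np.eye(2)                  # priors a, b ~ N(0, 1)
x_row = np.array([1.0, 1.0])            # each observation sees a + b
data_prec = n * np.outer(x_row, x_row)  # n observations, unit noise variance
post_cov = np.linalg.inv(prior_prec + data_prec)

# Variance of the identified direction a + b shrinks like 2 / (2n + 1) ...
var_sum = post_cov[0, 0] + post_cov[1, 1] + 2 * post_cov[0, 1]
# ... while the orthogonal direction a - b keeps its prior variance of 2,
# no matter how much data arrives: the posterior "reverts" to the prior there.
var_diff = post_cov[0, 0] + post_cov[1, 1] - 2 * post_cov[0, 1]

print(var_sum)   # shrinks toward 0 as n grows
print(var_diff)  # stays at 2.0, the prior variance of a - b
```

This is the Gaussian special case where the collapse is exact; in more general models the posterior along unidentified directions simply stays close to the prior rather than matching it term for term.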