Commit b640902 · Update README.md
1 parent b75fa6b

1 file changed: README.md (15 additions, 0 deletions)
@@ -466,6 +466,21 @@ Bayes' theorem can be stated as follows:

and how likely A is on its own, written P(A)
and how likely B is on its own, written P(B)

#### Bayes' Theorem Rule:

The rule has a simple derivation that follows directly from the relationship between joint and conditional probabilities. First, note that P(A,B) = P(A|B)P(B) and, by symmetry, P(B,A) = P(B|A)P(A). Since P(A,B) = P(B,A), we can set the two expressions equal: P(A|B)P(B) = P(B|A)P(A). Finally, dividing both sides by P(B) gives Bayes' rule: P(A|B) = P(B|A)P(A) / P(B).
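The derivation above can be checked numerically: pick any joint distribution over two events and confirm that Bayes' rule reproduces the conditional probability computed directly from the joint. This is a minimal sketch with made-up joint probabilities (the numbers are assumptions, not from the text):

```python
# Hypothetical joint distribution P(A=a, B=b); the four entries sum to 1.
p_joint = {
    (True, True): 0.08,
    (True, False): 0.02,
    (False, True): 0.27,
    (False, False): 0.63,
}

# Marginals from the joint.
p_a = p_joint[(True, True)] + p_joint[(True, False)]   # P(A)
p_b = p_joint[(True, True)] + p_joint[(False, True)]   # P(B)

# Conditional from the joint: P(B|A) = P(A,B) / P(A).
p_b_given_a = p_joint[(True, True)] / p_a

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B).
p_a_given_b = p_b_given_a * p_a / p_b

# Direct definition agrees: P(A|B) = P(A,B) / P(B).
assert abs(p_a_given_b - p_joint[(True, True)] / p_b) < 1e-12
```

Any other valid joint distribution would pass the same check, since both sides reduce to P(A,B)/P(B).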
In this formula, A is the event whose probability we want, and B is the new evidence that is related to A in some way.
P(A|B) is called the **posterior**; this is what we are trying to estimate. In the above example, this would be the "probability of having cancer given that the person is a smoker".
P(B|A) is called the **likelihood**; this is the probability of observing the new evidence, given our initial hypothesis. In the above example, this would be the "probability of being a smoker given that the person has cancer".
P(A) is called the **prior**; this is the probability of our hypothesis before taking the new evidence into account. In the above example, this would be the "probability of having cancer".
P(B) is called the **marginal likelihood**; this is the total probability of observing the evidence. In the above example, this would be the "probability of being a smoker". In many applications of Bayes' rule, this term is ignored, as it serves only as a normalizing constant.
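The four quantities can be illustrated on the cancer/smoker example using the law of total probability to obtain the marginal likelihood, and then showing why P(B) can be "ignored": normalizing unnormalized scores at the end gives the same posterior. All numbers here are hypothetical, chosen only for illustration:

```python
# Hypothetical numbers for the cancer/smoker example.
p_cancer = 0.01                   # prior P(A)
p_smoker_given_cancer = 0.30      # likelihood P(B|A)
p_smoker_given_no_cancer = 0.15   # assumed P(B|not A)

# Marginal likelihood via the law of total probability:
# P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_smoker = (p_smoker_given_cancer * p_cancer
            + p_smoker_given_no_cancer * (1 - p_cancer))

# Posterior via Bayes' rule.
posterior = p_smoker_given_cancer * p_cancer / p_smoker

# "Ignoring" P(B): compute unnormalized scores for each hypothesis,
# then normalize at the end -- the result is identical.
scores = {
    "cancer": p_smoker_given_cancer * p_cancer,
    "no_cancer": p_smoker_given_no_cancer * (1 - p_cancer),
}
total = sum(scores.values())
assert abs(scores["cancer"] / total - posterior) < 1e-12
```

This is why classifiers built on Bayes' rule can skip computing P(B) explicitly: it is the same constant for every hypothesis being compared.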
##### Example:
Let us say P(Fire) means how often there is fire, and P(Smoke) means how often we see smoke. Then:

P(Fire|Smoke) means how often there is fire when we can see smoke
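Plugging illustrative numbers into Bayes' rule makes the fire/smoke example concrete. The probabilities below are assumptions invented for this sketch, not values from the text:

```python
# Assumed illustrative probabilities.
p_fire = 0.01              # P(Fire): how often there is fire
p_smoke = 0.10             # P(Smoke): how often we see smoke
p_smoke_given_fire = 0.90  # P(Smoke|Fire): smoke is very likely when there is fire

# Bayes' rule: P(Fire|Smoke) = P(Smoke|Fire) P(Fire) / P(Smoke)
p_fire_given_smoke = p_smoke_given_fire * p_fire / p_smoke
```

With these numbers, seeing smoke raises the probability of fire from the 1% prior to 9%: evidence that is common under the hypothesis but rare overall shifts the posterior upward.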
