
D. How is Bayes' theorem important in avoiding fallacies commonly made in law and courtrooms?

Base rate fallacy


When presented with both related generic, general information and specific information (information concerning a particular case), the mind tends to ignore the former and focus on the latter.1 This fallacy can be avoided by using Bayes' theorem, because it takes into account both the prior probability (the probability assessed before any relevant observations are made, often set subjectively or on the assumption that all possible outcomes are equally likely) and the posterior probability (the conditional probability assigned after the relevant evidence or background is taken into account).
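To make the roles of the prior and posterior concrete, here is a minimal sketch in Python of a generic Bayes update (the function name bayes_posterior and its arguments are illustrative and not part of the source):

def bayes_posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H|E) from a prior P(H) and the two likelihoods."""
    # Law of total probability: P(E) = P(E|H)P(H) + P(E|not H)P(not H)
    p_evidence = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    # Bayes' theorem: P(H|E) = P(E|H)P(H) / P(E)
    return p_evidence_given_h * prior / p_evidence

Calling bayes_posterior(0.001, 1.00, 0.05) with the breathalyzer figures used in the example below returns roughly 0.0196, the posterior probability worked out later in this section.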

 

Example: Drunk drivers measured by a breathalyzer.

The conditions are as follows:

- Breathalyzers display a false positive result for 5% of the sober drivers tested.

- They never fail to detect a driver who is truly drunk.

- 1 in 1,000 (0.1%) of drivers on the road are drunk.

 

If a policeman stops a driver at random and the breathalyzer displays that the driver is drunk, what is the probability that the driver is truly drunk?

Consider 1,000 drivers stopped at random:

- For the 1 driver who is truly drunk, there is a 100% chance of a true positive test result, as the second condition states. Thus, the number of true positive results is 1.

- For the other 999 drivers, who are not drunk, there is a 5% chance of a false positive test result each, as the first condition states. Thus, the expected number of false positive results is 49.95 (999 x 5%).

From this, we can calculate the probability that a driver who received one of the 50.95 (1 + 49.95) positive test results is really drunk:

1 / 50.95 = 0.019627
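The frequency count can be verified with a short Python sketch, assuming the 1,000-driver sample described above (variable names are illustrative):

# Frequency-count check for a hypothetical sample of 1,000 drivers stopped at random
drivers = 1000
drunk = 1                        # 1 in 1,000 drivers is truly drunk
sober = drivers - drunk          # the remaining 999 drivers are sober

true_positives = drunk * 1.00    # condition 2: a drunk driver is always detected
false_positives = sober * 0.05   # condition 1: 5% false positives among sober drivers

probability = true_positives / (true_positives + false_positives)
print(probability)               # 0.0196270..., i.e. 1 / 50.95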

 

This can be confirmed with Bayes' theorem.

Notations:

D = drunk

S = sober

B = breathalyzer indicates that the driver is drunk

Result           Drunk    Sober
Test Positive    1        0.05
Test Negative    0        0.95
Historic Data    0.001    0.999

 

Notation form:

P(D) = 0.001

P(S) = 0.999

P(B|D) = 1.00

P(B|S) = 0.05

Figure 1: Venn diagram depicting the probability of the driver being drunk or sober and the results from the breathalyzer.

 

The probability of the driver being drunk, given that the breathalyzer indicates that he or she is drunk, is represented by P(D|B).

According to Bayes' theorem:

P(D|B) = P(B|D)P(D) / P(B)

The denominator P(B), the overall probability that the breathalyzer gives a positive result, follows from the law of total probability:

P(B) = P(B|D)P(D) + P(B|S)P(S)

 

Computing the above values to obtain the value of P(B) (refer to the table and notation form above):

P(B) = (1.00 x 0.001) + (0.05 x 0.999)
     = 0.05095

 

Substituting all values into the Bayes' theorem formula:

P(D|B) = (1.00 x 0.001) / 0.05095
       = 0.019627
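As a numerical check, here is a minimal Python sketch of the same calculation, using the values from the table and notation form above (variable names are illustrative):

# Bayes' theorem check using the values from the table above
p_d = 0.001          # P(D): historic proportion of drunk drivers
p_s = 0.999          # P(S) = 1 - P(D)
p_b_given_d = 1.00   # P(B|D): the breathalyzer always flags a drunk driver
p_b_given_s = 0.05   # P(B|S): 5% false positive rate for sober drivers

p_b = p_b_given_d * p_d + p_b_given_s * p_s   # law of total probability
p_d_given_b = p_b_given_d * p_d / p_b         # Bayes' theorem
print(round(p_d_given_b, 6))                  # 0.019627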

1 Woodcock, S. (2017, April 5). Paradoxes of probability and other statistical strangeness.