
Bayesian Inference

To infer knowledge from a Bayesian network, we use Bayesian inference. First, we introduce some terminology:

Prior Odds express the predictive support for a hypothesis H given only background knowledge:

\begin{displaymath}
O(H)=\frac{P(H)}{P(\neg H)} = \frac{P(H)}{1-P(H)}
\end{displaymath}

Likelihood Ratio is the diagnostic support given to H by the observed evidence e:

\begin{displaymath}
L(e \mid H)=\frac{P(e \mid H)}{P(e \mid \neg H)}
\end{displaymath}

Posterior Odds express the predictive support for H given the observed evidence e:

\begin{displaymath}
O(H \mid e)= \frac{P(H \mid e)}{P(\neg H \mid e)} = L(e \mid H) O(H)
\end{displaymath}
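The three definitions above translate directly into small helper functions. The following is a minimal sketch; the function names are illustrative, not from the text:

```python
def odds(p):
    """Prior odds: O(H) = P(H) / (1 - P(H))."""
    return p / (1.0 - p)

def likelihood_ratio(p_e_given_h, p_e_given_not_h):
    """Likelihood ratio: L(e | H) = P(e | H) / P(e | not H)."""
    return p_e_given_h / p_e_given_not_h

def posterior_odds(p_h, p_e_given_h, p_e_given_not_h):
    """Posterior odds: O(H | e) = L(e | H) * O(H)."""
    return likelihood_ratio(p_e_given_h, p_e_given_not_h) * odds(p_h)
```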

Using these definitions, we can infer knowledge just as in probability theory. We illustrate this with an example:

A salesman has installed an alarm in a shop, and it is known that the alarm obeys the following probabilities:

\begin{displaymath}
P(Alarm \mid Burglar) = 0.95
\end{displaymath}

\begin{displaymath}
P(Alarm \mid \neg Burglar) = 0.01
\end{displaymath}

\begin{displaymath}
P(Burglar) = 10^{-4}
\end{displaymath}

To find the probability that there is a burglar given that the alarm goes off, we first find the posterior odds of there being a burglar:

\begin{displaymath}
O(Burglar \mid Alarm) = L(Alarm \mid Burglar) \cdot O(Burglar)
\approx \frac{0.95}{0.01} \cdot 10^{-4} = 0.0095
\end{displaymath}

where we use that O(Burglar) is approximately P(Burglar) = 10^{-4}, since P(Burglar) is small.

To convert odds back to a probability, we rearrange the definition of the prior odds:

\begin{displaymath}
P(A) = \frac{O(A)}{1+O(A)}
\end{displaymath}

This implies:

\begin{displaymath}
P(Burglar \mid Alarm) = \frac{0.0095}{1+0.0095} \approx 0.00941
\end{displaymath}
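The whole calculation above can be checked numerically. The following is a self-contained sketch using the probabilities from the alarm example:

```python
# Probabilities from the alarm example
p_alarm_given_burglar = 0.95      # P(Alarm | Burglar)
p_alarm_given_no_burglar = 0.01   # P(Alarm | not Burglar)
p_burglar = 1e-4                  # P(Burglar)

# Prior odds O(Burglar) = P / (1 - P); for small P this is close to P itself
prior_odds = p_burglar / (1.0 - p_burglar)

# Likelihood ratio L(Alarm | Burglar)
lr = p_alarm_given_burglar / p_alarm_given_no_burglar

# Posterior odds, then convert odds back to a probability
post_odds = lr * prior_odds
p_burglar_given_alarm = post_odds / (1.0 + post_odds)

print(round(post_odds, 4))              # ~0.0095
print(round(p_burglar_given_alarm, 5))  # ~0.00941
```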



Torgeir Dingsoyr
2/26/1998