Belief Propagation Algorithm

AI · 3 min read · Nov 25, 2020


The Belief Propagation algorithm was proposed by Judea Pearl in 1982 as an exact inference algorithm on trees and polytrees.

The algorithm is also known as the message-passing or sum-product algorithm. It is mainly used to perform inference, i.e., to derive conclusions, from a graphical model such as a directed acyclic graph (DAG).

Fig 1

Where and How Is It Used?

It is mostly used in Bayesian networks, Markov random fields, low-density parity-check (LDPC) codes, turbo codes, etc.

Let us see how it is used in Bayesian belief networks.

Fig 2

Bayesian Belief Network
· A Bayesian belief network is a graphical model that represents the conditional dependencies between nodes.

· In the graph above, the nodes represent variables, which can be discrete, and the edges represent causal relationships.

· Belief Propagation allows the variables to talk to each other and exchange information.

· In the above example, Belief Propagation helps us to:

Predict umbrella sales depending on the weather.

Similarly, BPA can show that the probability of green leaves is high when there is a high probability of a rainy season.

Predict health depending on the weather.

· The Belief Propagation algorithm uses two types of probabilities in Bayesian networks:

Joint probability

Conditional probability

The joint probability factorizes into a product of conditional probabilities, one per node:

P(x1, x2, x3, …, xn) = ∏ᵢ P(xi | Parents(xi))
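The factorization above can be sketched for a tiny two-node network. The structure (Rain → WetGrass) and all the numbers here are assumptions made for illustration:

```python
# Hypothetical network: Rain -> WetGrass. All probabilities are made up
# for illustration only.

P_rain = {True: 0.3, False: 0.7}            # P(R)
P_wet_given_rain = {True: 0.9, False: 0.1}  # P(W=True | R)

def joint(rain, wet):
    """P(R=rain, W=wet) = P(R) * P(W | R) — the chain-rule factorization."""
    p_wet_true = P_wet_given_rain[rain]
    p_wet = p_wet_true if wet else 1.0 - p_wet_true
    return P_rain[rain] * p_wet

# Sanity check: the joint distribution sums to 1 over all assignments.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
print(total)  # 1.0
```

Each node contributes exactly one factor, which is what keeps inference tractable on trees.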

Let’s elaborate on the example below using the Belief Propagation algorithm with joint and conditional probability.

Fig 3

In the above example, BP uses the following properties:

· Likelihood function: It holds information about the observations of the children; e.g., the likelihood of wet grass is high when there is rain. This function is the product of all incoming messages from the children.

· Priors: These include the probabilities of events that are already known or given. This function sums over all possible combinations of parent values, times the product of the respective incoming messages.

· Belief: This function takes the probabilities that are known and generates a new belief from them.
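In Pearl's formulation the belief at a node is the normalized product of its prior (π) message from the parents and its likelihood (λ) message from the children. A minimal sketch for a binary variable, with message values assumed purely for illustration:

```python
# BEL(x) ∝ pi(x) * lambda(x), normalized over the states of x.
# The message values below are illustrative assumptions, not real data.

def belief(prior_msg, likelihood_msg):
    """Combine prior and likelihood messages into a normalized belief."""
    unnorm = [p * l for p, l in zip(prior_msg, likelihood_msg)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

pi = [0.3, 0.7]   # prior over Rain: [P(R=true), P(R=false)]
lam = [0.9, 0.1]  # likelihood from observing wet grass (assumed)
bel = belief(pi, lam)
print(bel)
```

Observing wet grass shifts the belief in rain well above its 0.3 prior, which is exactly the "new belief from known probabilities" described above.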


Suppose we have to find the probability of wet grass; it can be computed using the properties mentioned above:

P(W) = P(W|R) * P(R) + P(W|R') * P(R')
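Plugging assumed numbers into this sum-over-rain formula (the values are illustrative, not from the figure):

```python
# P(W) by summing out Rain. Assumed values for illustration:
# P(R) = 0.3, P(W | R) = 0.9, P(W | R') = 0.1.

p_r = 0.3
p_w_given_r = 0.9
p_w_given_not_r = 0.1

# P(W) = P(W|R) * P(R) + P(W|R') * P(R')
p_w = p_w_given_r * p_r + p_w_given_not_r * (1 - p_r)
print(round(p_w, 2))  # 0.34
```

This marginalization over the parent is exactly what the prior (π) message computes in Belief Propagation.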

Thus the certainty and uncertainty of random variables can be worked out using the Belief Propagation algorithm in Bayesian networks.

Contributors:

Shubham M. | Abrar M.

Aditya N. | Ayush R.
