The Graphical Model in Machine Learning

The graphical model is a subfield of machine learning. It uses a graph to represent a domain problem: the graph expresses the conditional dependence structure between random variables. Graphical models underlie many machine learning algorithms, for example:

  • The naive Bayes algorithm
  • The Hidden Markov Model
  • Restricted Boltzmann machine
  • Neural Networks
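The first item on this list is a good illustration of the idea. Below is a minimal sketch (all probabilities and feature names are made up for illustration) of naive Bayes viewed as a graphical model: the class C is the single parent of every feature Xi, so the joint factorizes as p(C, X1, X2) = p(C) p(X1 | C) p(X2 | C).

```python
# Naive Bayes as a graphical model: class C -> X1, C -> X2.
# All numbers below are illustrative, not from any real dataset.

p_c = {"spam": 0.3, "ham": 0.7}                                   # p(C)
p_x1 = {"spam": {"offer": 0.8, "hello": 0.2},                     # p(X1 | C)
        "ham":  {"offer": 0.1, "hello": 0.9}}
p_x2 = {"spam": {"!": 0.6, ".": 0.4},                             # p(X2 | C)
        "ham":  {"!": 0.2, ".": 0.8}}

def posterior(x1, x2):
    # p(C | x1, x2), computed from the factorization and normalized.
    scores = {c: p_c[c] * p_x1[c][x1] * p_x2[c][x2] for c in p_c}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

post = posterior("offer", "!")
print(max(post, key=post.get))  # prints "spam"
```

The two arrows into each feature node are exactly what license multiplying the per-feature conditionals, which is the "naive" independence assumption.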

In this article, we will learn about the graphical model, its types, and its applications.


There are many reasons to learn about graphical or probabilistic modeling. One is that it is a fascinating scientific field with a beautiful theory, one that bridges, in surprising ways, two very different branches of mathematics:

  • Probability and
  • Graph theory

It also has interesting connections to philosophy, particularly the question of causality. At the same time, probabilistic modeling is widely used throughout machine learning and in many real-world applications. These methods can solve problems in fields as varied as medicine, language processing, and computer vision, among many others. This combination of elegant theory and powerful applications makes graphical models one of the most fascinating topics in modern artificial intelligence and computer science.


  • Graphical models are graphs in which nodes represent random variables.
  • The absence of arcs represents conditional independence assumptions.
  • They provide a compact representation of joint probability distributions.
  • For instance, an atomic representation of the joint, P(X1, ..., XN), needs O(2^N) parameters if we have N binary random variables,
  • while a graphical model may need exponentially fewer,
  • depending on which conditional independence assumptions we make.
  • This can help both inference and learning.
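
The parameter-count claim above is easy to verify concretely. The sketch below (my own illustration, not from the article) compares the number of free parameters in an unstructured joint table over N binary variables with those of a simple chain-structured model X1 → X2 → ... → XN.

```python
# Parameter counts for N binary random variables.

def full_joint_params(n):
    # An unstructured joint table over n binary variables has 2^n
    # entries, minus 1 because they must sum to one.
    return 2 ** n - 1

def chain_params(n):
    # A chain X1 -> X2 -> ... -> Xn factorizes as
    # p(x1) * prod_i p(x_i | x_{i-1}): one parameter for p(X1=1),
    # plus two parameters per later node (one per parent value).
    return 1 + 2 * (n - 1)

for n in (5, 10, 20):
    print(n, full_joint_params(n), chain_params(n))
```

For N = 20, the full table needs 1,048,575 parameters while the chain needs only 39, which is the exponential saving the bullets describe.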

Types of Graphical Model

The two main types of graphical models are;

  • Directed graphical models (Bayesian networks, BNs)
  • Undirected graphical models (Markov networks or Markov random fields, MRFs)

Directed graphical models

  • We present a graphical language for specifying a graphical model, called the directed graphical model.
  • It gives a compact and concise way to specify probabilistic models.
  • It allows the reader to visually parse the dependencies between random variables.
  • A graphical model visually captures the way in which the joint distribution over all random variables can be decomposed into a product of factors,
  • each depending only on a subset of these variables.
  • The joint distribution by itself can be quite complicated.
  • It does not tell us anything about the structural properties of the probabilistic model.
  • For instance, the joint distribution p(a, b, c) does not tell us anything about independence relations.
  • This is where graphical models come into play.
  • In a graphical model, nodes are random variables.

[Figure: a graphical model with nodes for the random variables a, b, and c]

  • In the figure above, the nodes represent the random variables a, b, c.
  • Edges denote probabilistic relations between variables.
  • Not every distribution can be represented by a particular choice of graphical model.
  • Graphical models are a simple way to visualize the structure of a probabilistic model.
  • They can be used to design or motivate new kinds of statistical models.
  • Inspection of the graph alone gives us insight into properties such as conditional independence.
  • Complex computations for inference and learning in statistical models can be expressed in terms of graphical manipulations.
  • We can construct the corresponding directed graphical model from a factorized joint distribution as follows:
  1. Create a node for every random variable.
  2. For each conditional distribution, add a directed link (arrow) to the graph from the nodes corresponding to the variables on which the distribution is conditioned.
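
The two-step construction above can be sketched in a few lines. The example below (the variable names and factorization are my own illustration) builds the graph for the factorization p(a, b, c) = p(a) p(b | a) p(c | a, b).

```python
# Step 1: one node per random variable (the dict keys).
# Step 2: one arrow from each conditioning variable to the child.
parents = {
    "a": [],           # p(a) is conditioned on nothing
    "b": ["a"],        # p(b | a): edge a -> b
    "c": ["a", "b"],   # p(c | a, b): edges a -> c and b -> c
}

# Collect the directed edges (parent, child) implied by the factorization.
edges = [(p, child) for child, ps in parents.items() for p in ps]
print(sorted(edges))  # prints [('a', 'b'), ('a', 'c'), ('b', 'c')]
```

Reading the graph back, the parent set of each node recovers exactly the conditioning set of its factor, which is why the construction is reversible.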

Undirected Graphical Model

  • The undirected graph shown can have one of several interpretations.
  • The common one is that the presence of an edge implies some sort of dependence between the corresponding random variables.
  • From this graph, we might infer that B, C, and D are all mutually independent once A is known.
  • Equivalently, in this case, p(A, B, C, D) is proportional to f_AB(A, B) · f_AC(A, C) · f_AD(A, D)
  • for some non-negative functions f_AB, f_AC, and f_AD.
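
This star-shaped factorization can be made concrete for binary variables. In the sketch below, the factor values are made up for illustration; the point is that any non-negative factors yield a valid distribution once divided by the normalizing constant Z.

```python
# p(A, B, C, D) = f_AB(A, B) * f_AC(A, C) * f_AD(A, D) / Z
# over binary variables, where A is the hub of the star graph.
import itertools

def f_ab(a, b): return 2.0 if a == b else 1.0   # illustrative
def f_ac(a, c): return 1.0 + a * c              # non-negative
def f_ad(a, d): return 1.5 if a != d else 1.0   # factors

states = list(itertools.product([0, 1], repeat=4))
weights = {s: f_ab(s[0], s[1]) * f_ac(s[0], s[2]) * f_ad(s[0], s[3])
           for s in states}
Z = sum(weights.values())                        # normalizing constant
p = {s: w / Z for s, w in weights.items()}

# After normalization the values form a proper distribution.
print(round(sum(p.values()), 10))  # prints 1.0
```

Because each factor touches A and exactly one other variable, B, C, and D interact only through A, which is the conditional independence the graph encodes.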

Bayesian Network

  • If the network structure of the model is a directed acyclic graph, the model represents a factorization of the joint probability of all random variables.
  • More precisely, if the variables are X1, ..., Xn, then the joint probability satisfies p(X1, ..., Xn) = ∏ p(Xi | pa(Xi)),
  • where pa(Xi) is the set of parents of node Xi.
  • In other words, the joint distribution factors into a product of conditional distributions, one per node.
  • For the directed acyclic graph shown in the figure, for instance, the factorization can be read off directly from each node's parents.
  • Each node is conditionally independent of its non-descendants given the values of its parents.
  • Any two sets of nodes are conditionally independent given a third set if a criterion called d-separation holds in the graph.
  • In Bayesian networks, local and global independencies are equivalent.
  • This kind of graphical model is known as a directed graphical model.
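
The factorization p(X1, ..., Xn) = ∏ p(Xi | pa(Xi)) can be sketched for a small chain network A → B → C. The conditional probability tables below are made up for illustration; the point is that multiplying each node's conditional given its parents yields a valid joint distribution.

```python
# Bayesian network A -> B -> C with illustrative CPTs.
p_a = {0: 0.6, 1: 0.4}                       # p(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3},          # p(B | A); outer key is A
               1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1},          # p(C | B); outer key is B
               1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    # Product of each node's conditional given its parents:
    # p(A, B, C) = p(A) * p(B | A) * p(C | B).
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Summing over all assignments must give 1.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
print(round(total, 10))  # prints 1.0
```

Note that pa(C) = {B} here, so C's factor conditions on B alone; that is exactly the local Markov property that C is independent of A given B.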


Applications of graphical models include:

  • Causal inference
  • Information extraction
  • Speech recognition
  • Computer vision
  • Decoding of low-density parity-check codes
  • Modeling of gene regulatory networks
  • Gene finding and diagnosis of diseases
  • Protein structure modeling
Mansoor Ahmed is a chemical engineer, web developer, and writer currently living in Pakistan. My interests range from technology to web development; I am also interested in programming, writing, and reading.