Introduction The field of statistics provides many tools that can be used to achieve the machine learning goal of solving a task, not only on the training set but also when generalizing to new data. Foundational concepts such as parameter estimation, bias, and variance are valuable for rigorously […]
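The bias mentioned in the teaser above can be made concrete with a small simulation. This is an illustrative sketch (the sample size, true variance, and trial count are assumptions, not from the article): the maximum-likelihood variance estimator, which divides by n, systematically underestimates the true variance, while dividing by n − 1 removes the bias.

```python
import random

random.seed(0)

# True distribution: N(0, 2^2), so the true variance is 4.0.
true_var = 4.0
n = 5
trials = 20000

biased, unbiased = 0.0, 0.0
for _ in range(trials):
    xs = [random.gauss(0.0, 2.0) for _ in range(n)]
    m = sum(xs) / n
    ss = sum((x - m) ** 2 for x in xs)
    biased += ss / n          # ML estimator: divide by n
    unbiased += ss / (n - 1)  # corrected estimator: divide by n - 1

biased /= trials
unbiased /= trials
# On average, `biased` lands near (n - 1)/n * true_var = 3.2,
# while `unbiased` lands near the true value 4.0.
```

Averaging over many trials makes the systematic underestimate of the divide-by-n estimator visible, which is exactly what "bias" means for an estimator.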
Gradient Descent Method in Machine Learning
Introduction Many deep learning models learn their objectives using the gradient-descent method. Gradient-descent optimization requires a large number of training samples for a model to converge, which makes it ill-suited to few-shot learning. In generic deep learning models, we train a model to achieve a specific objective. Humans, however, learn to […]
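As a minimal sketch of the method the teaser names: gradient descent repeatedly steps a parameter against the gradient of a loss. The one-dimensional quadratic loss, learning rate, and step count below are illustrative assumptions, not values from the article.

```python
# Minimal gradient-descent sketch: minimize f(w) = (w - 3)^2,
# whose gradient is f'(w) = 2 * (w - 3).
def gradient_descent(grad, w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # step against the gradient
    return w

w_star = gradient_descent(grad=lambda w: 2 * (w - 3), w0=0.0)
# w_star converges toward the minimizer w = 3
```

Each iteration uses the full gradient of the loss; in practice deep learning models use stochastic variants over mini-batches of the many training samples the teaser mentions.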
Overflow and Underflow in Deep Learning
Introduction Deep learning algorithms generally require a high volume of numerical computation. This typically means algorithms that solve mathematical problems by iteratively updating estimates of the solution, rather than analytically deriving a formula that gives a symbolic expression for the correct answer. The general operations […]
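A classic example of the overflow/underflow problem the title refers to is the softmax function: exponentiating large logits overflows to infinity. Subtracting the maximum logit first is the standard stabilization; this sketch assumes NumPy, which the article does not mention.

```python
import numpy as np

def softmax(x):
    # Subtracting the max makes the largest exponent exactly 0,
    # so np.exp never overflows; the result is mathematically unchanged
    # because the shift cancels in the normalization.
    z = x - np.max(x)
    e = np.exp(z)
    return e / e.sum()

# A naive exp(1000.0) would overflow to inf; the shifted version is fine.
probs = softmax(np.array([1000.0, 1000.0]))
```

The same max-subtraction trick underlies the log-sum-exp operation used throughout numerically stable deep learning code.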
Introduction to Vector Data
Introduction There are three different ways to think about vectors. A vector is: an array of numbers (the computer science view); an arrow with a direction and magnitude (the physics view); an object that supports addition and scaling (the mathematics view). In this article, we will look at vector data in detail. Description Vector data […]
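The three views in the teaser can be shown side by side in a few lines. This sketch assumes NumPy, which is not named in the article.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])  # computer science view: an array of numbers
w = np.array([4.0, 5.0, 6.0])

s = v + w                      # mathematical view: vectors add component-wise
scaled = 2.0 * v               # ...and scale by a number

magnitude = np.linalg.norm(v)  # physics view: the length of the arrow
```

The same object satisfies all three descriptions at once, which is why the views are interchangeable rather than competing.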
The Graphical Model in Machine Learning
Introduction A graphical model is a subdivision of machine learning that uses a graph to represent a domain problem. The graph expresses the conditional dependence structure between random variables. Graphical models underlie many machine learning algorithms, for example: the Naive Bayes algorithm, the Hidden Markov Model, the Restricted Boltzmann Machine, and neural networks. In this article, […]
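The key idea in the teaser, that a graph encodes conditional dependence and lets the joint distribution factorize, can be sketched with a two-node toy network. The variable names and probabilities below are illustrative assumptions, not from the article.

```python
# Toy Bayesian network: Rain -> WetGrass.
# The edge says WetGrass depends on Rain; the joint then factorizes
# as P(Rain, WetGrass) = P(Rain) * P(WetGrass | Rain).
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {
    True:  {True: 0.9, False: 0.1},   # P(WetGrass | Rain=True)
    False: {True: 0.2, False: 0.8},   # P(WetGrass | Rain=False)
}

def joint(rain, wet):
    # Factorized joint probability, read directly off the graph.
    return p_rain[rain] * p_wet_given_rain[rain][wet]

# Summing over all outcomes should give 1, confirming a valid distribution.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
```

Larger models such as Hidden Markov Models follow the same recipe: the graph structure dictates which conditional tables are needed and how they multiply together.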
Gaussian Distribution in Machine Learning
Introduction The Gaussian distribution is the most well-studied probability distribution for continuous-valued random variables. It is also known as the normal distribution. Its importance stems from the fact that it has many computationally convenient properties. The Gaussian distribution is the backbone of machine learning, and every data scientist needs to know it when working with […]
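For reference, the density of the distribution the teaser describes can be written out directly from its standard formula; only the Python standard library is assumed here.

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # Density of N(mu, sigma^2) at x:
    # (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)^2 / (2 * sigma^2))
    coeff = 1.0 / (sigma * math.sqrt(2 * math.pi))
    return coeff * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# The standard normal peaks at its mean, with density 1/sqrt(2*pi) ≈ 0.3989.
peak = gaussian_pdf(0.0)
```

The bell shape follows from the squared exponent: density falls off symmetrically and rapidly as x moves away from the mean.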
Text Classification with the Naive Bayes Classifier
Introduction In this post, we discuss how to classify text using a Naive Bayes classifier. Naive Bayes classifiers are a family of classification algorithms based on Bayes' Theorem. It is not a single algorithm but a family of algorithms that share a common principle: each […]
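A minimal from-scratch sketch of the technique the teaser names: a multinomial Naive Bayes text classifier with add-one smoothing. The toy training documents and labels are illustrative assumptions, not data from the post.

```python
import math
from collections import Counter, defaultdict

# Toy labeled corpus (illustrative).
train = [
    ("free prize money now", "spam"),
    ("meeting agenda attached", "ham"),
    ("win free money", "spam"),
    ("project meeting tomorrow", "ham"),
]

class_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for text, _ in train for w in text.split()}

def predict(text):
    # Score each class by log P(class) + sum over words of
    # log P(word | class), using add-one (Laplace) smoothing so
    # unseen words never zero out a class.
    scores = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        score = math.log(class_counts[label] / len(train))
        for w in text.split():
            score += math.log(
                (word_counts[label][w] + 1) / (total + len(vocab))
            )
        scores[label] = score
    return max(scores, key=scores.get)
```

Working in log space avoids underflow from multiplying many small probabilities; the "naive" part is treating each word as conditionally independent given the class.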