Boltzmann Machines – a kind of stochastic recurrent neural network, normally interpreted through the lens of probabilistic graphical models. In short, it is a neural network with the reputation of being a relaxation net: fully connected, and consisting of visible and hidden units. It operates in asynchronous mode, with stochastic updates for each of its units, and avoids local minima by combining its stochastic nodes with simulated annealing.
These machines can also be described as probability distributions over high-dimensional binary vectors. With enough hidden nodes, a Boltzmann Machine can represent an arbitrary probability distribution over binary vectors.
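To make these dynamics concrete, here is a minimal NumPy sketch (illustrative only; the function name `anneal_boltzmann` and the geometric cooling schedule are assumptions, not from the original post): one randomly chosen unit at a time switches on with probability sigmoid(net input / T), while the temperature T is gradually lowered, i.e. simulated annealing.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def anneal_boltzmann(W, b, steps=500, T_start=10.0, T_end=0.5):
    """Asynchronous stochastic updates of a fully connected Boltzmann
    machine while the temperature T is slowly lowered (simulated annealing)."""
    n = len(b)
    s = rng.integers(0, 2, size=n).astype(float)       # random binary start state
    for t in range(steps):
        # geometric cooling schedule from T_start down to T_end
        T = T_start * (T_end / T_start) ** (t / (steps - 1))
        i = rng.integers(n)                             # pick one unit at random (asynchronous mode)
        net = W[i] @ s + b[i]                           # total input to unit i
        s[i] = float(rng.random() < sigmoid(net / T))   # stochastic binary update
    return s

# toy 6-unit network: symmetric weights, zero diagonal (no self-connections)
n = 6
W = rng.normal(size=(n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(size=n)
print(anneal_boltzmann(W, b))   # settles into a low-energy binary state
```

At high T the updates are nearly random, which lets the network escape local minima; as T falls, the state settles toward a low-energy configuration.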
What is Deep Learning?
“Deep learning is an undeniably mind-blowing machine learning technique that teaches computers to do what comes naturally to humans: learn by example. It can be used with ease to predict the unpredictable.” Researchers and engineers are busy creating artificial intelligence by combining non-biological neural networks with natural intelligence.
Deep Learning, in short, goes much beyond machine learning and its supervised or unsupervised algorithms. DL uses many layers of nonlinear processing units for feature extraction and transformation. It has revolutionized today’s industries by demonstrating near human-level accuracy on tasks such as pattern recognition, image classification, voice/text decoding, and many more.
Deep Learning is a key technology
- For voice control in mobile devices such as phones, TVs, and voice-command-enabled speakers.
- Behind driverless cars, enabling them to recognize a stop sign or to distinguish a pedestrian from a lamppost.
- In image processing, image classification, and speech recognition, all of which it has revolutionised with high accuracy.
Deep learning is getting lots of attention lately, and for good reason: it is achieving results that were not possible before. Business leaders and the developer community need to understand what it is, what it can do, and how it works.
Boltzmann Machine – A Probabilistic Graphical Model
Geoffrey Hinton, the “Godfather of Deep Learning”, coined the term Boltzmann Machine in 1985. A well-known figure in the deep learning community, Hinton is also a professor at the University of Toronto.
Boltzmann Machines, as introduced above, are stochastic recurrent neural networks, normally interpreted as probabilistic graphical models. In a short and concise manner: a neural network which is fully connected and consists of visible and hidden units, operating in asynchronous mode with stochastic updates for each of its units.
Viewed as probability distributions over high-dimensional binary vectors, the Boltzmann Machine is a generative, unsupervised model that learns the probability distribution of an original dataset. It is a computation-hungry tool; however, by restricting its network topology, its behaviour can be controlled.
It is indeed an algorithm useful for dimensionality reduction, classification, regression, collaborative filtering, feature learning, and topic modelling. Like any other neural network, these machines (both BM and RBM) have an input layer, referred to as the visible layer, and one or several hidden layers, referred to as the hidden layer.
Deep Learning Computational Models
The human brain is a deep and complex recurrent neural network. Deep learning allows computational models composed of multiple processing layers to learn representations of data with multiple levels of abstraction. In very simple words, we can define both models as below.
- Feedforward propagation – a type of neural network architecture where the connections are “fed forward” only, i.e. the values flow from input to hidden to output.
- Backpropagation (a supervised learning algorithm) is a training algorithm with two steps:
- Feed the values forward.
- Calculate the error and propagate it back to the earlier layers.
In short, forward propagation is part of the backpropagation algorithm but comes before back-propagating the error; a minimal worked example is sketched below.
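As an illustration only (a minimal sketch, not code from the original post; the XOR data, layer sizes, and learning rate are assumptions), the NumPy snippet below trains a tiny two-layer network, making both steps explicit: feed the values forward, then calculate the error and propagate it back.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# toy data: XOR, the classic non-linearly-separable problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input  -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

for epoch in range(5000):
    # step 1: feed the values forward
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # step 2: calculate the error and propagate it back
    d_out = (out - y) * out * (1 - out)         # output-layer delta (squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)          # hidden-layer delta
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))   # should approach [0, 1, 1, 0]
```

Note how the forward pass must run first: the deltas in step 2 reuse the activations h and out computed in step 1.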
Hidden nodes and a learning algorithm make the Boltzmann Machine an improvement over the Hopfield network. Its learning algorithm is slow, but it can be extended to learn higher-order interactions. BMs force a different way of thinking about learning, by creating a probabilistic environment whose statistics are matched to the goals.
Restricted Boltzmann Machines
Boltzmann machines are probability distributions on high-dimensional binary vectors, analogous to Gaussian Markov random fields in that they are fully determined by first- and second-order moments.
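For reference, the standard textbook formulation behind that statement (not specific to this post): each binary state vector $\mathbf{s}$ is assigned an energy, and the machine defines a Boltzmann distribution over states, with the biases $b_i$ carrying the first-order moments and the symmetric weights $w_{ij}$ the second-order ones.

$$E(\mathbf{s}) = -\sum_{i<j} w_{ij}\, s_i s_j \;-\; \sum_i b_i s_i, \qquad p(\mathbf{s}) = \frac{e^{-E(\mathbf{s})}}{\sum_{\mathbf{s}'} e^{-E(\mathbf{s}')}}$$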
It is used for pattern storage and retrieval. As per Wikipedia, “A Boltzmann machine (also called a stochastic Hopfield network with hidden units) is a type of stochastic recurrent neural network and Markov random field.” The RBM itself has many applications, some of which are listed below:
- Collaborative filtering
- Multiclass classification
- Information retrieval
- Motion capture modelling
- Segmentation
- Modelling natural images
Deep belief nets use the Boltzmann machine, especially the Restricted Boltzmann Machine, as a key component, trained with simple first-order weight updates; one such update rule is sketched below.
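A common first-order update rule for RBMs is contrastive divergence (CD-1). The NumPy sketch below is illustrative only (the function name `cd1_update`, the learning rate, and the toy sizes are assumptions, not from the original post): weights move toward the statistics of the data and away from the statistics of the reconstruction.

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.
    v0: batch of visible vectors, shape (batch, n_visible)."""
    # up pass: the bipartite graph makes hidden units conditionally
    # independent given the visibles, so p(h|v) factorizes
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # down pass: reconstruct the visibles, then one more up pass
    pv1 = sigmoid(h0 @ W.T + b)
    ph1 = sigmoid(pv1 @ W + c)
    # first-order weight update: data statistics minus reconstruction statistics
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

# toy RBM: 6 visible units, 3 hidden units
W = 0.01 * rng.normal(size=(6, 3))
b, c = np.zeros(6), np.zeros(3)
v = rng.integers(0, 2, size=(8, 6)).astype(float)
for _ in range(100):
    cd1_update(v, W, b, c)
```

Because the bipartite structure makes p(h|v) and p(v|h) factorial, each up and down pass is a single matrix product, which is exactly how restricting the topology tames the computational appetite mentioned earlier.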
Books + Other Readings Referred
- Open Internet
- Hands-on personal research work @AILabPage
Points to Note:
All credits, if any, remain with the original contributor only. We have now elaborated on our earlier posts in the “AI, ML, and DL – Demystified” series, for understanding Deep Learning only. You can find the earlier posts at the Machine Learning – The Helicopter View, Supervised Machine Learning, Unsupervised Machine Learning, and Reinforcement Learning links.
Feedback & Further Questions
Do you have any questions about quantum technologies, Artificial Intelligence, or its subdomains like Deep Learning and Machine Learning? Leave a comment or ask your question via email; I will try my best to answer it.
Conclusion – Deep Learning, in short, goes much beyond machine learning and its supervised or unsupervised algorithms; DL uses many layers of nonlinear processing units for feature extraction and transformation. In the RBM, hidden units are connected only indirectly, through the visible units: if the visible unit values are fixed, the hidden units become conditionally independent. Learning is based on multiple levels of features or representations in each layer, with the layers forming a hierarchy of low-level to high-level features. Where traditional machine learning focuses on feature engineering, deep learning focuses on end-to-end learning from raw features. Traditional machine learning also creates train/test splits of the data, wherever possible via cross-validation, loads all the training data into main memory, and computes a model from it.
About the Author
Read about the author at: About Me
Thank you all for spending your time reading this post. Please share your opinions, comments, criticisms, agreements, or disagreements. For more details about posts, subjects, and relevance, please read the disclaimer.