Deep Learning can be termed the best example of the confluence of big data, big models, big compute, and big dreams. Deep Learning is an algorithm that has no theoretical limitation on what it can learn; the more data and the more compute (CPU power) time you give it, the better it gets – Geoffrey Hinton (Google).
Deep Learning – An Outlook
The true challenge for Artificial Intelligence is to solve tasks that are easy for humans to perform but hard to describe formally. In short, deep learning is simply a class/subset of machine learning, though it takes a slightly different perspective than machine learning.
Deep learning progressively extracts higher-level features from the raw input through layers; low-level layers do a different job than upper layers, whereas machine learning usually focuses on teaching computers to process and learn from data. Think of problems that we solve intuitively, that feel automatic: recognising spoken words, picking out lyrics in music, separating mixed songs, captioning an image, predicting the next frame in a video, or recognising faces in images.
Deep learning uses multiple layers through which the computer trains itself to process and learn from data. This is the kind of task we try to solve at AILabPage research.

Deep Learning is used for tasks like facial recognition, generating and processing images, processing speech and language, and handwriting recognition. Deep Learning also has an excellent success story where it is combined with reinforcement learning to match human-level performance in games like Go, Chess, Dota 2, and Quake III.
The Amazon Go store relies on image recognition powered by deep learning to detect what shoppers buy. But this also has a downside on the cost side, as it needs very large datasets and a large number of training cycles. For such training to succeed, machines need high-powered, and therefore expensive, computing and storage hardware.
Deep Learning Algorithms
Deep learning is a technique for implementing machine learning; it is also known as deep structured learning or hierarchical learning. At the same time, I claim it is wrong to simply call Deep Learning Machine Learning (in my opinion): the technique serves to achieve a goal but did not necessarily come out of that same goal.
Deep learning's main drivers are artificial neural network systems, also called neural networks or neural nets. Specialised versions are also available, such as convolutional neural networks and recurrent neural networks, which address particular problem domains. Two of the best, and fairly unique, use cases for Deep Learning are image processing and text/speech processing, both built on methodologies like deep neural nets.
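To make the idea of a specialised network concrete, here is a minimal, hedged sketch (not code from this post) of a tiny convolutional neural network for small grayscale images, written with the Keras API in TensorFlow; the layer sizes and the 10-class output are illustrative assumptions only.

```python
# A tiny convolutional network sketch: raw pixels in, class probabilities out.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),               # raw pixel input, no hand-crafted features
    layers.Conv2D(16, (3, 3), activation="relu"),  # low-level layer: edges, corners
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(32, (3, 3), activation="relu"),  # higher-level layer: parts, shapes
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),        # e.g. 10 digit classes (assumed)
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The stacked convolutional layers are what the text above describes: each layer builds slightly higher-level features on top of the previous one.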

In practice, Deep Learning methods, specifically Recurrent Neural Network (RNN) models, are used for complex predictive analytics such as share price forecasting, which consists of several stages. Machine learning more broadly also includes decision tree learning, inductive logic programming, clustering, reinforcement learning, and Bayesian networks, among others.
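As a hedged illustration of the RNN-for-forecasting idea, the sketch below is a self-contained toy: an LSTM that reads a window of the last 30 values of a synthetic series (standing in for closing prices, not real market data) and predicts the next value. The window size, layer width, and data are all assumptions for illustration.

```python
# Toy LSTM forecaster: predict the next value from the previous 30 values.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

window = 30
# Dummy series standing in for real closing prices; shapes only, not market data.
prices = np.random.rand(1000).astype("float32")
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])[..., None]
y = prices[window:]                       # next-step targets

model = models.Sequential([
    layers.Input(shape=(window, 1)),
    layers.LSTM(32),                      # learns temporal patterns across the window
    layers.Dense(1),                      # next-step estimate
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```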
Deep learning is the first class of algorithms that is truly scalable: performance just keeps getting better as we feed the algorithms more data. Speech/text and image processing can make a strong robot to start with, and actions based on triggers make it even better. Such a system would still have to pass four basic tests: the Turing test, acquiring a college degree, working as an employee for at least 20 years and doing well enough to earn promotions, and reaching ASI status.
Deep Learning is Not Machine Learning
Deep Learning and Machine Learning are not the same, so you can't use the terms ML and DL interchangeably. Deep Learning exists as its own field for various reasons: its limits, its potential, its results, and so on. It can also be called a dream enabler of machine learning. The major point where DL differs from ML is in its working style.
- ML works on past and present figures and then takes an educated guess about the future, whereas DL goes much beyond this guesswork.
- DL uses data patterns to make decisions and predictions about the real world, for example in healthcare involving genomics and preterm births.
In short, Deep Learning is Machine Learning, but in a serious discussion you still can't say so, because it goes much further; calling every layer of the Earth "Earth" would be wrong, since it has four different layers with different names for a reason.

Deep learning takes feature engineering to the next level by automating it: there are deep learning methods that learn directly from raw data and map it to the intended goals, for example uncovering hidden themes in large collections of documents using topic modelling. We will return to these questions in subsequent blog posts. Artificial Intelligence can be seen as applied machine learning, deep learning, or any other technique used to solve actual problems.
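To illustrate what "automating feature engineering" can look like in code, here is a minimal, hedged sketch of an autoencoder that compresses raw 784-dimensional pixel vectors into a 32-dimensional learned representation with no hand-designed features; the layer sizes are illustrative assumptions, not a recipe from this post.

```python
# Autoencoder sketch: the 32-unit bottleneck is a learned feature representation.
import tensorflow as tf
from tensorflow.keras import layers, models

autoencoder = models.Sequential([
    layers.Input(shape=(784,)),               # raw flattened image
    layers.Dense(128, activation="relu"),
    layers.Dense(32, activation="relu"),      # learned features ("automated feature engineering")
    layers.Dense(128, activation="relu"),
    layers.Dense(784, activation="sigmoid"),  # reconstruction of the raw input
])
autoencoder.compile(optimizer="adam", loss="mse")
# autoencoder.fit(raw_pixels, raw_pixels, ...) would train it to reconstruct its input;
# the bottleneck activations can then feed a downstream classifier.
```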
Why Deep Learning
If supremacy is the basis for popularity, then Deep Learning is almost there (at least for supervised learning tasks). Deep learning attains the highest accuracy when trained with huge amounts of data. Deep learning algorithms can take highly unorganised, messy, and unlabelled data, such as video, images, audio recordings, and text.
The analogy is that the deep learning models are the rocket engine, and the huge amount of data we can feed to these algorithms is the fuel. – Andrew Ng.

One of the biggest issues/limitations of Deep Learning is that it requires high-end machines, as opposed to traditional Machine Learning algorithms. GPUs have become a basic, taken-for-granted requirement for running any Deep Learning algorithm.
There are several advantages to using deep learning over traditional Machine Learning algorithms. Deep learning outperforms when the data size is large; with small datasets, traditional Machine Learning algorithms are preferable.
- Knowing the unknown – DL techniques outshine others where there is little domain understanding for feature introspection or limited feature engineering expertise.
- Nothing is too complex – for complex problems such as image, video, or voice recognition and natural language processing, DL works like a charm.
Shortcomings
On the other hand, when it comes to unsupervised learning, research using deep learning has yet to show results comparable to its success on supervised learning tasks. Responses to the valid argument "if not deep learning, then why not Hierarchical Temporal Memory (HTM)?" will be covered in upcoming posts.
Books + Other readings Referred
- Open Internet
- Hands-on personal research work @AILabPage
Points to Note:
All credits, if any, remain with the original contributors only. We have now elaborated on our earlier post "AI, ML, and DL – Demystified" to explain Deep Learning in particular. You can find earlier posts via the Machine Learning – The Helicopter view, Supervised Machine Learning, Unsupervised Machine Learning, and Reinforcement Learning links.
Conclusion – Deep Learning, in short, goes much beyond machine learning and its algorithms, whether supervised or unsupervised. DL uses many layers of nonlinear processing units for feature extraction and transformation; learning is based on multiple levels of features or representations, with the layers forming a hierarchy from low-level to high-level features. Where traditional machine learning focuses on feature engineering, deep learning focuses on end-to-end learning from raw features. Traditional machine learning typically creates train/test splits of the data, wherever possible via cross-validation, loads all the training data into main memory, and computes a model from it, whereas deep learning models are usually trained incrementally on mini-batches of data.
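As a hedged illustration of that last contrast, the toy loop below updates a tiny model from streamed mini-batches rather than loading the whole training set into memory at once; the data generator, model, and targets are purely synthetic stand-ins.

```python
# Mini-batch training sketch: the model only ever sees one small batch at a time.
import numpy as np

rng = np.random.default_rng(0)
w, b, lr = np.zeros(3), 0.0, 0.1          # tiny linear model trained by SGD

def batches(n_batches=100, batch_size=32):
    """Stand-in generator: yields mini-batches as if streamed from disk, not RAM."""
    for _ in range(n_batches):
        X = rng.normal(size=(batch_size, 3))
        y = X @ np.array([1.0, -2.0, 0.5]) + 3.0   # synthetic target
        yield X, y

for X, y in batches():
    pred = X @ w + b
    err = pred - y
    w -= lr * X.T @ err / len(y)          # gradient step on this batch only
    b -= lr * err.mean()
```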
============================ About the Author =======================
Read about the Author at: About Me
Thank you all for spending your time reading this post. Please share your opinions, comments, criticism, agreements, or disagreements. For more details about posts, subjects, and relevance, please read the disclaimer.
FacebookPage | ContactMe | Twitter ====================================================================