Demystifying AI – Buzzwords like artificial intelligence and the technologies bundled with it, such as machine learning, artificial neural networks and deep learning, are simple and complex terms at the same time. In one way or another, these buzzwords are agents of future analytics. Such emerging technologies are now part of our daily life, and we use them knowingly or unknowingly. They will not only work well in the analytics business but will also play a key role in our daily and business lives.
In this post, we will attempt to demystify some use cases of machine learning and deep learning at a high level. To name a few industries where these technologies are already making a difference: healthcare, finance, payments, e-commerce, manufacturing and engineering services.
Machine Learning(ML) – Basic Terminologies in Context
Let’s Understand a Bit
Let’s make it simple to understand how these all-encompassing terms are actually correlated and speak to each other. Artificial Intelligence, Machine Learning, Blockchain and the like are the golden words of today’s time. Almost every technology (and now even non-technology) company on this planet is claiming its share of extra revenue by putting these buzzwords on its product display.

It’s easy to get lost and not see the difference between hype and reality. Real benefits get lost in the race of these buzzwords swirling around everything. In AILabPage’s view, all these terms are representative of future analytics. Sometimes it is good to un-develop something existing to uncover the hidden gems underneath. Maybe it’s like un-developing to innovate?
In 1950, Alan Turing proposed the “Turing Test”, which speculates about the possibility of creating machines that think. To pass the test, a computer must be able to carry on a conversation that is indistinguishable from a conversation with a human being.
Confusing Jargon – AI, ML and Many More
Defining confusing jargon like AI, ML, Data Science, DL and ANN with no scientific reasoning or meaning can create huge understanding gaps. Each of these buzzwords has its own meaning and use cases. One needs to be careful about when to use which term and for what reasons.
The definition of AI also includes things like understanding language, planning, recognising objects and sounds, learning and problem-solving.
It is all about time, and the right learning algorithms make all the difference. The examples below clarify the basic concept of artificial intelligence a little more.
- In a simple mathematical calculation, such as the product of two given numbers, any computer can beat any human in terms of speed and accuracy.
- On the other hand, when identifying whether an image shows a dog or a cat, most humans (even a four-year-old kid) will easily outperform computers.
It is in scenarios like the second example that artificial intelligence gets to show its power: telling whether an animal picture shows a dog or a cat. AI does not need much more demystifying here, as its metaheuristic approach has almost overshadowed older techniques. Near-natural intelligence technologies and the computational power of cognitive systems are real now.
Algorithm – The Set Of Instructions!
An algorithm is a set of rules followed to solve a problem; in short, it is a set of rules or instructions. In machine learning, algorithms are the key elements that take in data and process all the rules to produce a response. This processing is what makes an algorithm complex or easy. One thing is clear: the more data, the stronger the algorithm gets over time.
Algorithms are supposed to work much faster, more accurately and more resiliently than any human in order to demonstrate their capabilities. No algorithm can be considered good or bad, but it can surely be data greedy or resource hungry. Algorithms need to be trained to learn how to classify and process information.
They simulate the mapping of inputs to outputs as it happens in a human brain, which is what makes tasks like image recognition, sarcasm detection and voice recognition so difficult for computers.
The efficiency and accuracy of an algorithm depend on how much data is fed to it during training. Using an algorithm to calculate something does not automatically mean machine learning or AI is being used.
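To make that last distinction concrete, here is a minimal Python sketch of my own (an illustration, not code from this post) contrasting a fixed, hand-written rule with a rule whose threshold is learned from labelled examples. Only the second one improves as more data is fed to it; the spam-filter scenario and all numbers are purely hypothetical.

```python
# A hand-written rule: the "instructions" are fixed by the programmer.
def rule_based_spam_filter(num_links: int) -> bool:
    # Hypothetical rule: flag any message containing more than 3 links.
    return num_links > 3

# A learned rule: the threshold is estimated from labelled examples,
# so feeding it more data can change (and usually improve) its behaviour.
def learn_threshold(link_counts, labels):
    best_threshold, best_accuracy = 0, 0.0
    for threshold in range(0, max(link_counts) + 1):
        predictions = [count > threshold for count in link_counts]
        accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
        if accuracy > best_accuracy:
            best_threshold, best_accuracy = threshold, accuracy
    return best_threshold

# Toy training data: (number of links, is_spam)
link_counts = [0, 1, 5, 7, 2, 9]
labels      = [False, False, True, True, False, True]
print(learn_threshold(link_counts, labels))   # prints 2 for this toy data
```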
Artificial Intelligence – Demystifying AI
Artificial intelligence is human intelligence exhibited by machines. As a branch of computer science, AI deals with simulating the human mind in machines, and interestingly, natural intelligence plays a key role in that effort. Today AI is increasingly used as a broad term. It describes machines that can mimic human functions such as learning and problem-solving. In earlier times it was believed that human intelligence could be precisely described and that machines could simulate it. Before a machine starts attempting such simulation, it needs to learn from lots of data.

Artificial intelligence is now considered a new factor of production. It has the potential to introduce new sources of growth, reinvent existing businesses and change the way we work. It also reinforces the role of people in driving business growth.
We need AI for real life, not just for PhD theses or scholarly books. In simple words, AI involves machines that behave and think like humans, i.e. algorithmic thinking in general. AI-powered computers have started simulating the human brain’s abilities of sensation, action, interaction, perception and cognition.
The real fact is that artificial intelligence is about the people, not the machines. Technology and non-technology companies are now investing in AI and bringing its real, material value to the world.
Machine Learning
Machine learning is simply an approach to achieving artificial intelligence; it is a subset of AI. It carried neural networks as an important ingredient for some time, and only recently has it become the focus of AI and deep learning. Sadly, it is becoming more accessible only to developers as their tool. What we need is simply MLaaS (Machine Learning as a Service) for everyone.
Artificial intelligence and machine learning are often used interchangeably, but they are not the same. Machine learning is one of the most active areas and one way to achieve AI. Why is ML doing so well today? There are a couple of reasons, listed below (though not limited to these):
- The explosion of big data
- Hunger for new business and revenue streams in these times of shrinking business
- Advancements in machine learning algorithms
- Development of extremely powerful machines with high capacity and faster computing ability
- Storage capacity
Today’s machines are learning and performing tasks that in the past could only be done by humans, such as making better judgements and decisions or playing games. This is possible because machines can now analyse and recognise patterns and remember what they learn for future use. Today the major problem is finding people skilled enough to apply and differentiate their university and PhD learning in real business rather than just arguing with others on social media.
Machine learning should be treated as a culture in an organisation, where business teams, managers and executives have some basic knowledge of the technology. To achieve this culture, there have to be continuous programmes and roadshows for them. There are many courses designed for students, employees with little or no experience, managers, professionals and executives to give them a better understanding of how to harness this magnificent technology in their business.
MLaaS is needed for the work of data scientists, architects and data engineers who have domain expertise. It is important for everyone to have a better understanding of the possibilities of machine learning. What is all the fuss about machine learning anyway? Can machines be creative? Can machines empathise?
AI has already started delivering value; the contemporary view of computing, exemplified by recent models and results from non-uniform complexity theory, lends support to this.
What can machines do and how creative can they be? I guess we will see in an upcoming post, “Machine Learning Evolution followed by Machine Learning Transformation”. Machine learning is (mostly) a mathematics-heavy AI technique for classification, regression and clustering.
What Machine Learning Offers !!
AILabPage defines Machine Learning as “a focal point where business, data and experience meet emerging technology and decide to work together“. Machine learning is responsible for assessing the impact of data. In machine learning, algorithms are used to gain knowledge from data sets; the field focuses entirely on algorithms.

Machine learning is where the traditional statistical modelling of data meets the algorithmic and computational side of data science. It focuses primarily on developing computer programs that can change when exposed to new sets of data.
Machine learning and data mining follow broadly the same process: algorithms are built that receive input and, after statistical analysis, predict an output value. There are three general classes of machine learning (a minimal sketch of the first two follows the list below) —
- Supervised Machine Learning – Map inputs to output
- Unsupervised Machine Learning – Magnify hidden patterns and trends
- Reinforcement Learning – Reward for learning
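As a rough illustration of the first two classes, here is my own minimal sketch using scikit-learn (not code from the post; the toy data points are invented): the same inputs are handed to a supervised classifier, which maps inputs to known outputs, and to an unsupervised clustering algorithm, which must surface hidden structure on its own. Reinforcement learning is omitted here for brevity.

```python
# pip install scikit-learn  (assumed available)
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Toy data: two features per sample, clearly split into two groups.
X = [[1.0, 1.2], [0.9, 1.1], [4.0, 4.2], [4.1, 3.9]]
y = [0, 0, 1, 1]          # labels are available only to the supervised learner

# Supervised: learn a mapping from inputs X to known outputs y.
clf = LogisticRegression().fit(X, y)
print(clf.predict([[4.0, 4.0]]))   # expected [1] for this well-separated data

# Unsupervised: no labels; the algorithm groups similar points by itself.
km = KMeans(n_clusters=2, n_init=10).fit(X)
print(km.labels_)                  # two clusters; the cluster ids are arbitrary
```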
Every machine learning algorithm has three components (the toy example after this list makes them explicit):
- Representation
- Evaluation
- Optimization.
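Here is a hedged sketch of how these three components line up in the simplest possible case, a hand-rolled one-variable linear regression written purely for illustration: the representation is a straight line, the evaluation is mean squared error, and the optimisation is plain gradient descent. The data points and learning rate are my own invented values.

```python
# Toy 1-D linear regression to make the three components explicit.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]   # (x, y) pairs, roughly y = 2x

w, b = 0.0, 0.0                # Representation: the model is y_hat = w*x + b

def mse(w, b):                 # Evaluation: mean squared error over the data
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

lr = 0.05                      # Optimisation: gradient descent on the error
for _ in range(2000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w, b = w - lr * grad_w, b - lr * grad_b

print(round(w, 2), round(b, 2), round(mse(w, b), 4))   # w ends up close to 2
```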
Artificial Neural Networks – Demystifying AI
Artificial neural networks mimic human brain cells; they are inspired by the biological neuronal structure. In the brain, the effect of an incoming signal is to raise or lower the electrical potential inside the body of the receiving cell, and if this graded potential reaches a threshold, the neuron fires. It is this characteristic that the artificial neuron model attempts to reproduce.
In a biological neuron, the transmission of a signal from one neuron to another through synapses is a complex chemical process in which specific transmitter substances are released from the sending side of the junction. The biological neuron model is widely used in artificial neural networks with some minor modifications.
An artificial neural network we train to predict image or stock data has an arbitrary number of hidden layers and an arbitrary number of hidden nodes in each layer, both of which the user decides at run-time.
Neural networks find great application in data mining across sectors such as economics and forensics, and in pattern recognition. “Artificial neural systems, or neural networks, are physical cellular systems which can acquire, store, and utilize experiential knowledge” – Zurada (1992).
Neural Network Architecture
There are also some specialised versions of neural networks, such as convolutional neural networks and recurrent neural networks. These address special problem domains, like image processing and text/speech processing, and build on deep neural nets.
For all practical purposes, a neural network can have from one hidden layer up to about three. The example below describes three hidden layers; a code sketch follows the list.

- Input layer – The activity of the input units represents the raw information fed into the network.
- Hidden layer – The activity of each hidden unit is determined by the activities of the input units and the weights on the connections between the input and hidden units. There can be several hidden layers.
- Output layer – The behaviour of the output units depends on the activity of the hidden units and the weights between the hidden and output units.
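A minimal sketch of such an architecture follows, assuming TensorFlow/Keras is installed; the layer sizes (64, 32, 16) and the input/output dimensions are arbitrary choices of mine for illustration, not values from the post, and the list of hidden sizes could just as easily be supplied at run-time, as described above.

```python
# pip install tensorflow  (assumed available)
from tensorflow import keras

def build_mlp(n_inputs, hidden_sizes, n_outputs):
    """Feedforward network: input layer -> hidden layers -> output layer."""
    model = keras.Sequential()
    model.add(keras.Input(shape=(n_inputs,)))                  # input layer
    for units in hidden_sizes:                                 # hidden layers
        model.add(keras.layers.Dense(units, activation="relu"))
    model.add(keras.layers.Dense(n_outputs))                   # output layer
    model.compile(optimizer="adam", loss="mse")
    return model

# Three hidden layers, sizes chosen here purely for illustration.
model = build_mlp(n_inputs=10, hidden_sizes=[64, 32, 16], n_outputs=1)
model.summary()
```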
In practice, deep learning methods, specifically recurrent neural network (RNN) models, are used for complex predictive analytics such as share price forecasting, and such a pipeline consists of several stages.
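As a hypothetical, heavily simplified example of that idea (again assuming Keras; the window size, layer sizes and the random stand-in price series are all my own choices, and real forecasting needs far more careful data preparation and validation), a recurrent model for a price series might be sketched like this:

```python
import numpy as np
from tensorflow import keras

# Hypothetical setup: predict the next price from the previous 30 prices.
window = 30
prices = np.cumsum(np.random.randn(500)) + 100     # stand-in for real quotes

X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:].reshape(-1, 1)
X = X[..., np.newaxis]                             # (samples, timesteps, features)

model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    keras.layers.LSTM(32),                         # recurrent layer
    keras.layers.Dense(1),                         # next-step forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```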
Deep Learning – Mandate for Humans, Not Just Machines
Deep learning is a technique for implementing extremely powerful and much better machine learning. Its capabilities and limits make it so powerful that it can stand as its own separate domain. Sometimes deep learning appears to be a supernatural power of machines.
The objective of these techniques is to achieve a goal, an artificial intelligence power that teaches computers to perform tasks and to understand, much as it comes naturally to humans, i.e. learning by example.
“Deep Learning is an algorithm which has no theoretical limitations of what it can learn; the more data you give and the more computational time you provide, the better it is” – Geoffrey Hinton
Key Driver of Deep Learning
Deep learning’s main driver is the artificial neural network. Deep learning is based on multiple levels of features or representations, with the layers forming a hierarchy from low-level to high-level features.

Deep learning is the first class of algorithms that is genuinely scalable: performance just keeps getting better as we feed the algorithms more data. Speech/text and image processing can make a good starting point for a robot, and actions based on triggers make it even better. To attain ASI status, such a system would have to pass several proposed tests, for example the Turing test, earning a college degree, or working as an employee for at least 20 years and performing well enough to get promotions. Traditional machine learning focuses on feature engineering, whereas deep learning focuses on end-to-end learning from raw features.
The analogy is that deep learning models are the rocket engine and the huge amounts of data we can feed to these algorithms are the fuel. — Andrew Ng
Decision tree learning, inductive logic programming, clustering, reinforcement learning, and Bayesian networks, among others, are key components here. A key advantage of deep learning networks is that they often continue to improve as the size of your data increases.
Deep Learning Computational Models
The human brain is a deep and complex recurrent neural network. Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. In very simple words, and not to confuse anyone, we can define both models as below.
- Feedforward propagation – A type of neural network computation in which values are “fed forward” only, i.e. from the input layer through the hidden layers to the output layer.
- Backpropagation (a supervised learning algorithm) is a training algorithm with two steps:
- Feedforward the values
- Calculate the error and propagate it back to the layer before.
In short, forward propagation is part of the backpropagation algorithm but comes before the backward pass.
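To make the two steps concrete, here is a minimal numpy sketch of my own (not the post’s code) of one training loop for a tiny one-hidden-layer network: values are fed forward, the error is computed at the output, and its gradient is propagated back to update the weights. The data are random toy values and the network sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))            # 8 samples, 3 input features
y = rng.standard_normal((8, 1))            # toy regression targets

W1, b1 = rng.standard_normal((3, 4)) * 0.1, np.zeros((1, 4))   # input -> hidden
W2, b2 = rng.standard_normal((4, 1)) * 0.1, np.zeros((1, 1))   # hidden -> output
lr = 0.1

for _ in range(200):
    # Step 1 - feedforward: input -> hidden -> output
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2

    # Step 2 - backpropagation: compute the error and push its gradient back
    err = y_hat - y                            # derivative of the squared error
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)           # chain rule through tanh
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0, keepdims=True)

    # Gradient-descent update of all weights
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print(float(((y_hat - y) ** 2).mean()))        # the loss should have gone down
```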
Data Science
Data science succinctly communicates the value and real-life benefits of analysis to stakeholders. Data science is the smartest, most intelligent BI (IBI, if you like), doing the laborious task of ‘solving intelligence’ with a range of tools and speciality areas. Fundamentally, solving intelligence means the artificial seeking and production of knowledge that answers questions, whether they were asked or not, known or unknown.
Successful data science requires more than just programming and coding. In data mining, by contrast, algorithms are merely combined as part of a process, and that process is the entire focus.
Artificial Learning vs. Machine Learning vs. Deep Learning
Artificial intelligence, or so-called artificial learning, and machine learning are often used interchangeably, especially in this era of big data. By now you can surely make out that these terms are not the same, and it is important to understand how each can be applied.
As mentioned above, artificial intelligence is a much broader concept than machine learning; it addresses the use of computers to mimic the cognitive functions of humans. In machine learning, tasks are carried out by algorithms in an “intelligent” manner.
Machine learning is a subset of AI and focuses on the ability of machines to receive a set of data and learn for themselves, changing algorithms as they learn more about the information they are processing.
Mixing It All Up – Demystifying AI
Why deep learning? If supremacy is the basis for popularity, then deep learning is surely almost there (at least for supervised learning tasks). Deep learning attains the highest accuracy when trained with huge amounts of data.
Data science does not necessarily involve big data, but the fact that data keeps scaling up makes big data an important aspect of data science. Thinking with machines, or intelligence augmentation, on the other hand, is serious evolutionary epistemology and semiotics. It has enormous potential to take business intelligence towards competitive intelligence that can infer competitive measures using augmented, site-centric data.
One of the greatest strengths of deep learning is that it does not require predefined features; it finds peculiar patterns that humans will always struggle with or probably would never be able to define beforehand. Robots or AI taking our jobs is still far off, not within the next 50 years or so.
We have excellent young professionals working today with the dream of changing the world. Their dreams go on to bring good value to society, to business and to the people they work with.
In my time (around 19 years ago) we had no such option to learn these subjects or choose such popular professions; anything unusual was mainly R&D. Anyone wishing to be a data scientist, or simply an engineer in computer software, was forced to rely on regression techniques.
Robots can fail, and the reason is very simple and well known: machine learning models are not sufficiently accurate, and cannot be, without lots of data and lots of training. Artificial intelligence combined with cloud computing adds advancements through new use cases that improve the systems developed so far.
Some facts!
Investment in artificial intelligence is growing fast. Tech giants like Google, Microsoft, Apple and Baidu, known for their global dominance in digital technologies, are spending a couple of billion US dollars on AI. Around 90% of this goes into R&D and deployment, and 10% into AI acquisitions.
Are we ready to relinquish control to autonomous cars, software bots, or trust AI-based recommendation engines?
Points to Note:
All credits, if any, remain with the original contributors only. We have covered Artificial Intelligence, Machine Learning, Neural Networks and Deep Learning to give a high-level understanding of their differences. The next post will talk about reinforcement learning.
Feedback & Further Questions
Do you have any questions about Deep Learning or Machine Learning? Leave a comment or ask your question via email. I will try my best to answer it.
Books + Other readings Referred
- Open Internet – Research Papers and ebooks
- Personal hands-on work on data and the experience of @AILabPage members
- Book “Artificial Intelligence: A Modern Approach”

Conclusion – The sphere of artificial intelligence has evolved beyond its initial academic beginnings and is now a vibrant and wide-ranging area of exploration. Lately, companies have also incorporated artificial intelligence into their products and services. At present, artificial intelligence operates under human supervision, and I strongly hope this collaboration will continue for the foreseeable future.
The use of AI systems to take over driving from humans should be restricted beyond a certain level. Baidu’s speech-to-text technology has been shown to outperform humans on similar tasks, and Amazon has made impressive use of deep learning to enhance its product suggestions.
======================= About the Author ==========================
Read about Author at : About Me
Thank you all for spending your time reading this post. Please share your feedback, comments, critique, agreements or disagreements. For more details about posts, subjects and relevance, please read the disclaimer.
===================================================================