
Demystifying AI – Unraveling the mysteries of AI: within the vast realm of technology, terms like “Artificial Intelligence” and its companions, such as machine learning, artificial neural networks, and deep learning, carry a unique blend of simplicity and complexity.


These phrases serve as both sparks and enablers for the impending wave of analytics. These cutting-edge technologies have woven themselves seamlessly into the fabric of our daily existence, becoming an integral part of our conscious and subconscious interactions with the digital landscape; think of AI-powered services such as Google Translate, Apple’s Siri, and the auto-complete suggestions in search and email. In the intricate dance of code and algorithms, AI becomes not just a tool but a silent companion shaping our digital journey, whispering possibilities and sparking curiosity in the evolving symphony of human-machine collaboration. In this blog post, you and I will take a deep dive into these terminologies.

Artificial Intelligence – Introduction

Artificial Intelligence: When people talk about Artificial Intelligence (AI), Machine Learning (ML), Deep Learning (DL), and Artificial Neural Networks (ANN), they use the words in different ways, often interchangeably. However, each term has its own definition, meaning, purpose, and connection to the others. Let’s work through the idea of AI as described below. These popular terms matter not just in the field of data analysis; they also carry significant weight in our personal and professional lives.

In this post, we will demystify some high-level use cases of machine learning and deep learning. To name a few industries where these technologies are already making a difference: healthcare, finance, payments, e-commerce, manufacturing, engineering services, etc.


Machine Learning (ML) – Basic Terminologies in Context


The collaboration among three key sciences, i.e. AI, physics, and photography, helps to improve how pictures are processed, how computers see things, how lenses work, and photography technology as a whole. Machines that are very clever and have been taught well can do things that people usually do with their minds, like understanding things, thinking logically, learning new things, and deciding what to do. AI helps machines become really smart through tools and strategies, and it gives us everything we need to build smart systems, both in theory and in practice.

Let’s Understand a Bit

Let’s make it simple to understand how these all-encompassing terms are actually correlated and speak to each other. Artificial intelligence, machine learning, blockchain, etc. are the golden words of today’s time. Almost every technology (now even non-technology) company on this planet is claiming a share of extra revenue by putting these buzzwords on its product displays.

It is easy to get lost and stop seeing the difference between hype and reality; reality and real benefits are getting lost in the race of buzzwords swirling around everything. In AILabPage’s view, all these terms are representative of future analytics. Sometimes it is good to un-develop something existing to uncover the hidden gems underneath; maybe it’s a case of “un-develop to innovate”?

In 1950, Alan Turing proposed the “Turing Test,” which speculates on the possibility of creating machines that think. To pass the test, a computer must be able to carry on a conversation that is indistinguishable from a conversation with a human being.

Confusing Jargon – AI, ML, and Many More

Defining confusing jargon like AI, ML, data science, DL, and ANN with no scientific reasoning or meaning can create huge understanding gaps. Each of these buzzwords has its own meaning and use cases. One needs to be careful about when to use what and for what reasons.

The AI definition also includes things like understanding language, planning, recognizing objects and sounds, and problem-solving.

It is all about time, and the right learning algorithms make all the difference. The examples below clarify the basic concept of artificial intelligence a little more.

  • A simple mathematical calculation, such as the product of two given numbers, shows that any computer can beat any human in terms of speed and accuracy.
  • On the other hand, to identify whether an image has a dog or a cat, most humans (even a four-year-old kid) will easily outperform computers.

It is in scenarios like the second example above, deciding whether an animal picture shows a dog, a cat, etc., that artificial intelligence gets to play a critical role and show its power. There is little left to demystify here, as AI’s metaheuristic approach has almost conquered the task: near-natural intelligence technologies and the computational power of cognitive systems are real now.
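A minimal sketch of that contrast, with numbers invented purely for illustration: the first task is a single line of deterministic code, while the second has no hand-written rule at all and is only tackled later in this post with learned models.

```python
# Task 1: a deterministic calculation - any computer beats any human on speed and accuracy.
def product(a: float, b: float) -> float:
    return a * b

print(product(48_297, 1_337))  # instant and exact

# Task 2: "does this picture show a dog or a cat?" has no closed-form rule.
# A plain program cannot answer it; a model trained on many labelled images can,
# which is exactly where machine learning and deep learning (sketched below) come in.
```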

Algorithm – The Set Of Instructions!

An algorithm is, in short, a set of rules or instructions that is followed to solve a problem. In machine learning, algorithms are the key elements that take in data and process it according to those rules to produce responses; this processing is what makes an algorithm simple or complex. One thing is clear here: the more data, the stronger the learning algorithm gets over time.

Algorithms are expected to work much faster, more accurately, and more resiliently than humans to demonstrate their capabilities. No algorithm can be considered good or bad in itself, but it can certainly be data- or resource-hungry. Learning algorithms need to be trained before they can classify and process information.

They simulate the mapping of inputs to outputs as it happens in a human brain, which is what makes traditionally difficult tasks for computers, like image recognition, sarcasm detection, and voice recognition, possible.

The efficiency and accuracy of a learning algorithm depend on how much data it is fed during training. Note that using an algorithm to calculate something does not automatically mean machine learning or AI is being used.
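A minimal sketch of that distinction, assuming scikit-learn is available; the tiny spam example and its messages are invented for illustration. The first function is a plain, fixed-rule algorithm; the second learns its rule from data, so feeding it more labelled examples would generally make it stronger.

```python
# Fixed-rule algorithm: the rule is written by hand and never changes.
def is_spam_rule_based(message: str) -> bool:
    return "prize" in message.lower()

# Learned rule: the decision boundary is estimated from labelled data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

messages = ["Win a prize now", "Meeting at 10am", "Claim your free prize", "Lunch tomorrow?"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)      # represent text as word counts
model = MultinomialNB().fit(X, labels)      # train on the (tiny) dataset

print(is_spam_rule_based("Free prize inside"))                      # hand-coded rule
print(model.predict(vectorizer.transform(["Free prize inside"])))   # learned rule
```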

Artificial Intelligence

Artificial intelligence is human-like intelligence exhibited by machines. As a branch of computer science, AI deals with simulating the human mind in machines, and, interestingly, natural intelligence plays a key role in that effort. Today, AI is increasingly used as a broad term.

It describes machines that can mimic human functions such as learning and problem-solving. In earlier times, it was believed that human intelligence could be precisely described and that machines could simulate it with AI. Before the machine starts attempting simulation, it needs to learn from lots of data.


Artificial intelligence is now considered a new factor of production. It has the potential to introduce new sources of growth, reinvent existing businesses, and change the way we work. It also reinforces the role of people in driving growth in business.

We needed AI for real life, not just PhD or scholarly book material. In simple words, AI involves machines that behave and think like humans, i.e., algorithmic thinking in general. AI-powered computers have started simulating the human brain’s sensation, action, interaction, perception, and cognition abilities.

The real fact is that artificial intelligence is about people, not machines. Technology and non-technology companies are now investing in artificial intelligence and bringing its real, tangible value to the real world.

Machine Learning

Machine learning is simply an approach to achieving artificial intelligence. As a subset of AI, it has carried neural networks as an important ingredient for some time, and only recently have they become the focus of AI through deep learning. Machine learning is steadily becoming more accessible to developers as a tool; what we really need is MLaaS (Machine Learning as a Service) for everyone.

Artificial intelligence and machine learning are often used interchangeably, but they are not the same. Machine learning is one of the most active areas and a way to achieve AI. Why is ML doing so well today? There are several reasons, including but not limited to:

  • The explosion of big data
  • Hunger for new business and revenue streams in these shrinking business times
  • Advancements in machine learning algorithms
  • Development of extremely powerful machines with high capacity and faster computing ability
  • Storage capacity

Today’s machines are learning and performing tasks that in the past were done only by humans, like making better judgments and decisions, playing games, etc. This is possible because machines can now analyze patterns and remember what they learn for future use. Today, the major problem is finding people who are skilled enough to demonstrate and apply their learning from university and PhD books in real business, rather than just arguing with others on social media.

Machine learning should be treated as a culture in an organization where business teams, managers, and executives should have some basic knowledge of this technology. In order to achieve this as a culture, there have to be continuous programs and road shows for them. There are many courses that are designed for students, employees with little or no experience, managers, professionals, and executives to give them a better understanding of how to harness this magnificent technology in their business.

MLaaS is needed for the work of data scientists, architects, and data engineers with domain expertise. It is important for everyone to have a better understanding of the possibilities of machine learning. What is all the fuss about machine learning anyway? Can machines be creative? Can machines empathize?

AI has already started delivering value; the contemporary view of computing, exemplified by recent models and results from non-uniform complexity theory, supports this.

What can machines do, and how creative can they be? I guess that we will see this in another upcoming post “Machine Learning Evolution followed by Machine Learning Transformation”. Machine learning is (mostly) a mathematics-specific AI technique for classification, regression, and clustering.

What Machine Learning Offers!

AILabPage defines machine learning as “a focal point where business, data, and experience meet emerging technology and decide to work together”. Machine learning is responsible for assessing the impact of data. In machine learning, algorithms are used to gain knowledge from data sets. It completely focuses on algorithms.

Machine learning and data mining follow a relatively similar process. Algorithms are built through which input is received and an output value is predicted after statistical analysis. There are three general classes of machine learning: supervised, unsupervised, and reinforcement learning.

Every machine learning algorithm has three components:

  1. Representation
  2. Evaluation
  3. Optimization.

Machine learning is where the traditional statistical modelling of data meets the algorithmic and computational fields of data science. It focuses primarily on developing computer programs that can adapt if and when exposed to newer data sets.
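To make the three components above concrete, here is a minimal sketch in plain NumPy; the data is synthetic, and a straight line is only one possible representation.

```python
import numpy as np

# Synthetic data: y is roughly 3x + 1 with noise (invented for illustration).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 3 * x + 1 + rng.normal(0, 0.5, 50)

# 1. Representation: the space of candidate models, here a straight line y_hat = w*x + b.
w, b = 0.0, 0.0

# 2. Evaluation: a score that says how good a candidate (w, b) is, here mean squared error.
def mse(w, b):
    return np.mean((w * x + b - y) ** 2)

# 3. Optimization: a search over the representation guided by the evaluation,
#    here plain gradient descent.
learning_rate = 0.01
for _ in range(2000):
    error = w * x + b - y
    w -= learning_rate * np.mean(error * x)
    b -= learning_rate * np.mean(error)

print(f"learned w={w:.2f}, b={b:.2f}, mse={mse(w, b):.3f}")
```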

Artificial Neural Networks

Artificial neural networks mimic human brain cells and are inspired by the biological neuronal structure. In a biological neuron, incoming signals raise or lower the electrical potential inside the body of the receiving cell; if this graded potential reaches a threshold, the neuron fires. It is this characteristic that the artificial neuron model attempts to reproduce.

  • Our predictive artificial neural network boasts flexibility with an arbitrary number of user-defined hidden layers and nodes.
  • Drawing inspiration from biological neurons, our model mimics the intricate chemical processes involved in signal transmission through synapses.
  • The adopted biological neuron model, with slight adjustments, serves as a foundational framework for our artificial neural network in processing image and stock data.

Neural networks find great applications in data mining and pattern recognition across sectors such as economics and forensics. “ANN – Artificial Neural Systems, or Neural Networks – are physically cellular systems that can acquire, store, and utilize experiential knowledge” – Zurada (1992).
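A minimal sketch of that threshold-firing behaviour in a single artificial neuron; the inputs, weights, and threshold below are invented numbers, purely for illustration.

```python
import numpy as np

def artificial_neuron(inputs, weights, threshold):
    """Sum the weighted inputs; the neuron 'fires' (outputs 1) only if the
    resulting potential reaches the threshold, echoing the description above."""
    potential = np.dot(inputs, weights)
    return 1 if potential >= threshold else 0

# 0.9*0.5 + 0.2*(-0.3) + 0.7*0.8 = 0.95, which crosses the 0.8 threshold.
print(artificial_neuron([0.9, 0.2, 0.7], [0.5, -0.3, 0.8], threshold=0.8))  # -> 1
```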

Neural Network Architecture

There are also some specialized versions of neural networks available, such as convolutional neural networks and recurrent neural networks. These address special problem domains, like image processing and text and speech processing, and are built on deep neural nets.

For most practical purposes a simple neural network has between one and three hidden layers, although deep networks stack many more. The example below shows three hidden layers, and a minimal code sketch of the forward pass follows the list.

[Figure: a neural network with an input layer, three hidden layers, and an output layer]
  • Input layer: The activity of the input units represents the raw information that is fed into the network.
  • Hidden layer: The activity of each hidden unit is determined by the activities of the input units and the weights on the connections between the input and hidden units. There can be several hidden layers.
  • Output layer: The behavior of the output units depends on the activity of the hidden units and the weights between the hidden and output units.
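A minimal NumPy sketch of values flowing from the input layer through a hidden layer to the output layer; the layer sizes and random weights are illustrative only (one hidden layer rather than three, to keep it short).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
W_hidden = rng.normal(size=(3, 4))   # weights: input layer  -> hidden layer
W_output = rng.normal(size=(4, 2))   # weights: hidden layer -> output layer

x = np.array([0.2, 0.7, 0.1])        # input layer: the raw information fed to the network

hidden = sigmoid(x @ W_hidden)       # hidden activity: input activities x connection weights
output = sigmoid(hidden @ W_output)  # output activity: hidden activities x connection weights

print(output)
```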

In practice, deep learning methods, specifically recurrent neural network (RNN) models, are used for complex predictive analytics such as share price forecasting, which consists of several stages.
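A rough sketch of how such an RNN forecaster might be wired up with TensorFlow/Keras, assuming TensorFlow is installed; the “price” series is synthetic, and real forecasting would add the other stages (scaling, walk-forward validation, feature selection, and so on).

```python
import numpy as np
import tensorflow as tf

# Synthetic "price" series, invented for illustration only.
prices = np.sin(np.linspace(0, 20, 500)) + np.linspace(0, 1, 500)

window = 30  # predict the next value from the previous 30
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])[..., None]
y = prices[window:]

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(16, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)

print(model.predict(X[-1:], verbose=0))  # the model's guess at the next "price"
```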

Deep Learning – Mandate for Humans, Not Just Machines

This technique aims to give machines an artificial intelligence power: it teaches computers to perform tasks and understand things in much the same way it comes naturally to humans, i.e., learning by example.

Deep learning is a technique for implementing extremely powerful, much better machine learning. Its capabilities and limits make it so powerful that it can stand on its own as a separate domain. Sometimes, deep learning appears to be a supernatural power of machines.

“Deep Learning is an algorithm which has no theoretical limitations of what it can learn; the more data you give and the more computational time you provide, the better it is”

Geoffrey Hinton

Key Driver of Deep Learning

Deep learning’s main driver is the artificial neural network. Deep learning is based on multiple levels of features or representations, with the layers forming a hierarchy from low-level to high-level features.


Deep learning is the first class of algorithms that is truly scalable: performance just keeps getting better as we feed the algorithms more data. Speech, text, and image processing can give a robot a strong start, and trigger-based actions make it even better. To attain anything like ASI status, a machine would have to pass several basic tests well beyond the Turing test, e.g., acquire a college degree, work as an employee for at least 20 years, and perform well enough to earn promotions. Traditional machine learning focuses on feature engineering, whereas deep learning focuses on end-to-end learning from raw features.

The analogy to deep learning is that the rocket engine is the deep learning model and the fuel is the huge amounts of data we can feed to these algorithms.

Andrew Ng

Important elements such as decision tree training, inductive logic programming, clustering, reinforcement-based learning, and Bayesian networks are key factors in this scenario. Deep learning networks offer a significant benefit in that they have the ability to enhance their effectiveness as they encounter greater quantities of data.
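Two of those elements, decision tree training and clustering, can be sketched in a few lines with scikit-learn (assuming it is installed); the bundled iris dataset simply stands in for real business data.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

iris = load_iris()

# Decision tree training: a supervised learner fitted on labelled examples.
tree = DecisionTreeClassifier(max_depth=3).fit(iris.data, iris.target)
print("decision tree accuracy on its training data:", tree.score(iris.data, iris.target))

# Clustering: an unsupervised learner grouping the same data without any labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(iris.data)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```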

Deep Learning Computational Models

The human brain is a deep and complex recurrent neural network. Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. In very simple words, and not to confuse anyone here, we can define both models as below.

  • Feedforward propagation – a type of neural network architecture where the connections are “fed forward” only, i.e., values flow from the input layer to the hidden layer(s) to the output layer.
  • Backpropagation (supervised learning algorithm) is a training algorithm with 2 steps:
    • Feedforward the values
    • Calculate the error and propagate it back to the layer before.

In short, forward-propagation is part of the backpropagation algorithm but comes before back-propagating.
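A minimal NumPy sketch of those two steps on an invented toy problem (the XOR pattern); the network size, learning rate, and iteration count are arbitrary choices for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy XOR-style data, used only to illustrate the two steps.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input  -> hidden
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

for _ in range(10000):
    # Step 1: feed the values forward.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Step 2: calculate the error and propagate it back, layer by layer.
    d_output = (output - y) * output * (1 - output)       # error signal at the output layer
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)  # error pushed back to the hidden layer

    W2 -= lr * hidden.T @ d_output;  b2 -= lr * d_output.sum(axis=0)
    W1 -= lr * X.T @ d_hidden;       b1 -= lr * d_hidden.sum(axis=0)

print(np.round(output, 2))  # should move towards [[0], [1], [1], [0]]
```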

Data Science

Data science succinctly communicates the value of analysis and its real-life benefits to stakeholders. Data science can be thought of as the smartest, most intelligent form of BI.

  1. Data Science – Data Science is a multidisciplinary field that involves extracting insights and knowledge from structured and unstructured data using various scientific methods, processes, algorithms, and systems. It encompasses a wide range of techniques, including statistics, machine learning, data analysis, and data engineering, to uncover hidden patterns, correlations, and trends.
  2. Advanced Data Analytics – Advanced Data Analytics refers to the use of sophisticated analytical techniques and tools to analyze complex datasets, often involving predictive modeling, machine learning, and advanced statistical methods. It goes beyond traditional data analytics by employing advanced algorithms to gain deeper insights, make predictions, and optimize decision-making processes.
  3. Data Analytics – Data Analytics involves the examination of raw data to draw conclusions about the information it contains, often focusing on historical data to identify trends, patterns, and insights. It can include descriptive analytics (summarizing historical data), diagnostic analytics (identifying the causes of past outcomes), and predictive analytics (forecasting future trends).
  4. Business Intelligence (BI) – Business Intelligence is a set of technologies, processes, and tools that help organizations collect, analyze, and present business data to support decision-making processes. BI focuses on providing historical, current, and predictive views of business operations, often through dashboards, reports, and data visualization.

Successful data science requires more than just programming and coding, although in data mining the algorithms themselves are combined into the process and remain a central focus.
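A small pandas sketch of the descriptive end of that spectrum, assuming pandas is installed; the sales records are invented purely for illustration, summarising what already happened and then breaking it down as a first diagnostic step.

```python
import pandas as pd

# Invented sales records, purely for illustration.
sales = pd.DataFrame({
    "month":   ["Jan", "Jan", "Feb", "Feb", "Mar"],
    "region":  ["EU",  "US",  "EU",  "US",  "EU"],
    "revenue": [120,   95,    140,   110,   160],
})

# Descriptive analytics: summarize historical data.
print(sales.groupby("month", sort=False)["revenue"].sum())

# A first step towards diagnostic analytics: break the same history down by region.
print(sales.groupby(["month", "region"], sort=False)["revenue"].sum())
```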

Artificial Learning vs. Machine Learning vs. Deep Learning

Artificial intelligence, or so-called artificial learning, and machine learning are often used interchangeably, especially in this era of big data. Artificial intelligence is a much broader concept than machine learning and addresses the use of computers to mimic the cognitive functions of humans. Machine learning performs tasks using algorithms in an “intelligent” manner; it is a subset of AI and focuses on the ability of machines to receive a set of data and learn for themselves, changing algorithms as they learn more about the information they are processing.

  1. Artificial Learning (AL) – Artificial Learning is a broader concept that encompasses any form of machine learning, including traditional rule-based systems and heuristic approaches. AL focuses on creating systems that can perform tasks without explicit programming, allowing machines to learn from experience and improve over time.
  2. Machine Learning (ML) – Machine Learning is a subset of Artificial Learning that specifically involves the development of algorithms that enable machines to learn patterns from data and make predictions or decisions. ML algorithms can be categorized into supervised (learning from labeled data), unsupervised (learning from unlabeled data), and reinforcement learning (learning from interacting with an environment).
  3. Deep Learning (DL) – Deep Learning is a specialized form of machine learning that involves neural networks with multiple layers (deep neural networks). DL excels in tasks such as image and speech recognition and natural language processing, using hierarchical layers to automatically learn intricate features from data.

By now, we are sure you can clearly see that these terms aren’t the same, and it is important to understand how they can be applied.
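Supervised and unsupervised learning are sketched elsewhere in this post; the third category, reinforcement learning, can be illustrated with a tiny epsilon-greedy bandit. The pay-off probabilities below are invented, and this is only a sketch of the idea of learning from interaction with an environment.

```python
import random

# Two possible actions with hidden pay-off probabilities (invented for illustration).
true_reward_prob = {"A": 0.3, "B": 0.7}
estimates = {"A": 0.0, "B": 0.0}   # the agent's current beliefs
counts = {"A": 0, "B": 0}
epsilon = 0.1                      # how often to explore instead of exploit

for _ in range(1000):
    # Explore occasionally; otherwise exploit the action that currently looks best.
    if random.random() < epsilon:
        action = random.choice(["A", "B"])
    else:
        action = max(estimates, key=estimates.get)
    reward = 1 if random.random() < true_reward_prob[action] else 0
    counts[action] += 1
    estimates[action] += (reward - estimates[action]) / counts[action]  # running average

print(estimates)  # the estimate for "B" should end up close to 0.7
```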

Mixing It All Up

Why deep learning? If supremacy is the basis for popularity, then deep learning is surely almost there (at least for supervised learning tasks). Deep learning attains the highest rank in terms of accuracy when trained with huge amounts of data, as in generative AI.

  • Big Data in Data Science – While data science encompasses various scales, the increasing volume of data underscores the growing importance of big data in this field.
  • Intelligence Augmentation – The concept of thinking with machines or intelligence augmentation represents a profound evolution in epistemology and semiotics, empowering business intelligence to become competitive intelligence through augmented site-centric data.
  • Strengths of Deep Learning – Deep learning’s remarkable strength lies in its ability to uncover intricate patterns without the need for predefined features, a task that often challenges human understanding.
  • Future of AI and Robotics – Contrary to concerns about job displacement by robots and AI, the likelihood of such a scenario occurring within the next 50 years is remote. Challenges like the need for vast amounts of data and extensive training hinder the accuracy of machine learning models. Additionally, AI advancements, coupled with cloud computing, continue to enhance existing systems and open new possibilities for the future.

During my time (around 19 years ago), we had no such options to learn from, and these were not popular professions to choose; anything unusual was mainly R&D. Anyone who wished to be a data scientist, or simply a computer software engineer, was largely limited to regression techniques.

Some facts!

Investment in artificial intelligence is growing fast. Tech giants like Google, Microsoft, Apple, and Baidu, known for their global dominance in digital technologies, are spending billions of United States dollars on AI, with roughly 90% going to R&D and deployment and 10% to AI acquisitions.


Conclusion – Artificial intelligence has transformed considerably since its inception in academia and is presently an active and diverse field of research. Recently, businesses have integrated artificial intelligence into their offerings. At the moment, artificial intelligence functions with human guidance, and I firmly believe that this partnership should persist in the future. The level at which AI systems are allowed to take control from human drivers must be limited. Baidu’s speech-to-text technology has displayed superior results compared to those achieved by humans on comparable tasks, and Amazon has successfully used advanced deep-learning technology to improve the quality of its product recommendations. Are we ready to relinquish control to autonomous cars, software bots, or AI-based recommendation engines?

Points to Note:

All credits, if any, remain with the original contributor only. We have covered artificial intelligence, machine learning, neural networks, and deep learning to give a high-level understanding of the terms and their differences. In the next post, I will talk about reinforcement learning.

Feedback & Further Question

Do you have any questions about Deep Learning or Machine Learning? Leave a comment or ask your question via email. Will try my best to answer it.

Books + Other readings Referred

  • Open Internet – Research Papers and ebooks
  • Personal hands-on work with data and the experience of @AILabPage members
  • Book “Artificial Intelligence: A Modern Approach”

======================= About the Author ==========================

Read about the author at: About Me

Thank you all for spending your time reading this post. Please share your feedback, comments, criticism, and agreements or disagreements. For more details about posts, subjects, and relevance, please read the disclaimer.


By V Sharma

A seasoned technology specialist with over 22 years of experience, I specialise in fintech and possess extensive expertise in integrating fintech with trust (blockchain), technology (AI and ML), and data (data science). My expertise includes advanced analytics, machine learning, and blockchain (including trust assessment, tokenization, and digital assets). I have a proven track record of delivering innovative solutions in mobile financial services (such as cross-border remittances, mobile money, mobile banking, and payments), IT service management, software engineering, and mobile telecom (including mobile data, billing, and prepaid charging services). With a successful history of launching start-ups and business units on a global scale, I offer hands-on experience in both engineering and business strategy. In my leisure time, I'm a blogger, a passionate physics enthusiast, and a self-proclaimed photography aficionado.

