Demystifying key buzzwords like artificial intelligence, machine learning, artificial neural networks and deep learning is a simple and a complex task at the same time.
Let's Understand a Bit
Let us attempt to melt down the thick confusion around how all-encompassing terms like artificial intelligence, machine learning, and deep learning relate to each other. Machine learning, blockchain and artificial intelligence are all golden words these days. Almost every technology (and now even non-technology) company on the planet is claiming its share of extra revenue by putting these buzzwords on display.
With all the buzzwords swirling around, it's easy to get lost and miss the difference between hype and reality.
Key buzzwords of today's technology!
All these terms are representative of the future of analytics. Sometimes it is good to un-develop something existing to uncover the hidden gems underneath. Maybe it is a case of un-developing to innovate?
In 1950 Alan Turing proposed the "Turing Test", which speculates on the possibility of creating machines that think. To pass the test, a computer must be able to carry on a conversation that is indistinguishable from a conversation with a human being.
The definition of AI also includes capabilities like planning, understanding language, recognizing objects and sounds, learning, and problem solving.
Confusing Jargon – AI, ML and many more
Demystifying confusing jargon like AI, ML, data science, DL and ANN without scientific reasoning and meaning can create huge understanding gaps.
“Artificial neural systems, or neural networks, are physical cellular systems which can acquire, store, and utilize experiential knowledge” – Zurada (1992). Like this, each term has its own meaning and use cases. One needs to be careful about when to use which term, and for what reasons.
Algorithm – The set of instructions!
An algorithm follows a set of rules to solve a problem; in short, an algorithm is a set of rules or instructions. In machine learning, algorithms are the key elements that take in data and process it against those rules to produce responses. It is this processing that makes an algorithm complex or simple. One thing is clear: the more data an algorithm gets, the stronger it becomes over a period of time.
Algorithms are supposed to work much faster, more accurately and more resiliently than any human in order to demonstrate their capabilities. No algorithm can be considered good or bad in itself, but it can certainly be data-greedy or resource-hungry. Algorithms need to be trained to learn how to classify and process information.
The efficiency and accuracy of an algorithm depend on how much data is fed to it to train it well. Using an algorithm to calculate something does not automatically mean machine learning or AI is being used.
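To make that last distinction concrete, here is a minimal Python sketch (purely illustrative names and data, standard library only): a plain fixed-rule algorithm next to a "learning" algorithm that estimates its own rule from example data.

```python
# A plain algorithm: a fixed set of rules, no learning involved.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5.0 / 9.0

# A "learning" algorithm: it estimates its rule (slope and intercept)
# from example data instead of having the rule hard-coded.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Feed it examples of the conversion and it recovers the rule itself.
fahrenheit = [32.0, 50.0, 68.0, 86.0, 104.0]
celsius = [fahrenheit_to_celsius(f) for f in fahrenheit]
slope, intercept = fit_line(fahrenheit, celsius)
print(slope, intercept)  # close to 5/9 and -160/9
```

The first function is "just an algorithm"; only the second one learns anything from data, and its accuracy depends entirely on the examples it is fed.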
So what is needed?
It is all about time, and the right learning algorithm makes all the difference. Let me use the examples below to succinctly demystify the basic concept of artificial intelligence.
- For a simple mathematical calculation, such as the product of two given numbers, any computer can beat any human in terms of speed and accuracy.
- On the other hand, when identifying whether an image shows a dog or a cat, most humans (even a four-year-old kid) will easily outperform computers.
It is in scenarios like the second example that artificial intelligence has a critical role to play and can show its power.
Simulating the mapping of inputs to outputs as it happens in a human brain is what makes tasks like image recognition, sarcasm detection and voice recognition so difficult for computers.
Let's talk about Artificial Intelligence
Artificial intelligence is human intelligence exhibited by machines. AI is increasingly used today as a wide term: it describes machines that can mimic human functions such as learning and problem solving. In earlier times it was believed that if human intelligence could be precisely described, machines could simulate it with AI. Before a machine starts attempting such simulation, it needs to learn from lots of data.
Artificial intelligence is now considered a new factor of production. It has the conceivable potential to introduce new sources of growth, reinvent existing businesses and change the style of work. It also reinforces the role of people in driving business growth.
We need AI for real life, not just for PhD theses or scholarly books. In simple words, AI involves machines that behave and think like humans, i.e. algorithmic thinking in general. AI-powered computers have started simulating the human brain's abilities of sensation, action, interaction, perception and cognition.
AI has already started delivering value, a fact supported by the contemporary view of computing exemplified by recent models and results from non-uniform complexity theory.
Some questions, answers and facts!
Investment in artificial intelligence is growing fast. Tech giants like Google, Microsoft, Apple and Baidu, known for their dominance in digital technologies globally, are spending a couple of billion US dollars on AI. Around 90% of this goes to R&D and deployment, and 10% to AI acquisitions.
Are we ready to relinquish control to autonomous cars, software bots, or trust AI-based recommendation engines?
Machine Intelligence or Machine Learning
Machine learning is simply one approach to achieving artificial intelligence: a subset of AI. It has carried neural networks as an important ingredient for some time, but only recently have they become the focus of AI through deep learning. It is steadily becoming more accessible to developers as a tool. What we need is simply MLaaS (Machine Learning as a Service) for everyone.
MLaaS is needed for the work of data scientists, architects, and data engineers with domain expertise. It is important for everyone to have a better understanding of the possibilities of machine learning. What is all the fuss about machine learning anyway? Can machines be creative? Can machines empathise?
What can machines do, and how creative can they be? I guess we will see in an upcoming post, "Machine Learning Evolution followed by Machine Learning Transformation". Machine learning is (mostly) a mathematics-specific AI technique for classification, regression and clustering.
What Machine Learning has!
Machine learning is responsible for assessing the impact of data. In machine learning, algorithms are used to gain knowledge from data sets; the field is completely focused on algorithms.
Machine learning is where the traditional statistical modeling of data meets the algorithmic and computational field of data science. It focuses primarily on the development of computer programs that can adapt if and when exposed to newer sets of data.
Machine learning and data mining follow relatively similar processes: algorithms are built that receive input and, after statistical analysis, predict an output value. There are three general classes of machine learning —
- Supervised machine learning
- Unsupervised machine learning
- Reinforcement learning
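As a toy illustration of the first class, here is a minimal supervised-learning sketch in plain Python: a 1-nearest-neighbour classifier over made-up (height, weight) examples. The data, labels and numbers are purely illustrative.

```python
# Supervised learning in miniature: a 1-nearest-neighbour classifier.
# Labelled (illustrative) examples: (height_cm, weight_kg) -> label.
training_data = [
    ((25, 4), "cat"), ((24, 5), "cat"), ((23, 4), "cat"),
    ((60, 25), "dog"), ((55, 22), "dog"), ((65, 30), "dog"),
]

def distance(a, b):
    # Plain Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict(point):
    # Copy the label of the closest labelled training example.
    nearest = min(training_data, key=lambda ex: distance(ex[0], point))
    return nearest[1]

print(predict((26, 5)))   # near the "cat" cluster
print(predict((58, 24)))  # near the "dog" cluster
```

The "supervision" is simply the labels attached to the training examples; unsupervised learning would have to find the two clusters without them.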
Every machine learning algorithm has three components: Representation, Evaluation and Optimization.
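These three components can be sketched in a few lines of plain Python: a one-parameter linear model as the representation, mean squared error as the evaluation, and gradient descent as the optimization. The data is a made-up y = 2x relationship used only for illustration.

```python
# Representation: a one-parameter linear model, y_hat = w * x.
# Evaluation: mean squared error between predictions and targets.
# Optimization: plain gradient descent on the single weight w.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # made-up data following y = 2x

def mse(w):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w = 0.0
learning_rate = 0.01
for _ in range(1000):
    # Gradient of the MSE with respect to w.
    grad = sum(2.0 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= learning_rate * grad

print(w, mse(w))  # w converges towards 2.0, error towards 0
```

Swapping any one of the three (a deeper model, a different loss, a different optimizer) gives a different learning algorithm, which is exactly why the three-way decomposition is useful.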
Artificial Neural Networks
Mimic human brain cells — inspired by biological neuronal structure. In a receiving cell, the effect of incoming signals is to raise or lower the electrical potential inside the cell body. If this graded potential reaches a threshold, the neuron fires. It is this characteristic that the artificial neuron model attempts to reproduce.
In a biological neuron, the transmission of a signal from one neuron to another through synapses is a complex chemical process in which specific transmitter substances are released from the sending side of the junction. The biological neuron model is widely used in artificial neural networks, with some minor modifications.
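The threshold behaviour described above can be sketched as a single artificial neuron in plain Python. The weights and threshold below are hand-picked for illustration; they happen to make the unit behave like a logical AND gate.

```python
# A single artificial neuron: weighted sum of inputs, then a threshold.
# If the combined "potential" reaches the threshold, the unit fires (1).
def neuron(inputs, weights, threshold):
    potential = sum(i * w for i, w in zip(inputs, weights))
    return 1 if potential >= threshold else 0

# Hand-picked illustrative weights: the unit fires only when
# both inputs are on, behaving like a logical AND gate.
weights = [0.6, 0.6]
threshold = 1.0
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights, threshold))
```

Training a network amounts to adjusting such weights automatically rather than picking them by hand.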
The artificial neural networks we train for the prediction of image and stock data have an arbitrary number of hidden layers, and an arbitrary number of hidden nodes in each layer, both of which the user decides at run-time.
Neural networks find great application in data mining across sectors such as economics and forensics, and in pattern recognition.
A neural network may contain the following 3 layers:
- Input layer – The activity of the input units represents the raw information that is fed into the network.
- Hidden layer – The activity of each hidden unit is determined by the activities of the input units and the weights on the connections between the input and the hidden units. There can be several hidden layers.
- Output layer – The behaviour of the output units depends on the activity of the hidden units and the weights between the hidden and output units.
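A single forward pass through such a three-layer network can be sketched in plain Python. The weights and biases below are arbitrary illustrative values, not learned ones.

```python
import math

# One forward pass through a tiny input -> hidden -> output network.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each unit: squashed weighted sum of the previous layer's activity.
    return [sigmoid(sum(i * w for i, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

# Illustrative weights; in a real network these are learned by training.
hidden_w = [[0.5, -0.2], [0.3, 0.8]]  # 2 input units -> 2 hidden units
hidden_b = [0.0, -0.1]
output_w = [[1.0, -1.0]]              # 2 hidden units -> 1 output unit
output_b = [0.2]

x = [1.0, 0.5]                         # input layer: the raw information
hidden = layer(x, hidden_w, hidden_b)  # hidden layer activity
output = layer(hidden, output_w, output_b)
print(hidden, output)
```

Each layer's activity depends only on the previous layer's activity and the connecting weights, exactly as the three bullet points describe.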
Deep Learning
Deep learning is a technique for implementing extremely powerful and much better machine learning (that is why I claim ML and DL are not the same). At the same time, in my personal opinion, it is wrong to simply equate deep learning with machine learning: a technique exists to achieve a goal, and the two did not necessarily come out of the same goal.
Deep learning's main drivers are artificial neural network systems, or neural networks (neural nets). Some specialized versions are also available, such as convolutional neural networks and recurrent neural networks, which address special problem domains. Two of the best, and unique, use cases for deep learning are image processing and text/speech processing, both based on methodologies like deep neural nets.
In practice, deep learning methods, specifically recurrent neural network (RNN) models, are used for complex predictive analytics such as share price forecasting, which consists of several stages. Machine learning more broadly also includes decision tree learning, inductive logic programming, clustering, reinforcement learning, and Bayesian networks, among others.
Deep learning is the first class of algorithms that is truly scalable: performance just keeps getting better as we feed the algorithms more data. Speech/text and image processing can make a perfect robot to start with, and actions based on triggers make it the best. Such a system would have to pass four basic tests, starting with the Turing test and moving to stronger ones (e.g. acquiring a college degree, or working as an employee for at least 20 years and performing well enough to get promotions), before attaining ASI status.
Data science succinctly communicates the value of analysis and its real-life benefits to stakeholders. Data science is the smartest, most intelligent BI, i.e. IBI, doing the laborious task of "solving intelligence" with a range of tools and specialty areas. Fundamentally, solving intelligence means the artificial seeking and production of knowledge that answers questions, whether they were asked or not, known or unknown.
Successful data science requires more than just programming and coding. In data mining, by contrast, algorithms are merely combined as part of a process, and that process is the entire focus.
Artificial Intelligence vs. Machine Learning vs. Deep Learning
Artificial intelligence (the so-called artificial learning) and machine learning are often used interchangeably, especially in this era of big data. By now we are sure you can clearly make out that these terms aren't the same, and it is important to understand how each can be applied.
As mentioned above, artificial intelligence is a much broader concept than machine learning; it addresses the use of computers to mimic the cognitive functions of humans. In machine learning, tasks are carried out by algorithms in an "intelligent" manner. Machine learning is a subset of AI and focuses on the ability of machines to receive a set of data and learn for themselves, changing their algorithms as they learn more about the information they are processing.
“Deep Learning is an algorithm which has no theoretical limitations of what it can learn; the more data you give and the more computational time you provide, the better it is” – Geoffrey Hinton (Google)
Mixing It All Up
Data science does not necessarily involve big data, but the fact that data is scaling up makes big data an important aspect of data science. On the other hand, thinking with machines, i.e. intelligence augmentation, is serious evolutionary epistemology and semiotics. It has extreme potential to take business intelligence towards competitive intelligence that can infer competitive measures using augmented site-centric data.
One of the greatest strengths of deep learning is that it does not require predefined features to find peculiar patterns that humans would always struggle with, or probably would never be able to define beforehand. Robots or AI taking our jobs is still far off, not within the next 50 years or so.
We have excellent young professionals working today with the dream of changing the world. Their dreams go on to bring good value to society, to business and to the people they work with.
In my time (around 19 years back) we had no such option to learn or to choose such popular professions; it was mainly R&D for anything unusual. Anyone wishing to be a data scientist, or simply an engineer in computer software, was forced to use regression techniques.
Robots can fail; the reason is simple and well known: machine learning models are not sufficiently accurate, and cannot be accurate, without lots of data and lots of training. Artificial intelligence together with cloud computing adds advancements, using new use cases to improve the systems developed so far.
Conclusion – Artificial intelligence is a broad and active area of research, but it's no longer the sole province of academics; increasingly, companies are incorporating AI into their products. For now AI is controlled by humans, and I hope that in the long term it remains that way, i.e. that it never starts, or thinks, to control us, and never turns out to be uncontrollable. Baidu's speech-to-text services are outperforming humans in similar tasks, and Amazon is applying deep learning for best-in-class product recommendations.
======================= About the Author =======================
Read about Author at : About Me
Thank you all for spending your time reading this post. Please share your feedback / comments / critiques / agreements or disagreements. For more details about posts, subjects and relevance, please read the disclaimer.
Categories: Artificial Intelligence