Neural Networks

Deep Learning – Introduction to Artificial Neural Networks

Artificial Neural Networks – As the name “Neural Network” suggests, they are inspired by the human brain. ANNs were originally designed with biological neurons as a reference point, which is why they are sometimes called a brain model for computers. An ANN is more of a framework than a single algorithm. ANNs can process information in the form of audio, video, images, text, numbers or any other form of data. In the early days, neurons, then known as perceptrons, were treated as simple decision functions.

The earliest artificial neural networks were designed to take several binary inputs and produce a binary output. Professor Frank Rosenblatt, who built the perceptron, was among the first to put neural networks to practical use.

 

Artificial Neural Networks (ANN) – Some background

As per Dr. Robert Hecht-Nielsen, inventor of one of the first neurocomputers, an ANN is “A computing system made up of a number of simple, highly connected processing elements, which process information by their dynamic state response to external inputs.”

AILabPage defines artificial neural networks (ANNs) as “complex computer code written with a number of simple, highly interconnected processing elements, inspired by the structure of the biological human brain, for simulating the brain’s way of working and processing data (information).” Point to note: this is very different from a traditional computer program.


The human brain needs a much smaller data set than a computer simulation (a neural network) to learn patterns and caption images. The brain is remarkably powerful and far ahead of even superfast machines when we need to recognise images or handwriting, speak different languages, or watch videos and understand and re-create them. For a machine the same skills are very complex, but ANNs are now making them possible. Humans, of course, have the advantage of using their own brains for all of this.

This post is just an overview of artificial neural networks. The idea is to explain what a neural network is, what it can do and how to use it in our machine learning challenges. It is written in simple English for readers who are new to this animal. Ph.D. scholars and experienced professionals are welcome to comment and help improve this simple paper.

 

Artificial Neural Networks (ANN) – Building Blocks

In 1943 Warren S. McCulloch and Walter Pitts published a highly simplified model of a neuron in their paper and made an important contribution to the development of artificial neural networks. ANNs consist of several parameters, hyper-parameters, and layers. Parameters and hyper-parameters help drive the output of a neural network model: weights and biases are parameters, while the number of epochs, batch size and learning rate are examples of hyper-parameters. A basic neural net consists of three types of layers, whose nodes process information:

  • Input Layer: The entry point to the neural network; this layer takes the input data, such as numbers, text, audio, video, or images as pixels.
  • Hidden Layers: Take their input from the output of the input layer. These layers do the number crunching, i.e. the mathematical operations, detecting patterns in the data that the human eye cannot see and extracting features. There can be a single hidden layer or many.
  • Output Layer: Takes input from the last hidden layer to generate the desired output. A minimal forward-pass sketch through these three layers follows this list.
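To make the three layers concrete, here is a minimal, hypothetical forward-pass sketch in Python/NumPy. The layer sizes, random weights and the sigmoid activation are illustrative assumptions, not something the post prescribes.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 3 input features, 4 hidden neurons, 1 output neuron
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(4, 3))   # weights: input -> hidden
b_hidden = np.zeros(4)               # biases for the hidden layer
W_output = rng.normal(size=(1, 4))   # weights: hidden -> output
b_output = np.zeros(1)               # bias for the output layer

x = np.array([0.5, -1.2, 3.0])       # one input sample (input layer)

hidden = sigmoid(W_hidden @ x + b_hidden)        # hidden layer: weighted sum + activation
output = sigmoid(W_output @ hidden + b_output)   # output layer
print(output)   # a value between 0 and 1, e.g. a class probability
```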

Artificial neural networks play a very important role in building deep learning models. Like the human brain, an ANN is formed of neurons that process information. ANNs offer excellent flexibility to model and simplify complex relationships between inputs and outputs. Neural networks are now also part of research fields such as neuroinformatics (a field concerned with organising neuroscience data through computational models and analytical tools).

A new term is emerging: quantum neural networks (QNNs). These networks are specifically designed to run on quantum processes for much faster results. The role of quantum computing in AI and its subdomains such as machine learning and deep learning is not yet fully understood. Today it appears as if quantum computing could be the answer to many of the problems we face in data science, in particular training specific algorithms faster on huge data sets. For neural networks, quantum computing looks even more promising because of their complexity and resource-hungry nature.

As of now this may sound theoretical; I will write a detailed post on quantum neural networks (QNNs) soon. In my personal view, the biggest beneficiary of QNNs would be the backpropagation algorithm, for various reasons. Backpropagation is an algorithm used in supervised learning to compute the gradient of the error function with respect to the weights (the delta rule), and it is the standard way of training artificial neural networks. The whole idea of training a multi-layer perceptron is to compute these derivatives of the error with respect to the weights using backpropagation and then update the weights by gradient descent.
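As a rough illustration of the delta rule described above, here is a minimal, hypothetical single training step for a one-hidden-layer network in NumPy. The squared-error loss, sigmoid activations, layer sizes and learning rate are assumptions made purely for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # hidden -> output
lr = 0.1                                        # learning rate (assumed)

x = np.array([0.2, 0.7, -0.4])   # one training sample
y = np.array([1.0])              # its target label

# Forward pass
h = sigmoid(W1 @ x + b1)
y_hat = sigmoid(W2 @ h + b2)

# Backward pass: derivatives of the squared error 0.5 * (y_hat - y)^2
delta_out = (y_hat - y) * y_hat * (1 - y_hat)      # error signal at the output layer
delta_hidden = (W2.T @ delta_out) * h * (1 - h)    # error signal propagated to the hidden layer

# Gradient-descent update of weights and biases
W2 -= lr * np.outer(delta_out, h)
b2 -= lr * delta_out
W1 -= lr * np.outer(delta_hidden, x)
b1 -= lr * delta_hidden
```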

 

Artificial Neural Networks (ANN) – Answer to Many Questions

The concept of artificial neural networks came out of the working model of the human brain, as an inspiration for letting machines read and generate pictures, translate text, and process video and sound. Today we have working neural network models that can answer questions through voice processing, recognise handwriting, watch and interpret videos, and understand and predict our behaviour.

They can alert security officials and send the location of a wanted person through face detection and analysis of body language. Neural networks sit at the foundation of many of the biggest breakthroughs in artificial intelligence. Yes, they can also drive us around in our cars.

Neural networks use supervised as well as unsupervised machine learning data sets and methodologies to achieve results. Supervised learning is the more suitable approach when neural networks are used for classification problems, such as deciding whether an email is spam or not. People are often confused by the term deep neural network, which is simply an artificial neural network with multiple layers between the input and output layers.


The question here can be: how do neural networks manage to do all this? The answer is simple and complex at the same time, as it all boils down to the data we generate. These networks read, understand and find patterns in our data to learn and fine-tune themselves. The more data they process, the smarter they become over time. A neural network requires extensive training before it can be deployed in a real-time problem-solving environment. Once trained, neural networks can:

  • Predict our behaviour and become experts at it.
  • Learn to understand and produce our language, text, and handwriting.
  • Watch videos to learn the facial movements for pronouncing different words, and create new videos.

Neural networks can be applied to almost any problem: identifying signatures on bank cheques, clustering spoken words of different languages, detecting fraudulent credit card transactions, and even filtering unwanted email. The point to note here is that neural networks are a very effective and powerful tool for producing an output for almost any problem, but keep in mind that just because you have a new hammer, it doesn’t mean that everything is a nail (a phrase borrowed from the internet). We still need to understand the problem, design the solution on paper and strategise our path to solving the problem at hand.

 

Artificial Neural Networks – Types and Kinds

ANNs learn, get trained and adjust automatically, just as we humans do. Although ANNs are inspired by the human brain, in fact they operate on a far simpler plane. The structure of biological neurons is now borrowed for machine learning, hence the name artificial neural networks. This development has helped solve many problems, especially those where layering is needed for refinement and granular detail.

There are several kinds of neural networks in deep learning. Which one to pick depends entirely on the data available for training and the end goal; challenges such as voice recognition might need a combination of networks. All neural networks consist of input and output layers and at least one hidden layer.

One of the most widely used types of neural network is the recurrent neural network (RNN). In an RNN, data does not flow in only one direction: the output of each step is fed back as input to the next, which gives the network a form of memory. These networks are employed for highly complex tasks such as voice recognition, and handwriting and language recognition. The abilities of RNNs are thus almost limitless. A minimal sketch of a single recurrent step is shown below.
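As a rough illustration of the feedback loop described above, here is a minimal, hypothetical recurrent step in NumPy. The sizes, tanh activation and random weights are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)
# Illustrative sizes: 5 input features, 8 hidden (state) units
W_xh = rng.normal(size=(8, 5))   # input -> hidden weights
W_hh = rng.normal(size=(8, 8))   # hidden -> hidden (recurrent) weights
b_h = np.zeros(8)

def rnn_step(x_t, h_prev):
    # The new hidden state depends on the current input AND the previous state,
    # which is what gives the network its memory.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = np.zeros(8)                       # initial state
for x_t in rng.normal(size=(3, 5)):   # a toy sequence of 3 time steps
    h = rnn_step(x_t, h)
print(h)                              # final state summarising the sequence
```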


A neural network based on radial basis functions can also be used for specific purposes. There are several other neural network models beyond the ones mentioned above. For an introduction to neural networks and their working model, continue reading this post; you will get a sense of how they work and how they are used for real mathematical problems.

In short, ANNs are designed to make computers process information the way the human brain does. ANNs became popular only recently because in the past we had neither the computing power nor the amount of data required to train them.

 

Artificial Neural Networks Components

Neural networks also underpin evolution strategies, which are a highly scalable alternative to deep reinforcement learning. A deep neural network is fed raw data and learns to identify the objects it is trained, or “being trained”, on. From an architectural point of view, artificial neural networks have three components.

  1. Model Topology (Connections) – describes the layers of neurons and the structure of the connections between them.
  2. Activation Function (Transfer Function) – the function applied by each artificial neuron.
  3. Learning Algorithm – used to find the ideal values of the weights. A minimal sketch tying these three components together follows this list.
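To show how the three components map onto code, here is a minimal, hypothetical sketch using TensorFlow/Keras. The library choice, layer sizes, activations, optimiser and loss are assumptions made for illustration, not something the post prescribes.

```python
from tensorflow import keras
from tensorflow.keras import layers

# 1. Model topology: 10 input features, one hidden layer, one output neuron
model = keras.Sequential([
    layers.Input(shape=(10,)),
    layers.Dense(16, activation="relu"),     # 2. activation function of the hidden layer
    layers.Dense(1, activation="sigmoid"),   #    activation function of the output layer
])

# 3. Learning algorithm: stochastic gradient descent minimising a chosen loss
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.01),
              loss="binary_crossentropy",
              metrics=["accuracy"])

model.summary()   # prints the topology and the number of trainable weights
```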

Each “neuron” is a relatively simple element, for example summing its inputs and applying a threshold to the result to decide its output. Deep neural networks are the first family of machine learning algorithms that do not require manual feature engineering; instead, they learn high-level features on their own by processing raw data.

The data fed into a neural network (hopefully) reveals some patterns. On the downside, the neural network works as a complete mystery, a black box: it can find patterns in the data, but it never tells us how or why they exist. Because the network cannot explain itself, we still need a data scientist to do that job.

 

Neural Network Working

Deep learning uses neural networks as the foundation of its working architecture. A neural network borrows the idea of the working human brain as its basic model, and its basic unit is the neuron. Like the human brain, an artificial neural network mimics a similar information-processing model:

  • Just as the brain sees a picture through the eyes and combines its features, the neural network takes inputs as linear combinations through its input neurons and combines them for processing.
  • The brain does something and produces an output effortlessly; similarly, ANNs do some processing and produce an output. An ANN can show that there are patterns and useful information in the data, but it cannot tell how or why.

To the inputs, taken as a linear combination, we need to apply some function; this is called the activation function and it shapes the desired output. Some common examples, sketched in code after this list, are:

  • Sigmoid Function – squashes any input so that the output lies between 0 and 1.
  • ReLU Function – the Rectified Linear Unit, often a better choice than the “tanh” function and the most commonly used activation function. It returns 0 for negative inputs and passes positive inputs through unchanged.
  • Tanh Function – squashes any input so that the output lies between -1 and 1.
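The three activation functions above can be written in a few lines of NumPy. This is a minimal illustrative sketch, not tied to any particular library.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # output in (0, 1)

def relu(z):
    return np.maximum(0.0, z)         # 0 for negatives, identity for positives

def tanh(z):
    return np.tanh(z)                 # output in (-1, 1)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z))   # approx. [0.12 0.38 0.50 0.62 0.88]
print(relu(z))      # [0.  0.  0.  0.5 2. ]
print(tanh(z))      # approx. [-0.96 -0.46 0.00 0.46 0.96]
```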

After the explanation above, we can now say in one line: “A neural network is a complicated arrangement of neuron-like structures.” It has excellent flexibility to model and simplify complex relationships between input and output. What happens in the hidden layers, though, remains hidden.


Deep Learning or Machine Learning

Choosing between ML and DL depends mainly on the problem and the data at hand. Some high-level differences are listed below. Should you want to read about this in detail, I suggest the post “Demystifying the difference between Machine Learning and Deep Learning“.

  • ANNs – In ANNs, learning is based on multiple layers of features or representations, with the layers forming a hierarchy from low-level to high-level features. It may sound far-fetched, but remember that humans are neurologically hard-wired in much the same way. The focus of ANNs is on end-to-end automatic learning from raw features.


  • Machine learning – In traditional ML the focus is on feature engineering. Traditional machine learning creates train/test splits of the data wherever possible and relies on cross-validation.

In a scenario where a deep understanding of medical symptoms is required for disease detection, high performance and accuracy are crucial. In such a case, deep learning would be a better choice than traditional machine learning.

 

Frequently used Jargons in Deep Learning

  • Perceptron – a single-layer neural network and a linear classifier, used in supervised learning. Its computing structure is based on the design of the human brain: the algorithm takes a set of inputs and returns an output. A minimal perceptron training sketch appears after this list.
  • Multilayer Perceptron (MLP) – a feedforward neural network with multiple fully-connected layers that uses nonlinear activation functions to deal with data that is not linearly separable.
  • Deep Belief Network (DBN) – a type of probabilistic graphical model that learns a hierarchical representation of the data in an unsupervised manner.
  • Deep Dream – a technique invented by Google that tries to visualise the patterns captured by a deep convolutional neural network.
  • Deep Reinforcement Learning (DRL) – a powerful and exciting area of AI research with potential applicability to a variety of problems. Other common terms in this area are DQN and Deep Deterministic Policy Gradients (DDPG).
  • Deep Neural Network (DNN) – a neural network with many hidden layers. There is no hard definition of how many layers a network needs to count as deep; usually a minimum of 3-5 or more.
  • Recursive Neural Networks – think of them as deep tree-like structures. When the need is to parse a whole sentence, we use a recursive neural network; its tree-like topology allows branching connections and hierarchical structure. A fair question is how a recursive neural network differs from a recurrent neural network.
  • Recurrent Neural Networks – the abilities of RNNs are almost limitless; take care not to confuse recursive and recurrent NNs. Their structure is part of what enables artificial intelligence, machine learning and supercomputing to flourish. These networks are used for language translation, face recognition, picture captioning, text summarisation and many more tasks.
  • Convolutional Neural Networks – CNNs are an excellent tool and one of the most celebrated achievements in deep learning. CNN-based deep learning currently receives a great deal of hype, attention and focus from all players in the business. The two core concepts of convolutional neural networks are convolution (hence the name) and pooling.
  • Generative Adversarial Networks – GANs are a class of algorithms used in unsupervised learning. As the name suggests, they are adversarial because they are made up of two competing neural networks, a generator and a discriminator. The two networks are assigned different roles and compete with each other in a zero-sum game.
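As a small companion to the Perceptron entry above, here is a minimal, hypothetical NumPy sketch that trains a single-layer perceptron on the logical AND function. The learning rate, step activation and toy data are illustrative assumptions.

```python
import numpy as np

# Training data for logical AND: inputs and their binary labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate (assumed)

def predict(x):
    # Step activation: fire (1) if the weighted sum crosses the threshold
    return 1.0 if (w @ x + b) > 0 else 0.0

# Classic perceptron learning rule: adjust weights by the prediction error
for _ in range(20):
    for x_i, y_i in zip(X, y):
        error = y_i - predict(x_i)
        w += lr * error * x_i
        b += lr * error

print([predict(x_i) for x_i in X])   # expected: [0.0, 0.0, 0.0, 1.0]
```

Because AND is linearly separable, the perceptron learning rule converges to a correct decision boundary after a handful of passes over the data.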

 

Points not covered here

  • Adding weights
  • Sigmoid neurons/functions
  • How to give neurons an activation bias
  • Determining the activation level

 

Points to Note:

When to use artificial neural networks as opposed to traditional machine learning algorithms is a complex question to answer; it depends entirely on the problem at hand. One needs patience and experience to arrive at the correct answer. All credits, if any, remain with the original contributors. The next post will cover recurrent neural networks in detail.

 

Books Referred & Other material referred

  • Open Internet research, news portals and white papers reading
  • Lab and hands-on experience of  @AILabPage (Self-taught learners group) members.
  • Learning through
    • Live webinars
    • Conferences, Lectures & Seminars
    • AI Talkshows

 

Feedback & Further Question

Do you have any questions about deep learning or machine learning? Leave a comment or ask your question via email; I will try my best to answer it.

 

Conclusion – We have seen in the post above that ANNs don’t create or invent any new information or facts; they help us make sense of what is already available in hidden form. A neural network takes an empirical approach to a massive amount of data to give the best, near-accurate results.

Deep learning, in short, goes well beyond machine learning and its supervised or unsupervised algorithms. DL uses many layers of nonlinear processing units for feature extraction and transformation. It has revolutionised today’s industries by demonstrating near human-level accuracy in tasks such as pattern recognition, image classification, and voice and text decoding. The self-driving car is one of the best examples and biggest achievements so far.

============================ About the Author =======================

Read about Author at : About Me

Thank you all for spending your time reading this post. Please share your opinions, comments, criticism, agreement or disagreement. For more details about posts, subjects and relevance, please read the disclaimer.

FacebookPage    ContactMe      Twitter

====================================================================
