Neural Networks

How Neural Network Algorithms Work: An Overview

Neural Network Algorithms – In this post we explore how neural networks work behind the scenes. Do neural networks work like the human brain? Arguably, the answer is both yes and no. Conceptually, artificial neural networks are inspired by the neural networks in the brain, but the actual implementations used in machine learning are far removed from that reality. At their core, they take in multiple inputs and produce a single output.

This is a high-level, introductory post meant to give a simple glimpse of neural network algorithms. It is part 2 of our previous post – Artificial Neural Networks – Everything You Need To Know.

 

Artificial Neural Networks – What Are They

AILabPage defines artificial neural networks (ANNs) as “biologically inspired computing code with a number of simple, highly interconnected processing elements that simulate the way the human brain processes information”. They are quite different from ordinary computer programs, though. There are several kinds of neural networks in deep learning. A neural network consists of an input layer, an output layer, and at least one hidden layer.

Neural networks based on radial basis functions can be used for strategic applications, and there are several other models of neural networks besides the ones mentioned above. For an introduction to neural networks and their working model, continue reading this post. You will get a sense of how they work and how they are used for real mathematical problems.

Brain Neuron vs ANN

ANNs learn, get trained, and adjust automatically, much as we humans do. Although ANNs are inspired by the human brain, they in fact run on a far simpler plane. The structure of biological neurons is borrowed for machine learning, which is why we speak of artificial neurons. This development has helped solve many problems, especially those where layering is needed for refinement and granular detail.

 

Neural Network Architecture

Neural networks consist of input, hidden, and output layers. Their main job is to transform input into valuable output. They are an excellent example of mathematical constructs. Information flows through a neural network in two ways.

  • Feedforward Networks – Signals travel in only one direction, towards the output layer, without any loops. These networks are used extensively in pattern recognition. A feedforward network has a single input layer and a single output layer, and can have zero or more hidden layers. It operates in two common modes, as below (see the sketch after this list):
    • At the time of its learning, or “being trained”
    • At the time of operating normally, or “after being trained”
  • Feedback Networks – Recurrent or interactive networks can use their internal state (memory) to process sequences of inputs. Signals can travel in both directions, with loops in the network. For now they are mostly limited to time-series/sequential tasks. This is the closer match to the typical human brain model.
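To make the feedforward flow concrete, here is a minimal sketch of a single forward pass through a tiny network. It is purely illustrative and not from the original post: the layer sizes, the random weights, and the use of NumPy are all assumptions.

```python
import numpy as np

def sigmoid(z):
    # Squash values into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sizes: 3 inputs -> 4 hidden units -> 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # hidden -> output weights

x = np.array([0.5, -1.2, 0.8])        # one input example
hidden = sigmoid(W1 @ x + b1)         # signals move forward to the hidden layer
output = sigmoid(W2 @ hidden + b2)    # and forward again to the output layer
print(output)
```

Notice there are no loops: each layer only feeds the next one, which is exactly what makes this a feedforward network.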

 

Architectural Components

  • Input Layers, Neurons, and Weights – The basic unit in a neural network is called a neuron or node. Each unit receives input from an external source or from other nodes. The idea is to compute an output from the inputs and their associated weights; a weight is assigned to each input based on its relative importance compared with the other inputs. Finally, a function is applied to this weighted sum to produce the output.
    • Let’s assume our task is to make tea, so our ingredients represent the “neurons” or input neurons, as they are the building blocks or starting points. The amount of each ingredient is its “weight.” Putting tea, sugar, spices, milk, and water into a pan and mixing them transforms the mixture into another state and colour. This process of transformation can be called an “activation function”.
  • Hidden Layers and Output Layers – The hidden layer is always isolated from the external world, which is why it is called hidden. Its main job is to take inputs from the input layer, perform the calculation, and pass the result on to the output nodes. A group of hidden nodes is called a hidden layer.
    • Continuing the same example – In our tea-making task, the mixture of ingredients coming out of the input layer starts changing colour as it heats up (the computation process). The layers made up of the intermediate products are the “hidden layers”. Heating can be compared with the activation process, and at the end we get our final tea as the output. A small sketch of a single artificial neuron follows below.
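Here is a minimal sketch of that single-neuron computation: multiply each input by its weight, sum them with a bias, and apply an activation function. The specific values and the bias term are illustrative assumptions, not taken from the post.

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, followed by a sigmoid activation
    z = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-z))

# Three "ingredients" (inputs) with different importance (weights)
inputs = np.array([0.2, 0.7, 0.1])
weights = np.array([0.9, 0.3, 0.5])
print(neuron(inputs, weights, bias=0.1))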

The network described here is much simpler than the ones you will find in real life, to keep the explanation easy to follow. All computations in the forward-propagation and backpropagation steps are done in the same way (at each node) as discussed above.

 

Behind The Scenes – Neural Networks Algorithms

There are many different algorithms used to train neural networks, each with many variants. Let’s visualise an artificial neural network (ANN) to get a fair idea of how neural networks operate. By now we all know there are three layers in a neural network:

  • The input layer
  • The hidden layer
  • The output layer

ANN Picture

We outline a few main algorithms to build a basic understanding and a big picture of what goes on behind the scenes in these excellent networks. In a neural network almost every neuron influences and is connected to the others, as seen in the picture above. The five methods below are commonly used in neural networks; a combined sketch follows the list.

  • Feedforward algorithm
  • Sigmoid – a common activation function
  • Cost function
  • Backpropagation
  • Gradient descent – applying the learning rate
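To show how these five pieces fit together, here is a minimal training-loop sketch on a toy XOR-style dataset. It is an illustrative assumption of one common setup (sigmoid activations, a mean-squared-error cost, and plain gradient descent in NumPy), not the post’s own implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_deriv(a):
    # Derivative of the sigmoid, expressed in terms of its output a
    return a * (1.0 - a)

# Toy data: learn an XOR-like mapping (purely illustrative)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output
lr = 0.5                                             # learning rate

for epoch in range(5000):
    # 1. Feedforward
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # 2. Cost function (mean squared error)
    cost = np.mean((output - y) ** 2)

    # 3. Backpropagation: propagate the error backwards through the layers
    d_output = (output - y) * sigmoid_deriv(output)
    d_hidden = (d_output @ W2.T) * sigmoid_deriv(hidden)

    # 4. Gradient descent: apply the learning rate to update the weights
    W2 -= lr * hidden.T @ d_output
    b2 -= lr * d_output.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

print("final cost:", cost)
```

Each iteration runs the feedforward pass, measures the cost, backpropagates the error, and applies the learning rate to update the weights.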

 

Convolutional Neural Networks

Convolutional Neural Networks (CNNs) are one of the most advanced achievements in deep learning. It is largely because of CNNs that deep learning got hyped and received so much attention and focus from all players in the business. The two core concepts here are convolution and pooling.

Why do we need CNNs rather than just using feedforward neural networks? If you read the post on “Convolutional Neural Networks”, you will find the answer. A small sketch of convolution and pooling follows below.
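As a minimal sketch of the two core concepts, here is an illustrative NumPy version of a single convolution pass followed by max pooling. The tiny 6x6 “image”, the edge-style kernel, and the 2x2 pooling window are all assumptions made for the example.

```python
import numpy as np

def convolve2d(image, kernel):
    # Valid convolution: slide the kernel over the image and sum the products
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Non-overlapping max pooling: keep the strongest response per window
    h, w = feature_map.shape
    cropped = feature_map[:h - h % size, :w - w % size]
    return cropped.reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.random.default_rng(0).random((6, 6))           # toy 6x6 "image"
kernel = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]])   # edge-like filter
feature_map = convolve2d(image, kernel)                    # 4x4 feature map
pooled = max_pool(feature_map)                             # 2x2 after pooling
print(pooled.shape)
```

In a real CNN, many such kernels are learned during training, and the convolution/pooling pair is stacked layer after layer.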

 

Generative Adversarial Networks

A very young member of the deep neural network architecture family, introduced by Ian Goodfellow and his team at the University of Montreal in 2014. GANs are a class of unsupervised machine learning algorithms. As the name suggests, they are called adversarial networks because they are made up of two neural networks, each assigned a different job role and contesting with the other.

  • The first neural network is called the Generator, because it generates new data instances.
  • The other neural network is called the Discriminator; it evaluates the Generator’s work for authenticity.

The cycle continues until the results reach accuracy or near perfection. Still confused? That’s ok; read the post on “Generative Adversarial Networks” and you will find more details and understanding. A high-level sketch of the two-network setup follows below.
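Here is a minimal sketch of that Generator-versus-Discriminator cycle on a toy 1-D problem. It assumes PyTorch (not mentioned in the post), and the network sizes, the target distribution, and the training settings are all illustrative choices.

```python
import torch
import torch.nn as nn

# Toy task: the Generator learns to mimic samples from a Gaussian N(4, 1.25)
def real_samples(n):
    return torch.randn(n, 1) * 1.25 + 4.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(),
                              nn.Linear(16, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(2000):
    # Train the Discriminator: real data should score 1, generated data 0
    real = real_samples(64)
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake), torch.zeros(64, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the Generator: try to make the Discriminator output 1 on fakes
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

print(generator(torch.randn(1000, 8)).mean().item())  # should approach 4.0
```

The Discriminator is pushed to label real samples as authentic and generated samples as fake, while the Generator is pushed to fool the Discriminator, and the two keep contesting until the generated samples look authentic.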

 

Recursive Neural Networks

Recursive Neural Networks – Think of them as deep, tree-like structures. When the need is to parse a whole sentence, we use a recursive neural network. The tree-like topology allows branching connections and a hierarchical structure. The argument here can be: how are recursive neural networks different from recurrent neural networks?

Answer – To respond in one line, we can say that recurrent neural networks are in fact recursive neural networks with a particular structure: that of a linear chain. A small sketch of tree-shaped composition follows below.
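Here is a minimal sketch of that tree-shaped composition: the same weight matrix is applied at every node of a tiny parse tree to combine the vectors of its children. The word embeddings, the vector size, and the example tree are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4                                            # illustrative embedding size
W = rng.normal(scale=0.1, size=(DIM, 2 * DIM))     # shared composition weights
b = np.zeros(DIM)
embeddings = {w: rng.normal(size=DIM) for w in ["the", "cat", "sat"]}

def encode(node):
    # A leaf is a word; an internal node is a (left, right) pair of subtrees
    if isinstance(node, str):
        return embeddings[node]
    left, right = node
    children = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ children + b)               # same weights reused at every node

# Parse tree for "(the cat) sat": composition follows the tree, not a chain
sentence_vector = encode((("the", "cat"), "sat"))
print(sentence_vector)
```

If the tree degenerates into a chain that always combines the previous result with the next word, this same recursion becomes an ordinary recurrent network, which is exactly the one-line answer above.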

 

Books Referred & Other material referred

  • Open Internet reading and research work
  • Hands-on lab work by AILabPage members (a group of self-taught engineers)

Points to Note:

When to use artificial neural networks as opposed to traditional machine learning algorithms is a complex question to answer. It depends entirely on the problem at hand. One needs to be patient and experienced enough to arrive at the correct answer.

All credits, if any, remain with the original contributors. In the next post we will talk about Recurrent Neural Networks in detail.

 

Feedback & Further Questions

Do you have any questions about Deep Learning or Machine Learning? Leave a comment or ask your question via email. I will try my best to answer it.

 

Conclusion – The one requirement for any effective machine learning model is reliable data pipelines. We have seen in the post above that ANNs don’t create or invent any new information or facts; they help us make sense of what is already in front of us, hidden in our data. Deep learning, in short, goes much beyond machine learning and its supervised or unsupervised algorithms: it uses many layers of nonlinear processing units for feature extraction and transformation. The structure of ANNs is what enables artificial intelligence, machine learning, and supercomputing to flourish. Neural networks power language translation, face recognition, picture captioning, text summarisation, and a lot more.

 

============================ About the Author =======================

Read about Author at : About Me

Thank you all for spending your time reading this post. Please share your opinions, comments, criticism, agreements, or disagreements. For more details about posts, subjects, and relevance, please read the disclaimer.

FacebookPage    ContactMe      Twitter

====================================================================
