What is an Artificial Neural Network?
An Artificial Neural Network (ANN) is an information processing paradigm inspired by biological nervous systems. It is composed of a large number of highly interconnected processing elements called neurons. An ANN is configured for a specific application, such as pattern recognition or data classification. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze.
Why use Neural Networks?
- They have the ability to derive meaning from complicated or imprecise data.
- They can extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques.
- Adaptive learning.
- Real-time operation.
- Robust and fault tolerant.
 
Types of Artificial Neural Networks
There are two Artificial Neural Network topologies: FeedForward and FeedBack.
FeedForward ANN
The information flow is unidirectional: a unit sends information to other units from which it does not receive any information, so there are no feedback loops. FeedForward ANNs have fixed inputs and outputs and are used in pattern generation, recognition, and classification.
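The one-way flow described above can be sketched as a tiny two-layer network: the input passes through a hidden layer to an output neuron, and nothing ever flows backwards. All weights and biases here are arbitrary example values, not trained ones.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each row of `weights` feeds one neuron in this layer.
    return [sigmoid(sum(i * w for i, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

def feedforward(x):
    # Information moves strictly forward: input -> hidden -> output.
    hidden = layer(x, weights=[[0.5, -0.3], [0.8, 0.2]], biases=[0.1, -0.1])
    out = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
    return out[0]

print(feedforward([1.0, 0.0]))
```

Because each layer depends only on the layer before it, the output can be computed in a single forward pass, which is what makes this topology simple and fast at prediction time.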
FeedBack ANN
Here, feedback loops are allowed: a unit's output can influence its own future input. Such networks are used in content-addressable memories.
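A minimal sketch of how a feedback network can act as a content-addressable memory, in the style of a Hopfield network: it stores one binary pattern via Hebbian weights and, starting from a corrupted cue, repeatedly feeds its own output back into itself until the state settles on the stored pattern. The pattern and the one-bit corruption below are illustrative choices.

```python
def train(pattern):
    n = len(pattern)
    # Hebbian weights: w[i][j] = p[i] * p[j], with no self-connections.
    return [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
            for i in range(n)]

def recall(weights, state, steps=5):
    n = len(state)
    state = list(state)
    for _ in range(steps):
        for i in range(n):
            # Each unit's next value depends on the other units'
            # current values -- this is the feedback loop.
            total = sum(weights[i][j] * state[j] for j in range(n))
            state[i] = 1 if total >= 0 else -1
    return state

stored = [1, -1, 1, -1]
w = train(stored)
noisy = [1, 1, 1, -1]   # the stored pattern with one bit flipped
print(recall(w, noisy))  # prints [1, -1, 1, -1]
```

This is "content-addressable" because the memory is retrieved from a partial or noisy version of its own content rather than from an address.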
Advantages:
·     A neural network can perform tasks that a linear program cannot.
·     When an element of the neural network fails, the network can continue operating without any problem thanks to its parallel nature.
·     A neural network does not need to be reprogrammed; it learns by itself.
·     Once trained, it is straightforward to deploy.
·     As adaptive, intelligent systems, neural networks are robust and excel at solving complex problems, and their advantages are generally considered to outweigh their drawbacks.
·     It can be applied to a wide range of applications.
Disadvantages:
·     A neural network requires training before it can operate.
·     Large neural networks require high processing time.
·     The architecture of a neural network differs from that of conventional microprocessors, so it has to be emulated.