Neural Networks are useful in several industrial applications

Artificial Neural Networks (ANN) are algorithmic constructs that enable machines to learn to perform tasks, without task-specific programming, by analyzing hand-labeled training examples. ANN are parallel computational models with densely interconnected adaptive processing units: many simple processors running in parallel, modeling nonlinear static or dynamic systems in which complex relationships exist between inputs and outputs.

ANN have two key features. The first is their Adaptive Nature, in which "learning by example" (automatically adjusting the system's parameters until it generates the correct output for a given input) replaces "programming" as the way problems are solved. The second is their Intrinsic Parallel Architecture, which allows fast computation of solutions when an ANN is implemented on parallel digital computers or in customized hardware.
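
To make "learning by example" concrete, here is a minimal sketch in Python of a system adjusting a single parameter from labeled examples; the model y = w * x, the data, and the learning rate are illustrative assumptions, not part of any particular ANN.

```python
# A minimal "learning by example" sketch: the parameter w is adjusted
# automatically until the system produces the correct output for each
# input. The model (y = w * x), data, and learning rate are illustrative.
examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, hand-labeled output)

w = 0.0      # the adjustable parameter, starting from an arbitrary guess
lr = 0.05    # learning rate: how far each correction moves w

for _ in range(200):
    for x, target in examples:
        error = (w * x) - target  # how wrong the current output is
        w -= lr * error * x       # nudge w to shrink the error

print(w)  # approaches 2.0, learned from the examples rather than programmed
```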

Warren McCulloch, a researcher at the Illinois Neuropsychiatric Institute of the University of Illinois College of Medicine, and Walter Pitts pioneered in 1943 a computational model for Artificial Neural Networks based on mathematics and algorithms. Their paper, "A Logical Calculus of the Ideas Immanent in Nervous Activity," split neural network research into two approaches: the study of biological processes in the brain, and the application of neural networks to Artificial Intelligence.

Mimicking a biological brain, an ANN can be mapped as thousands of "artificial neurons" stacked sequentially in rows ("layers") and joined by millions of connections (akin to axons and synapses). Neurons transmit signals (usually real numbers between 0 and 1) across these connections; the receiving (postsynaptic) neuron processes the signal and prompts the downstream neurons connected to it. Signals travel from the first (input) layer to the last (output) layer, possibly after traversing the layers multiple times.

Each node assigns a number (a "weight") to each of its incoming connections. When the network is active, a node receives a different signal over each of its connections and multiplies each signal by the corresponding weight. The weights change as learning proceeds, which grows or shrinks the strength of the signal a neuron sends downstream; a threshold applied to the aggregated signal decides whether the neuron fires.
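
As a minimal sketch of the computation just described (all numbers here are illustrative), a single artificial neuron multiplies each incoming signal by that connection's weight, sums the results, and fires only if the aggregate exceeds a threshold:

```python
# One artificial neuron: weighted sum of incoming signals plus a bias,
# compared against a threshold to decide whether the neuron "fires".

def neuron(inputs, weights, bias, threshold=0.0):
    aggregate = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 if aggregate > threshold else 0.0

signals = [0.9, 0.2, 0.7]    # signals arriving on three incoming connections
weights = [0.5, -0.8, 0.3]   # per-connection weights, adjusted during learning
print(neuron(signals, weights, bias=-0.1))  # 1.0: the aggregate (0.4) exceeds 0.0
```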

The brain, its complexity, and neuroscience have inspired ANN, but ANN are not biological, neural, or cognitive models. Many Artificial Neural Networks are closer to statistics (nonparametric pattern classifiers, regression models, clustering algorithms, or nonlinear filters) than to neurobiological models.

Mathematical models can describe an Artificial Neural Network by associating a learning rule with it and defining a function and its class, where members of the class are variations of the parameters, the connection weights, or attributes of the architecture (the number of neurons or their connectivity). An ANN's function is defined as a composition of functions, each of which can itself be decomposed into other functions.
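
Here is a minimal sketch of this composition-of-functions view, assuming NumPy and illustrative layer sizes: each layer is a function x -> activation(W @ x + b), and the whole network is their composition.

```python
import numpy as np

def layer(W, b, activation):
    """Build one layer as a function x -> activation(W @ x + b)."""
    return lambda x: activation(W @ x + b)

relu = lambda z: np.maximum(z, 0.0)

rng = np.random.default_rng(0)           # illustrative, untrained weights
f1 = layer(rng.normal(size=(4, 3)), np.zeros(4), relu)         # 3 inputs -> 4 hidden
f2 = layer(rng.normal(size=(2, 4)), np.zeros(2), lambda z: z)  # 4 hidden -> 2 outputs

f = lambda x: f2(f1(x))                  # the network is the composition of f2 and f1
print(f(np.array([0.5, -1.0, 2.0])))
```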

When modeling ANN as functions, several types of architecture arise:
- Feedforward Neural Networks (FNN) move information in one direction only, with no cycles or loops in the network. Signals travel from input nodes to output nodes, passing through hidden nodes if any are present.

- Recurrent Neural Networks (RNN) move information in a bi-directional flow: the network contains a directed cycle, which gives it temporal dynamic behavior.

- Modular Neural Networks (MNN) are formed by independent, non-interacting neural networks; each sub-network works on its own inputs and accomplishes part of the complete task. An intermediary moderates the MNN: it accepts the outputs of those independent networks, processes them, and produces the final output.

- Physical Neural Networks (PNN) emphasize dependence on physical hardware that emulates the neurons, as opposed to software alone simulating the ANN.

- Radial Basis Function (RBF) networks are highly intuitive ANN, usually used to interpolate in a multidimensional space. Each neuron in an RBF network stores one example from the training set as a "prototype" (a minimal sketch appears after this list).

- Kohonen self-organizing maps operate on unlabeled data to uncover hidden structure. They apply competitive learning to a set of input data, instead of the standard error-correction learning used by other neural networks, and are well suited to producing low-dimensional views of high-dimensional data.
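
As promised in the RBF item above, here is a minimal sketch of a radial-basis-function network, assuming Gaussian basis functions; each neuron's "prototype" is simply one stored training input, and all names and numbers are illustrative.

```python
import numpy as np

def rbf_predict(x, prototypes, weights, gamma=1.0):
    """Weighted sum of Gaussian responses centered on the stored prototypes."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return np.exp(-gamma * dists ** 2) @ weights

# Toy 1-D interpolation: each training input becomes one neuron's prototype.
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 4.0])

# Solve for output weights so the network passes exactly through the data.
Phi = np.exp(-(np.linalg.norm(X[:, None] - X[None, :], axis=2) ** 2))
w = np.linalg.solve(Phi, y)

print(rbf_predict(np.array([1.5]), X, w))  # interpolated value between samples
```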


The adaptive nature of Artificial Neural Networks makes them useful in application domains where training data is available but there is little or incomplete understanding of the problem to be solved.

Artificial Neural Networks are useful in several industrial applications (computer vision & image filtering, gaming, speech recognition & machine translation, medical diagnosis, and others) for a variety of tasks: pattern classification & image compression, speech synthesis & recognition, human/machine interfaces, function approximation, nonlinear system modeling, or forecasting & prediction.
