Catalogue of Artificial Intelligence Techniques




Stochastic Neural Networks

Aliases: Boltzmann Machine

Keywords: boltzmann, neural network, stochastic

Categories: Neural Networks

Author(s): Iain Whiteside


Stochastic Neural Networks are a variation of Hopfield Networks that introduce some 'noise', or randomisation, into the dynamics of the network. This noise allows the network to escape the local minima in which a Hopfield Network tends to get trapped: a Stochastic Neural Network can move to a seemingly less productive state in order to reach the global minimum state, which is the goal. A popular type of Stochastic Neural Network is the Boltzmann machine, developed in the 1980s by Hinton and Sejnowski.

A Stochastic Neural Network works by attaching stochastic weights to each edge, so that there is a specific non-zero probability of moving from one node (state) to another. These weights define a probability distribution over the edges in which no move ever has zero probability; as a result, summing over all nodes, the network reaches the global minimum with probability 1 in the limit of infinitely many updates.
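The stochastic dynamics described above can be sketched in code. The following is a minimal illustration, not taken from the original entry: it assumes a Boltzmann-machine-style network of binary units with symmetric weights, where each unit switches on with probability given by a sigmoid of its energy gap, scaled by a temperature parameter. The function name `stochastic_update` and all parameters are hypothetical choices for this sketch.

```python
import math
import random

def stochastic_update(state, weights, biases, temperature):
    """One sweep of stochastic updates over all binary units.

    Hypothetical sketch of Boltzmann-machine dynamics: each unit turns
    on with probability sigmoid(energy_gap / temperature), so every
    transition has non-zero probability and the network can escape
    local minima of the energy function.
    """
    n = len(state)
    for i in range(n):
        # Energy gap for unit i between its 'on' and 'off' states,
        # assuming symmetric weights and no self-connections.
        gap = biases[i] + sum(weights[i][j] * state[j]
                              for j in range(n) if j != i)
        p_on = 1.0 / (1.0 + math.exp(-gap / temperature))
        # Higher temperature -> more randomness; lower temperature ->
        # behaviour closer to a deterministic Hopfield update.
        state[i] = 1 if random.random() < p_on else 0
    return state
```

At high temperature the updates are nearly random, letting the network jump out of shallow minima; as the temperature is lowered (as in simulated annealing) the dynamics settle towards low-energy states.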

Stochastic Neural Networks are of more use for theoretical study than for practical applications: for a shallow global minimum, the algorithm may visit every node before converging, making it inefficient and giving little benefit over the Hopfield model. Nevertheless, neurons in the human brain are themselves stochastic systems, making Stochastic Neural Networks an interesting topic for the study of memory emergence, and any accurate model of the human brain must take this stochasticity into account.


