Historical notes

  1. 1943 --- McCulloch and Pitts (start of the modern era of neural networks). 

    Logical calculus of neural networks: a network consisting of a sufficient number of neurons (each a simple threshold model) with properly set synaptic connections can, in principle, compute any computable function.
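    As a toy illustration (not part of the original notes), a McCulloch-Pitts unit is a binary threshold gate: with suitable weights and thresholds it realizes elementary logic functions, from which more complex computations can be composed. A minimal Python sketch, with illustrative weight choices:

      # A McCulloch-Pitts neuron: outputs 1 iff the weighted sum of its
      # binary inputs reaches the threshold.
      def mp_neuron(inputs, weights, threshold):
          total = sum(w * x for w, x in zip(weights, inputs))
          return 1 if total >= threshold else 0

      # AND gate: both inputs must be active.
      assert mp_neuron([1, 1], weights=[1, 1], threshold=2) == 1
      assert mp_neuron([1, 0], weights=[1, 1], threshold=2) == 0

      # OR gate: any active input suffices.
      assert mp_neuron([0, 1], weights=[1, 1], threshold=1) == 1

      # NOT gate: a single inhibitory (negative) weight.
      assert mp_neuron([1], weights=[-1], threshold=0) == 0
      assert mp_neuron([0], weights=[-1], threshold=0) == 1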

     

  2. 1949 --- Hebb's book "The organization of behavior". 

    An explicit statement of a physiological learning rule for synaptic modification was presented for the first time. 

    Hebb proposed that the connectivity of the brain is continually changing as an organism learns different functional tasks, and that neural assemblies are created by such changes.

    Hebb's work was immensely influential among psychologists.
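    For concreteness (a common textbook rendering, not Hebb's original wording), the rule is often written as delta_w = eta * x * y: a synapse grows stronger when presynaptic and postsynaptic activity coincide. A minimal sketch, with an illustrative learning rate:

      # Hebbian update: strengthen a synapse when presynaptic input x
      # and postsynaptic output y are active together (delta_w = eta * x * y).
      def hebb_update(w, x, y, eta=0.1):     # eta is an illustrative choice
          return [wi + eta * xi * y for wi, xi in zip(w, x)]

      w = [0.0, 0.0]
      w = hebb_update(w, x=[1, 0], y=1)      # only the co-active synapse grows
      print(w)                               # [0.1, 0.0]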

     

  3. 1958 --- Rosenblatt introduced the perceptron

    A novel method of supervised learning.

    Perceptron convergence theorem.

    Least mean-square (LMS) algorithm (due to Widrow and Hoff, 1960)
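    A minimal sketch of the perceptron's error-correction learning (the variable names, learning rate, and toy data are illustrative choices): weights are updated only on misclassified examples, and the convergence theorem guarantees that the loop terminates whenever the data are linearly separable.

      # Perceptron learning: sweep over the training set, updating weights
      # only on mistakes (w <- w + eta * target * x, targets in {-1, +1}).
      def train_perceptron(samples, n_features, eta=1.0, max_epochs=100):
          w = [0.0] * (n_features + 1)       # last entry acts as the bias
          for _ in range(max_epochs):
              mistakes = 0
              for x, target in samples:
                  xb = list(x) + [1.0]       # append constant bias input
                  pred = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else -1
                  if pred != target:         # error-correction step
                      w = [wi + eta * target * xi for wi, xi in zip(w, xb)]
                      mistakes += 1
              if mistakes == 0:              # converged (guaranteed if separable)
                  break
          return w

      # Linearly separable toy data: the logical OR function.
      data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
      print(train_perceptron(data, n_features=2))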

     

  4. 1969 --- Minsky and Papert showed limits on perceptron computation.

    Minsky and Papert showed that there are fundamental limits on what single-layer perceptrons can compute; for example, a single-layer perceptron cannot realize the XOR function, because its two classes are not linearly separable.

    They speculated that the same limits would apply to multilayer versions as well, a conjecture that was later shown to be unfounded.

     

  5. 1982 --- Hopfield networks

    Hopfield showed how an "Ising spin glass" type of model can be used to store information in dynamically stable networks.

    His work paved the way for physicists to enter neural modeling, thereby transforming the field of neural networks.
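    A simplified sketch of the idea (all parameter choices illustrative): bipolar patterns are stored in a symmetric weight matrix by a Hebbian outer-product rule, and asynchronous threshold updates then descend an Ising-like energy function until the state settles into a stored pattern.

      # Hopfield network sketch: store bipolar (+1/-1) patterns with the
      # Hebbian outer-product rule, then recall by asynchronous updates.
      def store(patterns, n):
          w = [[0.0] * n for _ in range(n)]
          for p in patterns:
              for i in range(n):
                  for j in range(n):
                      if i != j:             # no self-connections
                          w[i][j] += p[i] * p[j] / n
          return w

      def recall(w, state, steps=100):
          n, s = len(state), list(state)
          for t in range(steps):
              i = t % n                      # simple asynchronous sweep
              h = sum(w[i][j] * s[j] for j in range(n))
              s[i] = 1 if h >= 0 else -1     # an update never raises the energy
          return s

      stored = [1, -1, 1, -1, 1, -1]
      w = store([stored], n=6)
      noisy = [1, 1, 1, -1, 1, -1]           # one bit corrupted
      print(recall(w, noisy))                # settles back to the stored pattern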

     

  6. 1982 --- Kohonen's self-organizing maps (SOM)

    Kohonen's self-organizing map (SOM) is capable of reproducing important aspects of the structure of biological neural nets: data representation using topographic maps, which are common in the nervous system. The SOM also has a wide range of applications.

    SOM shows how the output layer can pick up the correlational structure of the inputs in the form of the spatial arrangement of its units.
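    A one-dimensional sketch of the SOM update (grid size, rates, and decay schedules are illustrative): for each input, the best-matching unit and its neighbours on the map are pulled toward the input, so that neighbouring units come to respond to similar inputs, forming a topographic map.

      import math, random

      # One-dimensional SOM sketch: a line of 10 units learns to cover the
      # interval [0, 1) in spatial order, i.e. it forms a topographic map.
      random.seed(0)
      n_units = 10
      w = [random.random() for _ in range(n_units)]  # one scalar weight per unit

      for t in range(2000):
          x = random.random()                        # draw an input sample
          bmu = min(range(n_units), key=lambda i: abs(w[i] - x))  # best match
          eta = 0.5 * math.exp(-t / 1000)            # decaying learning rate
          sigma = 3.0 * math.exp(-t / 1000)          # shrinking neighbourhood
          for i in range(n_units):
              h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))  # kernel
              w[i] += eta * h * (x - w[i])           # pull unit toward the input

      print([round(v, 2) for v in w])                # should come out ordered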

  7. 1985 --- Ackley, Hinton, and Sejnowski developed the Boltzmann machine, the first successful realization of a multilayer neural network.

     

  8. 1986 --- Rumelhart, Hinton, and Williams developed the back-propagation algorithm, the most popular learning algorithm for training multilayer perceptrons. It has been the workhorse for many neural network applications.
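
    A minimal sketch of back-propagation (network size, learning rate, and data are illustrative): error gradients are passed backwards through the layers by the chain rule, which lets a multilayer perceptron learn the XOR function that defeated the single-layer perceptron.

      import math, random

      # A small sigmoid network (2 inputs, 3 hidden units, 1 output) trained
      # by back-propagation on XOR, the task that defeats a single-layer
      # perceptron. Sizes, rates, and the seed are illustrative choices.
      random.seed(1)
      sig = lambda z: 1.0 / (1.0 + math.exp(-z))
      W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]  # hidden (last weight = bias)
      W2 = [random.uniform(-1, 1) for _ in range(4)]                      # output (last weight = bias)

      def forward(x):
          h = [sig(sum(w * v for w, v in zip(row, x + [1.0]))) for row in W1]
          return h, sig(sum(w * v for w, v in zip(W2, h + [1.0])))

      data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]
      for _ in range(10000):                 # online training over all patterns
          for x, t in data:
              h, y = forward(x)
              dy = (y - t) * y * (1 - y)     # output delta via the chain rule
              dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(3)]
              W2 = [w - 0.5 * dy * v for w, v in zip(W2, h + [1.0])]
              W1 = [[w - 0.5 * dh[j] * v for w, v in zip(W1[j], x + [1.0])]
                    for j in range(3)]

      for x, t in data:
          print(x, round(forward(x)[1], 2))  # outputs should approach 0, 1, 1, 0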