Neural Networks: A Brief History - «Deep Learning for Coders with Fastai and PyTorch» (2020)

Warren McCulloch and Walter Pitts showed that a simplified model of a real neuron could be represented using simple addition and thresholding.
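
As a rough illustration (not the authors' code), such a neuron fits in a few lines of Python; the function name and the particular weights and threshold below are illustrative choices:

```python
# A minimal sketch of the McCulloch-Pitts idea: sum the weighted inputs,
# then compare the total against a threshold to decide whether to "fire".
def artificial_neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: with unit weights and threshold 1, the neuron acts like logical OR.
print(artificial_neuron([0, 1], [1, 1], threshold=1))  # 1
print(artificial_neuron([0, 0], [1, 1], threshold=1))  # 0
```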

Frank Rosenblatt further developed the artificial neuron to give it the ability to learn.

An MIT professor named Marvin Minsky, along with Seymour Papert, wrote a book called Perceptrons (MIT Press) about Rosenblatt's invention.
They showed that a single layer of these devices was unable to learn some simple but critical mathematical functions (such as XOR).
In the same book, they also showed that using multiple layers of the devices would allow these limitations to be addressed.
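
To make the limitation concrete: XOR is not linearly separable, so no single thresholded-sum unit can compute it, yet two layers of the same units can. Here is a hedged sketch reusing the unit from above; the decomposition of XOR into OR and NAND is one of several that work:

```python
def unit(inputs, weights, threshold):
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def xor_two_layer(x1, x2):
    # First layer: compute OR and NAND of the two inputs.
    h_or   = unit([x1, x2], [1, 1], threshold=1)
    h_nand = unit([x1, x2], [-1, -1], threshold=-1)
    # Second layer: AND of the two hidden units yields XOR.
    return unit([h_or, h_nand], [1, 1], threshold=2)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, xor_two_layer(a, b))  # prints 0, 1, 1, 0 in turn
```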

In Parallel Distributed Processing (PDP), released by MIT Press in 1986, David Rumelhart, James McClelland, and the PDP Research Group argued that traditional computer programs work very differently from brains, and that this might be why computer programs had been (at that point) so bad at doing things that brains find easy (such as recognizing objects in pictures).

In fact, the approach laid out in PDP is very similar to the approach used in today’s neural networks.
The book defined parallel distributed processing as requiring the following (a rough mapping onto modern code follows the list):

  • A set of processing units
  • A state of activation
  • An output function for each unit
  • A pattern of connectivity among units
  • A propagation rule for propagating patterns of activities through the network of connectivities
  • An activation rule for combining the inputs impinging on a unit with the current state of that unit to produce an output for the unit
  • A learning rule whereby patterns of connectivity are modified by experience
  • An environment within which the system must operate
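
These requirements read remarkably like a description of a modern deep learning system. As a loose, illustrative mapping (our sketch, not the book's; the layer sizes, loss, and learning rate are arbitrary assumptions), a minimal PyTorch model touches most of them:

```python
import torch
import torch.nn as nn

# Processing units arranged in layers; the weight matrices are the
# "pattern of connectivity", and ReLU is the activation/output rule.
model = nn.Sequential(
    nn.Linear(2, 4),
    nn.ReLU(),
    nn.Linear(4, 1),
)

x = torch.randn(16, 2)   # the "environment": data the system operates on
y = torch.randn(16, 1)

pred = model(x)          # propagation rule: activations flow layer to layer
loss = nn.functional.mse_loss(pred, y)
loss.backward()          # learning rule: gradients say how experience should
                         # modify the connectivity...

with torch.no_grad():    # ...which a plain SGD step then applies
    for p in model.parameters():
        p -= 0.01 * p.grad
        p.grad = None
```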