Hebbian Learning
Neural network learning rule based on the principle that synapses between neurons are strengthened when the neurons activate simultaneously.
Hebbian learning, often summarized as "cells that fire together, wire together," is a fundamental principle in neuroscience and artificial intelligence for understanding how synaptic plasticity contributes to learning and memory. The rule posits that the synaptic efficacy between two neurons increases when activation of the presynaptic neuron repeatedly and persistently contributes to activation of the postsynaptic neuron. In artificial neural networks, the same idea is applied in an unsupervised manner: each connection weight is adjusted in proportion to the product of the presynaptic and postsynaptic activities, which forms the basis for associative learning and pattern recognition. The mechanism also helps explain the development of feature detectors in the brain, where neurons become selective for specific stimuli through repeated exposure and reinforcement of particular neural pathways.
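As a minimal sketch of the simplest form of the rule, delta_w = eta * x * y (where x is the presynaptic activity, y the postsynaptic activity, and eta a learning rate), the following NumPy snippet shows how repeated presentation of a pattern strengthens the corresponding weights. The function name hebbian_update, the learning rate of 0.01, and the example pattern are illustrative choices, not part of any standard library.

```python
import numpy as np

def hebbian_update(weights, x, learning_rate=0.01):
    """One plain Hebbian update for a single linear neuron: delta_w = eta * x * y.

    weights : 1-D weight vector of the neuron
    x       : presynaptic input activity vector
    Returns the updated weights and the postsynaptic activity y.
    """
    y = np.dot(weights, x)                      # postsynaptic activation
    weights = weights + learning_rate * x * y   # strengthen co-active connections
    return weights, y

# Example: repeatedly presenting one pattern grows the weights on the
# inputs that co-activate with the neuron's output.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=4)
pattern = np.array([1.0, 0.0, 1.0, 0.0])
for _ in range(10):
    w, y = hebbian_update(w, pattern)
print(w)
```

In this raw form the weights grow without bound under repeated stimulation, so practical variants add decay or normalization terms; Oja's rule is a well-known example of such a stabilized Hebbian update.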
Hebbian learning was introduced by the Canadian psychologist Donald Hebb in his 1949 book "The Organization of Behavior." The idea gained renewed attention in the subsequent decades, particularly during the 1980s and 1990s, when it became a cornerstone of neural network models and computational neuroscience.
Donald Hebb is the figure most closely associated with Hebbian learning, having articulated the rule in his seminal work. Later connectionist research in the 1980s, including work by David Rumelhart, Geoffrey Hinton, and colleagues on parallel distributed processing, renewed interest in neural network learning rules and helped situate Hebbian principles within modern machine learning and deep learning.