Learning selectivity and invariance through spatiotemporal Hebbian plasticity in a hierarchical neural network
Abstract: When an object moves smoothly across the field of view, the identity of the object is unchanged, but the activation pattern of the photoreceptors on the retina changes drastically. One of the major computational roles of the visual system is to maintain selectivity for different objects together with tolerance to identity-preserving transformations such as translations or rotations. This study demonstrates that a hierarchical neural network, whose synaptic connectivity is learned competitively with Hebbian plasticity operating within a local spatiotemporal pooling range, can gradually achieve feature selectivity and transformation tolerance, so that the top-level neurons carry higher mutual information about object categories than those of a single-level neural network. Furthermore, when a genetic algorithm is applied, in conjunction with the associative learning algorithm, to search for a network architecture that maximizes transformation-invariant object recognition performance, deep networks are found to outperform shallower ones.