Citation
Kay JW & Phillips W (1997) Activation functions, computational goals, and learning rules for local processors with contextual guidance. Neural Computation, 9 (4), pp. 895-910. https://doi.org/10.1162/neco.1997.9.4.895
Abstract
Information about context can enable local processors to discover latent variables that are relevant to the context within which they occur, and it can also guide short-term processing. For example, Becker and Hinton (1992) have shown how context can guide learning, and Hummel and Biederman (1992) have shown how it can guide processing in a large neural net for object recognition. This article studies the basic capabilities of a local processor with two distinct classes of inputs: receptive field inputs that provide the primary drive and contextual inputs that modulate their effects. Contextual predictions must guide processing without being confused with receptive field inputs, so the processor's transfer function must distinguish these two roles. Given these two classes of input, the information in the output can be decomposed into four disjoint components, providing a space of possible goals in which the unsupervised learning of Linsker (1988) and the internally supervised learning of Becker and Hinton (1992) are special cases. Learning rules are derived from an information-theoretic objective function, and simulations show that a local processor trained with these rules and using an appropriate activation function has the elementary properties required.
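As a sketch of the decomposition the abstract refers to (the symbols below are assumptions for illustration, not notation quoted from the paper): writing Y for the processor's output, R for its receptive field input, and C for its contextual input, the output entropy splits into three mutual information terms plus a residual:

```latex
% Four disjoint components of the output entropy (assumed notation):
% a three-way shared term, two conditional terms, and a residual.
H(Y) = I(Y;R;C) + I(Y;R \mid C) + I(Y;C \mid R) + H(Y \mid R,C)
```

One way to read the abstract's claim about special cases: Linsker's Infomax maximizes the total information the output carries about the receptive field, I(Y;R) = I(Y;R;C) + I(Y;R|C), while Becker and Hinton's objective emphasizes information shared with the contextual stream; weighting the four components differently then spans a family of computational goals.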
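And a minimal runnable sketch of an activation function with the division of labor the abstract describes, where receptive field input provides the primary drive and context only modulates it. The specific form A(r, c) = ½ r (1 + exp(rc)), and every name below, are assumptions chosen for illustration, not code taken from the paper:

```python
import numpy as np

def modulatory_activation(r: np.ndarray, c: np.ndarray) -> np.ndarray:
    """Contextually modulated activation (assumed form, not quoted from the paper).

    r: integrated receptive field input (primary drive)
    c: integrated contextual input (modulation only)

    Properties this form is chosen to illustrate:
    - zero context leaves the drive unchanged: A(r, 0) = r
    - zero drive gives zero activation regardless of context: A(0, c) = 0
    - context amplifies when it agrees with the drive (r*c > 0) and
      attenuates when it disagrees, but can never flip the drive's sign.
    """
    return 0.5 * r * (1.0 + np.exp(r * c))

def output_probability(r, c):
    """Probability of a binary output unit firing: logistic squash of A(r, c)."""
    a = modulatory_activation(np.asarray(r, dtype=float), np.asarray(c, dtype=float))
    return 1.0 / (1.0 + np.exp(-a))

if __name__ == "__main__":
    # Context agreeing with the drive sharpens the output...
    print(output_probability(1.0, 2.0))   # > sigmoid(1.0), drive amplified
    # ...while context alone cannot create a response.
    print(output_probability(0.0, 5.0))   # = 0.5: no drive, maximal uncertainty
```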
Journal
Neural Computation: Volume 9, Issue 4
| Status | Published |
|---|---|
| Publication date | 15/05/1997 |
| URL | http://hdl.handle.net/1893/24332 |
| Publisher | MIT Press |
| ISSN | 0899-7667 |
People (1)
William Phillips, Emeritus Professor, Psychology