Hebbian Learning and Negative Feedback Networks [electronic resource] / by Colin Fyfe.

by Fyfe, Colin [author.]; SpringerLink (Online service).
Material type: Book
Series: Advanced Information and Knowledge Processing
Publisher: London : Springer London, 2005.
Description: XVIII, 383 p. 117 illus. online resource.
ISBN: 9781846281181.
Subject(s): Computer science | Artificial intelligence | Computer simulation | Optical pattern recognition | Computer Science | Probability and Statistics in Computer Science | Artificial Intelligence (incl. Robotics) | Pattern Recognition | Simulation and Modeling | Computer Science, general
DDC classification: 005.55
Online resources: Click here to access online
Contents:
Single Stream Networks -- Background -- The Negative Feedback Network -- Peer-Inhibitory Neurons -- Multiple Cause Data -- Exploratory Data Analysis -- Topology Preserving Maps -- Maximum Likelihood Hebbian Learning -- Dual Stream Networks -- Two Neural Networks for Canonical Correlation Analysis -- Alternative Derivations of CCA Networks -- Kernel and Nonlinear Correlations -- Exploratory Correlation Analysis -- Multicollinearity and Partial Least Squares -- Twinned Principal Curves -- The Future.
In: Springer eBooks
Summary: The central idea of Hebbian Learning and Negative Feedback Networks is that artificial neural networks using negative feedback of activation can use simple Hebbian learning to self-organise so that they uncover interesting structures in data sets. Two variants are considered. The first uses a single stream of data to self-organise; by changing the learning rules for the network, it is shown how to perform Principal Component Analysis, Exploratory Projection Pursuit, Independent Component Analysis, Factor Analysis and a variety of topology-preserving mappings for such data sets. The second variant comprises networks that self-organise on two input data streams. In their basic form, these networks are shown to perform Canonical Correlation Analysis, the statistical technique which finds those filters onto which projections of the two data streams have greatest correlation. The book encompasses a wide range of real experiments and shows how the approaches it formulates can be applied to the analysis of real problems.
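To illustrate the single-stream case described above, the following is a minimal sketch (not taken from the book) assuming the standard formulation of the negative feedback network: feedforward activation y = Wx, feedback residual e = x - Wᵀy, and the simple Hebbian update ΔW = η·y·eᵀ, which drives the weight rows towards the principal subspace of the data. All variable names, data and hyperparameters are illustrative.

import numpy as np

# Sketch of a single-stream negative feedback network with Hebbian learning.
rng = np.random.default_rng(0)

n_inputs, n_outputs = 5, 2          # input dimension and number of output neurons
eta = 0.01                          # learning rate (illustrative value)
W = rng.normal(scale=0.1, size=(n_outputs, n_inputs))

# Toy zero-mean data with most variance concentrated in the first two directions.
cov = np.diag([5.0, 3.0, 1.0, 0.5, 0.1])
X = rng.multivariate_normal(np.zeros(n_inputs), cov, size=5000)

for x in X:
    y = W @ x                       # feedforward activation
    e = x - W.T @ y                 # negative feedback: input minus reconstruction
    W += eta * np.outer(y, e)       # Hebbian update on the residual activation

# After training, the rows of W approximately span the principal subspace of the data.
print(W)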