Entropy learning and relevance criteria for neural network pruning


Bibliographic Details
Main Authors: Ng, Geok See; Abdul Rahman, Abdul Wahab; Shi, Daming
Format: Article
Language: English
Published: World Scientific Publishing Company, 2003
Subjects:
Online Access: http://irep.iium.edu.my/38198/
http://irep.iium.edu.my/38198/1/Entropy_learning_and_relevance_criteria_for_neural_network_pruning.pdf
Description
Summary: In this paper, an entropy term is used in the learning phase of a neural network. As learning progresses, more hidden nodes become saturated. The early creation of such hidden nodes may impair generalisation. Hence, an entropy approach is proposed to dampen the early creation of such nodes through a new computation called the entropy cycle. Entropy learning also increases the importance of relevant nodes while dampening the less important ones. At the end of learning, the less important nodes can then be pruned to reduce the memory requirements of the neural network.
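
As a rough illustration of the idea described in the summary, the sketch below trains a small sigmoid network with an entropy-style penalty that discourages hidden activations from saturating near 0 or 1, and afterwards prunes hidden nodes with low relevance. The penalty term, the relevance score, and all numerical settings are illustrative assumptions; they are not the paper's exact entropy cycle or relevance criteria.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR-like binary target on 2-D inputs (illustrative only).
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

n_hidden = 16
W1 = rng.normal(0.0, 0.5, size=(2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
lr, lam = 0.5, 1e-2          # lam weights the assumed entropy penalty
N = X.shape[0]

for epoch in range(3000):
    H = sigmoid(X @ W1 + b1)          # hidden activations in (0, 1)
    out = sigmoid(H @ W2 + b2)

    # Backprop for mean-squared error at the output layer.
    delta2 = 2.0 * (out - y) / N * out * (1.0 - out)

    # Gradient of a penalty that maximises the Bernoulli entropy of each
    # hidden activation, pushing it away from the saturated extremes 0 and 1.
    Hc = np.clip(H, 1e-7, 1.0 - 1e-7)
    ent_grad = lam / (N * n_hidden) * np.log(Hc / (1.0 - Hc))
    delta1 = (delta2 @ W2.T + ent_grad) * H * (1.0 - H)

    W2 -= lr * (H.T @ delta2); b2 -= lr * delta2.sum(axis=0)
    W1 -= lr * (X.T @ delta1); b1 -= lr * delta1.sum(axis=0)

# A simple relevance score per hidden node: the average magnitude of its
# contribution to the output.  The least relevant half is pruned.
H = sigmoid(X @ W1 + b1)
relevance = np.mean(np.abs(H * W2.T), axis=0)            # shape (n_hidden,)
keep = relevance >= np.sort(relevance)[n_hidden // 2]

W1, b1, W2 = W1[:, keep], b1[keep], W2[keep]
pruned_out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
acc = np.mean((pruned_out > 0.5) == (y > 0.5))
print(f"kept {keep.sum()} of {n_hidden} hidden nodes, accuracy = {acc:.2f}")
```

Pruning columns of W1 and rows of W2 removes the corresponding hidden nodes entirely, which is where the memory saving mentioned in the summary comes from; the relevance measure used here is only one plausible choice for ranking nodes.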