MURAL - Maynooth University Research Archive Library

    Chaitin-Kolmogorov Complexity and Generalization in Neural Networks

    Pearlmutter, Barak A. and Rosenfeld, Ronald (1991) Chaitin-Kolmogorov Complexity and Generalization in Neural Networks. Advances in Neural Information Processing Systems. pp. 925-931. ISSN 1049-5258



    Abstract

    We present a unified framework for a number of different ways of failing to generalize properly. During learning, sources of random information contaminate the network, effectively augmenting the training data with noise. The complexity of the function computed by the network is therefore increased, and generalization is degraded. We analyze replicated networks, in which a number of identical networks are trained independently on the same data and their outputs averaged. We conclude that replication almost always decreases the expected complexity of the network, and that replication therefore increases expected generalization. Simulations confirming the effect are also presented.
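    The replication scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's actual experimental setup: the toy regression task, network size, and training hyperparameters are all assumptions chosen for brevity. It trains several identical small networks from independent random initializations on the same data and averages their predictions; for squared error, the variance decomposition guarantees the averaged predictor is never worse than the mean of the individual errors.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy task: noisy samples of sin(x).
    X_train = rng.uniform(-3, 3, size=(30, 1))
    y_train = np.sin(X_train) + 0.1 * rng.normal(size=X_train.shape)
    X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
    y_test = np.sin(X_test)

    def train_net(seed, hidden=16, lr=0.05, steps=2000):
        """One small tanh network, trained by full-batch gradient descent."""
        r = np.random.default_rng(seed)
        W1 = r.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
        W2 = r.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
        n = len(X_train)
        for _ in range(steps):
            h = np.tanh(X_train @ W1 + b1)          # hidden activations
            err = h @ W2 + b2 - y_train             # prediction residuals
            gW2 = h.T @ err / n; gb2 = err.mean(0)  # output-layer gradients
            dh = (err @ W2.T) * (1 - h ** 2)        # backprop through tanh
            gW1 = X_train.T @ dh / n; gb1 = dh.mean(0)
            W1 -= lr * gW1; b1 -= lr * gb1
            W2 -= lr * gW2; b2 -= lr * gb2
        return lambda X: np.tanh(X @ W1 + b1) @ W2 + b2

    # Replication: K identical networks, independent initializations, same data.
    K = 5
    nets = [train_net(seed) for seed in range(K)]
    preds = np.stack([f(X_test) for f in nets])                 # (K, 200, 1)
    indiv_mse = ((preds - y_test) ** 2).mean(axis=(1, 2))       # per-network error
    ens_mse = ((preds.mean(axis=0) - y_test) ** 2).mean()       # averaged network

    # Pointwise, mean_i (f_i - y)^2 = (f_bar - y)^2 + var_i(f_i),
    # so the replicated (averaged) network cannot do worse on average.
    assert ens_mse <= indiv_mse.mean()
    ```

    The final inequality is an identity of squared error, so averaging helps exactly to the extent that the independently trained replicas disagree, which is the mechanism the abstract attributes to reduced expected complexity.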

    Item Type: Article
    Keywords: Chaitin-Kolmogorov Complexity; Neural Networks
    Academic Unit: Faculty of Science and Engineering > Computer Science
    Faculty of Science and Engineering > Research Institutes > Hamilton Institute
    Item ID: 5536
    Depositing User: Barak Pearlmutter
    Date Deposited: 04 Nov 2014 10:23
    Journal or Publication Title: Advances in Neural Information Processing Systems
    Publisher: Massachusetts Institute of Technology Press (MIT Press)
    Refereed: Yes
    Use Licence: This item is available under a Creative Commons Attribution-NonCommercial-ShareAlike licence (CC BY-NC-SA).
