Complexity Results on Learning by Neural Nets
Lin, Jyh-Han; Vitter, Jeffrey Scott
Abstract
We consider the computational complexity of learning by neural nets. We are interested in how hard it is to design appropriate neural net architectures and to train neural nets for general and specialized learning tasks. Our main result shows that the training problem for 2-cascade neural nets (which have only two non-input nodes, one of which is hidden) is NP-complete, which implies that finding an optimal net (in terms of the number of non-input units) that is consistent with a set of examples is also NP-complete. This result also demonstrates a surprising gap between the computational complexities of one-node (perceptron) and two-node neural net training problems, since the perceptron training problem can be solved in polynomial time by linear programming techniques. We conjecture that training a k-cascade neural net, which is a classical threshold network training problem, is also NP-complete, for each fixed k ≥ 2. We also show that the problem of finding an optimal perceptron (in terms of the number of non-zero weights) consistent with a set of training examples is NP-hard.
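
To illustrate the linear-programming route to perceptron training mentioned above, the following minimal Python sketch (not code from the paper) casts consistency with labeled examples as an LP feasibility problem, assuming SciPy's linprog solver; the margin-1 normalization, function name, and example data are illustrative assumptions.

    # Sketch: perceptron training as linear-programming feasibility.
    import numpy as np
    from scipy.optimize import linprog

    def train_perceptron_lp(X, y):
        """Find weights w and bias b with y_i * (w . x_i + b) >= 1 for every
        example, or report that no separating halfspace exists.
        X: (m, n) array of examples; y: (m,) array of labels in {-1, +1}.
        For strictly separable finite data, scaling always yields margin 1."""
        m, n = X.shape
        # Variables z = [w_1..w_n, b]. The constraint y_i * (w . x_i + b) >= 1
        # becomes -y_i * (x_i, 1) . z <= -1 in linprog's A_ub @ z <= b_ub form.
        A_ub = -y[:, None] * np.hstack([X, np.ones((m, 1))])
        b_ub = -np.ones(m)
        # Zero objective: only feasibility matters. Bounds keep variables free
        # (linprog defaults to nonnegative variables otherwise).
        res = linprog(c=np.zeros(n + 1), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * (n + 1))
        if res.success:
            return res.x[:n], res.x[n]   # (w, b)
        return None                      # examples are not linearly separable

    # Example: AND on two Boolean inputs is linearly separable.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([-1, -1, -1, +1])
    print(train_perceptron_lp(X, y))

Since the number of variables and constraints is polynomial in the input size and linear programs are solvable in polynomial time, this gives the polynomial-time one-node training contrasted with the NP-complete two-node case.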
Our neural net learning model encapsulates the idea of modular neural nets, which is a popular approach to overcoming the scaling problem in training neural nets. We investigate how much easier the training problem becomes if the class of concepts to be learned is known a priori and the net architecture is allowed to be sufficiently non-optimal. Finally, we classify several neural net optimization problems within the polynomial-time hierarchy.
Date
1991
Publisher
Springer Verlag
Citation
J.-H. Lin and J. S. Vitter. “Complexity Results on Learning by Neural Nets,” Machine Learning, 6, 1991, 211–230. An extended abstract appears in Proceedings of the 2nd Annual ACM Workshop on Computational Learning Theory (COLT ’89), Santa Cruz, CA, July–August 1989, published by Morgan Kaufmann, San Mateo, CA, 118–133. http://dx.doi.org/10.1023/A:1022657626762
