The Computational Complexity of Machine Learning (The MIT Press)
The Computational Complexity of Machine Learning is a mathematical study of the possibilities for efficient learning by computers. It works within recently introduced models for machine inference that are based on the theory of computational complexity and that place an explicit emphasis on efficient and general algorithms for learning. Theorems are presented that help delineate the boundary of what is efficiently learnable from examples. These results take the form both of algorithms with proofs of their performance and of hardness results demonstrating the intractability of learning in certain natural settings. In addition, the book contains lower bounds on the resources required for learning, an extensive study of learning in the presence of errors in the sample data, and several theorems demonstrating reducibilities between learning problems.
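For orientation, the following is a minimal sketch of the distribution-free (PAC) learning criterion underlying the models the book studies; the notation here is the standard one and is an assumption for illustration, not a quotation from the book.

```latex
% Sketch of the distribution-free (PAC) learning criterion (standard notation, assumed here).
% An algorithm A learns a concept class C over instance space X if, for every target
% concept c in C, every distribution D over X, and every epsilon, delta in (0,1),
% given m labeled examples drawn i.i.d. from D it outputs a hypothesis h satisfying
\[
\Pr_{S \sim D^{m}}\!\Big[\,\Pr_{x \sim D}\big[h(x) \neq c(x)\big] \le \epsilon\,\Big] \;\ge\; 1 - \delta ,
\]
% and learning is called efficient when both m and the running time of A are
% polynomial in 1/epsilon, 1/delta, and the size of the target concept c.
```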
Contents
Definitions, Notations, and Motivation
Overview of Recent Research in Computational Learning Theory
Useful Tools for Distribution-Free Learning
Learning in the Presence of Errors
Lower Bounds on Sample Complexity
Cryptographic Limitations on Polynomial-Time Learning
Distribution-Specific Learning in Polynomial Time
Equivalence of Weak Learning and Group Learning