Computational learning theory
COLT (Computational Learning Theory) is a field of machine learning.
In COLT, the main interest is in the computational aspects of learning, e.g., the time complexity of a learning problem. These computational aspects are studied within a learning framework, most commonly the Probably approximately correct learning (PAC) framework.
Note that a computation is considered feasible if it can be done in polynomial time.
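For concreteness, the PAC criterion is usually stated along the following lines (the notation here is one common convention, not quoted from the papers listed below): a concept class $C$ is PAC learnable if there is an algorithm $A$ such that for every target concept $c \in C$, every distribution $D$ on the instance space, and every $\varepsilon, \delta \in (0,1)$, when $A$ is given independent labelled examples $(x, c(x))$ with $x$ drawn from $D$, it outputs a hypothesis $h$ satisfying

$$\Pr\bigl[\operatorname{error}_D(h) \le \varepsilon\bigr] \ge 1 - \delta, \qquad \text{where } \operatorname{error}_D(h) = \Pr_{x \sim D}\bigl[h(x) \ne c(x)\bigr].$$

The class is learnable in polynomial time if, in addition, $A$ runs in time polynomial in $1/\varepsilon$, $1/\delta$ and the size of the representation of $c$.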
There are two kinds of results in COLT:
Positive results - showing that a certain class of functions is learnable in polynomial time.
Negative results - showing that certain classes cannot be learned in polynomial time.
Negative results are proven only under assumptions. The assumptions that are common in negative results are:
Computational - P ≠ NP.
Cryptographic - one-way functions exist (an implication of this kind is sketched below).
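To sketch how the cryptographic assumption enters (this is the general flavor of such results, not a precise restatement of any single theorem from the papers below): one-way functions imply the existence of pseudorandom functions, which are computable by polynomial-size circuits; a polynomial-time PAC learner for that class could predict a pseudorandom function from labelled examples and so distinguish it from a truly random function, contradicting pseudorandomness. Schematically,

$$\text{one-way functions exist} \;\Longrightarrow\; \text{polynomial-size Boolean circuits are not PAC learnable in polynomial time.}$$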
A list of important COLT papers
Surveys
- [Angluin, 92] Angluin, D. 1992. Computational learning theory: Survey and selected bibliography. In Proceedings of the Twenty-Fourth Annual ACM Symposium on Theory of Computing (May 1992), pp. 351--369.
- [Hau,90] D. Haussler. Probably approximately correct learning. In AAAI-90 Proceedings of the Eighth National Conference on Artificial Intelligence, Boston, MA, pages 1101--1108. American Association for Artificial Intelligence, 1990.
http://citeseer.nj.nec.com/haussler90probably.html
Feature selection
- [DH,94] A. Dhagat and L. Hellerstein. PAC learning with irrelevant attributes. In Proceedings of the IEEE Symposium on Foundations of Computer Science, 1994.
http://citeseer.nj.nec.com/dhagat94pac.html
Inductive inference
- [Gold, 67] E. M. Gold. Language identification in the limit. Information and Control, 10:447--474, 1967.
Optimal O notation learning
- [GR, 96] O. Goldreich, D. Ron. On universal learning algorithms.
" class="external">http://citeseer.nj.nec.com/69804.html
Negative results
- [KV,89] M. Kearns and L. G. Valiant. 1989. Cryptographic limitations on learning boolean formulae and finite automata. In Proceedings of the 21st Annual ACM Symposium on Theory of Computing, pages 433--444, New York. ACM.
" class="external">http://citeseer.ist.psu.edu/kearns89cryptographic.html
Boosting
- [Sch, 90] Robert E. Schapire. The strength of weak learnability. Machine Learning, 5(2):197--227, 1990.
http://citeseer.nj.nec.com/schapire90strength.html
Occam's Razor
- A. Blumer, A. Ehrenfeucht, D. Haussler, and M. K. Warmuth. Occam's razor. Information Processing Letters, 24:377--380, 1987.
- A. Blumer, A. Ehrenfeucht, D. Haussler, and M. K. Warmuth. Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM, 36(4):929--965, 1989.
Probably approximately correct learning
- [Valiant, 84] L. Valiant. A Theory of the Learnable. Communications of the ACM, 27(11):1134--1142, 1984.
Error tolerance