There are several boosting algorithms, which differ in how they quantify the strength of a weak hypothesis and how they set the corresponding weights on hypotheses and training examples. One of the most common boosting algorithms is AdaBoost. Most boosting algorithms fit into the AnyBoost framework, which shows that boosting performs gradient descent in function space.
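As a concrete illustration, here is a minimal AdaBoost sketch in Python (the function names fit_stump, adaboost, and predict, and the toy dataset, are hypothetical choices for this example, not from the source). Decision stumps serve as weak learners; each stump's strength is measured by its weighted error eps_t, and its hypothesis weight is alpha_t = 0.5 * ln((1 - eps_t) / eps_t).

```python
# Minimal AdaBoost sketch with decision stumps (illustrative, assumes
# binary labels in {-1, +1} and 1-D features).
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively pick the threshold/sign stump with lowest weighted error."""
    best = (None, None, np.inf)  # (threshold, sign, weighted error)
    for thr in np.unique(X):
        for sign in (1, -1):
            pred = np.where(X >= thr, sign, -sign)
            err = np.sum(w[pred != y])
            if err < best[2]:
                best = (thr, sign, err)
    return best

def adaboost(X, y, n_rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)            # start from uniform example weights
    ensemble = []                       # list of (alpha, threshold, sign)
    for _ in range(n_rounds):
        thr, sign, eps = fit_stump(X, y, w)
        eps = max(eps, 1e-10)           # guard against division by zero
        alpha = 0.5 * np.log((1 - eps) / eps)   # hypothesis weight
        pred = np.where(X >= thr, sign, -sign)
        w *= np.exp(-alpha * y * pred)  # upweight misclassified examples
        w /= w.sum()                    # renormalize to a distribution
        ensemble.append((alpha, thr, sign))
    return ensemble

def predict(ensemble, X):
    # Weighted-majority vote of the stumps: sign of the functional sum.
    score = sum(a * np.where(X >= thr, s, -s) for a, thr, s in ensemble)
    return np.sign(score)

# Toy usage: the labels flip twice, so no single stump fits, but an
# ensemble of a few stumps does.
X = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([1, 1, -1, -1, 1, 1, -1, -1])
model = adaboost(X, y)
print(predict(model, X))  # typically matches y after enough rounds
```

The exponential reweighting step is where the gradient-descent view applies: each round greedily adds the weak hypothesis that most decreases the exponential loss of the current ensemble, which is the functional-gradient-descent interpretation formalized by AnyBoost.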
Boosting has its roots in the probably approximately correct (PAC) model of learning, a branch of computational learning theory.
Schapire was the first to show that if a concept is weakly PAC learnable then it is also strongly PAC learnable, using boosting. The result was published in: Robert E. Schapire. The strength of weak learnability. Machine Learning, 5(2):197–227, 1990.
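For reference, the two notions equated by Schapire's theorem can be stated as follows; this is the standard textbook formulation, not a quotation from the source. A concept class is strongly PAC learnable if, for every accuracy ε and confidence δ, a polynomial-time learner outputs a hypothesis h whose error under the data distribution D is at most ε with probability at least 1 − δ; it is weakly learnable if the learner only needs to beat random guessing by some fixed edge γ > 0:

```latex
% Strong PAC learnability: for all \varepsilon, \delta \in (0,1),
\Pr\left[\operatorname{err}_D(h) \le \varepsilon\right] \ge 1 - \delta .
% Weak PAC learnability: for some fixed \gamma > 0 and all \delta \in (0,1),
\Pr\left[\operatorname{err}_D(h) \le \tfrac{1}{2} - \gamma\right] \ge 1 - \delta .
```

Schapire's result says these two conditions are equivalent: boosting converts any learner satisfying the weak condition into one satisfying the strong condition.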
Algorithmically, boosting is related to