There are two main branches of calculus: differential calculus, which studies rates of change and the slopes of curves, and integral calculus, which studies the accumulation of quantities and the areas under curves.
The conceptual foundations of calculus include the notions of function, limit, infinite sequence, infinite series, and continuity. Its tools include the symbol-manipulation techniques of elementary algebra and mathematical induction.
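To make these foundations concrete, the central construction of each branch can be stated in modern notation; the following is a standard formulation, supplied here for illustration rather than taken from the surrounding text.

% Limit definition of the derivative (differential calculus):
\[ f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} \]
% Definite integral as a limit of Riemann sums (integral calculus),
% for a continuous function f on [a, b], with \Delta x = (b - a)/n:
\[ \int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{i=1}^{n} f\!\big(a + i\,\Delta x\big)\,\Delta x \]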
Calculus has been extended to differential equations, vector calculus, the calculus of variations, time scale calculus, and differential topology. The modern, rigorous development of calculus is known as real analysis.
History
Although Archimedes and others had used integral methods throughout history, and a great many mathematicians (Barrow, Fermat, Pascal, Wallis and others) had previously developed the idea of a derivative, Gottfried Wilhelm Leibniz and Sir Isaac Newton are usually credited with the invention, in the late 1600s, of differential and integral calculus as we know them today. Leibniz and Newton, apparently working independently, arrived at similar results. It is thought that Newton's discoveries were made earlier, but Leibniz's were the first to be published. Newton, who represented derivatives as $\dot{x}$, $\ddot{x}$, and so on, provided a host of applications in physics, but Leibniz's more flexible notation ($\frac{dx}{dt}$, $\frac{d^{2}x}{dt^{2}}$, and so on) was eventually adopted. (The simpler dot notation is still used in some cases where it is sufficient.)
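The difference between the two notations can be illustrated as follows (an illustrative restatement in modern terms, not a quotation from either author): Newton wrote time derivatives with dots, while Leibniz's differential notation names the variable of differentiation explicitly, which makes statements such as the chain rule read naturally.

% Newton's dot (fluxion) notation versus Leibniz's differential notation
% for the first and second derivatives of x with respect to t:
\[ \dot{x} = \frac{dx}{dt}, \qquad \ddot{x} = \frac{d^{2}x}{dt^{2}} \]
% In Leibniz's notation the chain rule can be written as if the
% differentials cancelled, one source of the notation's flexibility:
\[ \frac{dy}{dt} = \frac{dy}{dx} \cdot \frac{dx}{dt} \]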
In 1704 an anonymous pamphlet, later determined to have been written by Leibniz, accused Newton of having plagiarised Leibniz's work. That claim is easily refuted, as there is ample evidence that Newton began work on the calculus long before Leibniz could possibly have done so; however, the resulting controversy led to suggestions that Leibniz may not have invented the calculus independently as he claimed, but may have been influenced by reading copies of Newton's early manuscripts. This claim is not so easily dismissed, and there is in fact considerable circumstantial evidence to support it. Leibniz was not known at the time for his probity, and he later admitted to falsifying the dates on certain of his manuscripts in an effort to bolster his claims. Furthermore, a copy of one of Newton's very early manuscripts, with annotations by Leibniz, was found among Leibniz's papers after his death, although the date at which Leibniz first acquired it is unknown. A similar controversy exists in philosophy over whether Leibniz appropriated the ideas of Spinoza in his writings on that subject.
The truth of the matter will never be known, and in any case it is of little importance today. Leibniz's great contribution to calculus was his notation, and that is beyond doubt purely his own invention. The controversy was unfortunate, however, in that it divided the mathematicians of Britain and continental Europe for many years, setting back British analysis (i.e. calculus-based mathematics) for a very long time. Newton's terminology and notation were clearly less flexible than Leibniz's, yet they were retained in British usage until the early 19th century, when the work of the Analytical Society led to the adoption of Leibniz's notation in Great Britain.
The strict limit definition of the derivative presented above was not developed until much later; neither Newton nor Leibniz, nor any of their followers until the mid-1800s, developed calculus with acceptable rigour. Nevertheless, the calculus was widely used, as it was a very powerful mathematical tool, and it was not until the nineteenth century that mathematicians such as Augustin Louis Cauchy, Bernard Bolzano, and Karl Weierstrass were able to provide a mathematically rigorous exposition. This work eventually led to deep explorations of the concept of infinity by Georg Cantor and others.
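The kind of rigour these authors supplied is exemplified by the epsilon-delta definition of a limit, on which the limit definition of the derivative rests; the Weierstrass-style formulation below is given purely for illustration.

% Weierstrass-style definition of the limit of f at the point a:
\[ \lim_{x \to a} f(x) = L \quad\Longleftrightarrow\quad \forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x : \; 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon \]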
Derived from the Latin word for "pebble", calculus in its most general sense can mean any method or system of calculation. In mathematics and related fields, the term refers more specifically to a system of formal rules of inference and axioms that are used for computation.
This usage is particularly common in mathematical logic, where a calculus is applied to derive the universally valid statements of a formal logic; examples include the calculus of natural deduction, the sequent calculus, and many other calculi devised in proof theory. Other topics where the term is used in this sense include the lambda calculus, the propositional calculus, and the predicate calculus.
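As a small illustration of a calculus in this sense (a standard example, not drawn from the text above), here is the right-introduction rule for implication in Gentzen's sequent calculus, which licenses the inference of an implication on the right-hand side of a sequent.

% If, from the assumptions \Gamma together with A, one can derive B or
% one of the alternative conclusions \Delta, then from \Gamma alone one
% can derive A \to B or one of the conclusions \Delta:
\[ \frac{\Gamma, A \;\vdash\; B, \Delta}{\Gamma \;\vdash\; A \to B, \Delta} \;(\to R) \]

Rules of this kind, rather than numerical operations, are the "calculations" such a calculus performs.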