Sum rule in integration
In calculus, the sum rule in integration states that the integral of a sum of two functions is equal to the sum of their integrals:

    \int (f + g) \, dx = \int f \, dx + \int g \, dx
It is of constant use when passing from the left-hand side to the right-hand side in order to integrate sums. It is derived from the sum rule in differentiation and is one part of the linearity of integration.
For example, knowing that the integral of exp(x) is exp(x) (from calculus with exponentials) and that the integral of cos(x) is sin(x) (from calculus with trigonometry), the sum rule gives:

    \int \left( e^x + \cos x \right) dx = \int e^x \, dx + \int \cos x \, dx = e^x + \sin x + C
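The worked example above can be reproduced with a computer algebra system. The following sketch assumes SymPy is available (it is not part of the original article) and confirms that integrating e^x + cos x as a sum gives the same antiderivative as integrating the two terms separately.

    # Sketch: check the exp/cos example with SymPy (assumed installed, e.g. via pip install sympy).
    import sympy as sp

    x = sp.symbols('x')

    # Integrate the sum directly (SymPy omits the arbitrary constant C).
    integral_of_sum = sp.integrate(sp.exp(x) + sp.cos(x), x)

    # Integrate each term separately and add the results.
    sum_of_integrals = sp.integrate(sp.exp(x), x) + sp.integrate(sp.cos(x), x)

    print(integral_of_sum)                                  # exp(x) + sin(x)
    print(sp.simplify(integral_of_sum - sum_of_integrals))  # 0, so the two antiderivatives agree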
Some other general results come from this rule. For example:
- \int (u - v) \, dx
- = \int \left( u + (-1) v \right) dx
- = \int u \, dx + \int (-1) v \, dx
- = \int u \, dx - \int v \, dx
The derivation above relied on the special case of the constant factor rule in integration with k = -1.
Thus, the sum rule might be written as:

    \int (u \pm v) \, dx = \int u \, dx \pm \int v \, dx
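The difference rule can be spot-checked in the same way. The sketch below picks u = x^2 and v = e^x purely for illustration (these functions are not from the article) and verifies with SymPy that integrating u - v matches the difference of the separate integrals.

    # Sketch: check the difference rule with SymPy for the example choice u = x**2, v = exp(x).
    import sympy as sp

    x = sp.symbols('x')
    u = x**2
    v = sp.exp(x)

    lhs = sp.integrate(u - v, x)                   # ∫ (u - v) dx
    rhs = sp.integrate(u, x) - sp.integrate(v, x)  # ∫ u dx - ∫ v dx

    print(sp.simplify(lhs - rhs))                  # 0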
Another basic application is that the sigma and integral signs can be interchanged. That is:

    \int \sum_{r=1}^{n} f_r(x) \, dx = \sum_{r=1}^{n} \int f_r(x) \, dx
This is simply because:
- \int \sum_{r=1}^{n} f_r(x) \, dx = \int \left( f_1(x) + f_2(x) + \cdots + f_n(x) \right) dx
- = \int f_1(x) \, dx + \int f_2(x) \, dx + \cdots + \int f_n(x) \, dx
- = \sum_{r=1}^{n} \int f_r(x) \, dx
Since an integral is itself a limit of sums, this is hardly surprising.
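The interchange of the sigma and integral signs can also be illustrated concretely. In the sketch below, the functions f_r(x) = x^r for r = 1, ..., 5 are my own choice for the illustration; SymPy is used to compare the integral of the sum with the sum of the integrals.

    # Sketch: swap a finite sum and an indefinite integral, using f_r(x) = x**r for r = 1..5.
    import sympy as sp

    x = sp.symbols('x')
    fs = [x**r for r in range(1, 6)]                        # f_1, ..., f_5

    integral_of_sum = sp.integrate(sum(fs), x)              # ∫ Σ f_r(x) dx
    sum_of_integrals = sum(sp.integrate(f, x) for f in fs)  # Σ ∫ f_r(x) dx

    print(sp.simplify(integral_of_sum - sum_of_integrals))  # 0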
Passing from the case of indefinite integrals to the case of integrals over an interval [a, b], we get exactly the same form of rule (the arbitrary constant of integration disappears):

    \int_a^b (f + g) \, dx = \int_a^b f \, dx + \int_a^b g \, dx
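The definite-integral form can also be checked numerically. The sketch below uses SciPy's quad routine on the interval [0, 1] with the same example functions as before; the interval, the functions, and the use of SciPy are all choices made for this illustration.

    # Sketch: numerical check of the sum rule for definite integrals over [a, b] = [0, 1].
    import numpy as np
    from scipy.integrate import quad

    a, b = 0.0, 1.0

    # quad returns (value, estimated_error); keep only the value.
    integral_of_sum, _ = quad(lambda t: np.exp(t) + np.cos(t), a, b)
    integral_of_f, _ = quad(np.exp, a, b)
    integral_of_g, _ = quad(np.cos, a, b)

    print(integral_of_sum)                # ≈ 2.5597528...
    print(integral_of_f + integral_of_g)  # the same value, up to floating-point error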
To prove the sum rule, first note that, from the definition of integration as the antiderivative (the reverse process of differentiation):

- u = \int \frac{du}{dx} \, dx
- v = \int \frac{dv}{dx} \, dx

Adding these,

    u + v = \int \frac{du}{dx} \, dx + \int \frac{dv}{dx} \, dx \quad (1)
Now take the sum rule in differentiation:

    \frac{d}{dx}(u + v) = \frac{du}{dx} + \frac{dv}{dx}

Integrate both sides with respect to x:

    u + v = \int \left( \frac{du}{dx} + \frac{dv}{dx} \right) dx \quad (2)
So we have, looking at (1) and (2):
- u + v = \int \frac{du}{dx} \, dx + \int \frac{dv}{dx} \, dx
- u + v = \int \left( \frac{du}{dx} + \frac{dv}{dx} \right) dx
Therefore:

    \int \left( \frac{du}{dx} + \frac{dv}{dx} \right) dx = \int \frac{du}{dx} \, dx + \int \frac{dv}{dx} \, dx
Now substitute:

- f = \frac{du}{dx}
- g = \frac{dv}{dx}

This gives the sum rule:

    \int (f + g) \, dx = \int f \, dx + \int g \, dx
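Finally, the structure of the proof itself can be mirrored with concrete functions. In the sketch below, u = x^3 and v = sin x are my own example choices; SymPy differentiates them and then checks that integrating du/dx + dv/dx agrees with integrating the two derivatives separately, recovering u + v up to the omitted constant.

    # Sketch: mirror the proof with the example choices u = x**3 and v = sin(x).
    import sympy as sp

    x = sp.symbols('x')
    u = x**3
    v = sp.sin(x)

    du = sp.diff(u, x)    # du/dx = 3*x**2
    dv = sp.diff(v, x)    # dv/dx = cos(x)

    left = sp.integrate(du + dv, x)                     # ∫ (du/dx + dv/dx) dx
    right = sp.integrate(du, x) + sp.integrate(dv, x)   # ∫ du/dx dx + ∫ dv/dx dx

    print(sp.simplify(left - right))      # 0: both sides agree
    print(sp.simplify(left - (u + v)))    # 0: the antiderivative recovers u + v (constant omitted)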