See also: counting
Important properties

When adding finitely many numbers, it doesn't matter how you group the numbers or in which order you add them; you will always get the same result.
(See Associativity and Commutativity.)
If you add zero to any number, the quantity won't change; zero is the identity element for addition.
The sum of any number and its additive inverse (in contexts where such an inverse exists) is zero.

Notation

If the terms are all written out individually, then addition is written using the plus sign ("+").
Thus, the sum of 1, 2, and 4 is 1 + 2 + 4 = 7. If the terms are not all written out individually, then the sum may be written with an ellipsis to mark the missing terms.
Thus, the sum of all the natural numbers from 1 to 100 is 1 + 2 + ... + 99 + 100.
Alternatively, the sum can be represented by the summation symbol, the capital Greek letter sigma (Σ). This is defined as:

∑_{i=m}^{n} a_i = a_m + a_{m+1} + ... + a_n
The subscript gives the symbol for a dummy variable, i. Here, i represents the index of summation; m is the lower bound of summation, and n is the upper bound of summation.
So, for example:

∑_{i=1}^{4} i^2 = 1 + 4 + 9 + 16 = 30
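The sigma notation maps directly onto a loop over the index of summation. A minimal Python sketch (the helper name summation is ours, for illustration only; note that range's upper bound is exclusive, so n + 1 is passed):

```python
def summation(f, m, n):
    # The sum of f(i) for i running from the lower bound m
    # to the upper bound n, inclusive.
    return sum(f(i) for i in range(m, n + 1))

# Sum of i^2 for i = 1..4: 1 + 4 + 9 + 16 = 30
print(summation(lambda i: i * i, 1, 4))
```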
One may also consider sums of infinitely many terms; these are called infinite series.
Notationally, we would replace n above by the infinity symbol (∞).
The sum of such a series is defined as the limit of the sum of the first n terms, as n grows without bound.
That is:

∑_{i=1}^{∞} a_i = lim_{n→∞} ∑_{i=1}^{n} a_i
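The idea of a series as a limit of partial sums can be illustrated numerically. A sketch using the geometric series 1/2 + 1/4 + 1/8 + ..., whose sum is 1:

```python
def partial_sum(f, n):
    # The sum of the first n terms f(1) + f(2) + ... + f(n).
    return sum(f(i) for i in range(1, n + 1))

# Partial sums of the geometric series (1/2)^i approach 1 as n grows.
for n in (1, 5, 20):
    print(n, partial_sum(lambda i: 0.5 ** i, n))
```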
It's possible to add fewer than 2 numbers.
If you add the single term x, then the sum is x.
If you add zero terms, then the sum is zero, because zero is the identity for addition.
This is known as the empty sum.
These degenerate cases are usually only used when the summation notation gives a degenerate result in a special case.
For example, if m = n in the definition above, then there is only one term in the sum; if m = n + 1, then there is none.
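These degenerate cases fall out naturally when the summation is written as a loop, as in the sketch below (the helper name summation is ours, for illustration):

```python
def summation(f, m, n):
    # range(m, n + 1) has exactly one element when m = n,
    # and is empty when m = n + 1, giving the empty sum 0.
    return sum(f(i) for i in range(m, n + 1))

print(summation(lambda i: i, 5, 5))  # one term: 5
print(summation(lambda i: i, 6, 5))  # no terms: the empty sum, 0
```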
Relationships to other operations and constants

Many other operations can be thought of as generalised sums.
If a single term x appears in a sum n times, then the sum is nx, the result of a multiplication.
If n is not a natural number, then the multiplication may still make sense, so that we have a sort of notion of adding a term, say, two and a half times.
A special case is multiplication by -1, which leads to the concept of the additive inverse, and to subtraction, the inverse operation to addition.
The most general version of these ideas is the linear combination, where any number of terms are included in the generalised sum, each any number of times.

Useful sums

The following are useful identities:

∑_{i=1}^{n} i = n(n+1)/2
∑_{i=1}^{n} i^2 = n(n+1)(2n+1)/6
∑_{i=1}^{n} i^3 = (n(n+1)/2)^2
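Closed forms such as 1 + 2 + ... + n = n(n+1)/2 (and the analogous formulas for squares and cubes) are easy to sanity-check numerically in Python:

```python
# Compare each sum computed term by term against its closed form.
n = 10
assert sum(range(1, n + 1)) == n * (n + 1) // 2
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
print("closed forms agree for n =", n)
```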
The mathematics behind the first of these identities was demonstrated by Carl Friedrich Gauss in the 18th century.
In general, the sum of the first n mth powers is

∑_{i=1}^{n} i^m = (1/(m+1)) ∑_{k=0}^{m} C(m+1, k) B_k n^(m+1-k),

where B_k is the kth Bernoulli number (here with the convention B_1 = +1/2).
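As a sketch, the sum of mth powers can be computed from the Bernoulli numbers (Faulhaber's formula) using exact rational arithmetic; the function names below are ours, for illustration:

```python
from fractions import Fraction
from math import comb

def bernoulli(m):
    # Bernoulli numbers B_0..B_m, returned with the convention B_1 = +1/2.
    B = [Fraction(0)] * (m + 1)
    B[0] = Fraction(1)
    for k in range(1, m + 1):
        # Standard recurrence: sum_{j=0}^{k} C(k+1, j) B_j = 0, solved for B_k.
        B[k] = -sum(Fraction(comb(k + 1, j)) * B[j] for j in range(k)) / (k + 1)
    if m >= 1:
        B[1] = Fraction(1, 2)  # flip the sign to the B_1 = +1/2 convention
    return B

def power_sum(n, m):
    # Faulhaber's formula for 1^m + 2^m + ... + n^m.
    B = bernoulli(m)
    total = sum(Fraction(comb(m + 1, k)) * B[k] * n ** (m + 1 - k)
                for k in range(m + 1))
    return total / (m + 1)

print(power_sum(10, 3))  # 1^3 + ... + 10^3 = 3025
```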
Approximation by integrals

The following are useful approximations (using theta notation):

∑_{i=1}^{n} i^c = Θ(n^(c+1)) for real c greater than -1
∑_{i=1}^{n} 1/i = Θ(log n)
∑_{i=1}^{n} c^i = Θ(c^n) for real c greater than 1
∑_{i=1}^{n} log(i) = Θ(n log n)
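For instance, the harmonic sum 1 + 1/2 + ... + 1/n grows like the natural logarithm of n, which a quick numerical check makes plausible:

```python
from math import log

# The harmonic sum stays within a bounded distance of log(n) as n grows.
n = 10 ** 5
harmonic = sum(1 / i for i in range(1, n + 1))
print(harmonic, log(n))
```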
Many such approximations can be obtained from the following connection between sums and integrals, which holds for any increasing function f:

∫_{s-1}^{t} f(x) dx ≤ ∑_{i=s}^{t} f(i) ≤ ∫_{s}^{t+1} f(x) dx
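For an increasing function f, the sum of f(i) for i from s to t is bounded below by the integral of f from s-1 to t and above by the integral from s to t+1. A numerical check with f(x) = x^2, using the exact antiderivative x^3/3:

```python
def F(x):
    # Antiderivative of f(x) = x^2.
    return x ** 3 / 3

s, t = 1, 100
total = sum(i * i for i in range(s, t + 1))
lower = F(t) - F(s - 1)      # integral of x^2 from s-1 to t
upper = F(t + 1) - F(s)      # integral of x^2 from s to t+1
print(lower <= total <= upper)  # True
```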