An infinitesimal is only a notional quantity: there exists no infinitesimal real number. This can be shown using the least upper bound axiom of the real numbers. Suppose positive infinitesimals existed, and consider whether their least upper bound c is or is not itself an infinitesimal. If it is, then so is 2c, since twice an infinitesimal is again infinitesimal; but 2c > c, contradicting the fact that c is an upper bound. If it is not, then neither is c/2 (else c, being twice c/2, would be infinitesimal too); but every infinitesimal is smaller than any positive quantity that is not infinitesimal, so c/2 is an upper bound smaller than c, contradicting the fact that among all upper bounds, c is the least.
The first mathematician to make use of infinitesimals was Archimedes; see how Archimedes used infinitesimals.
When Newton and Leibniz developed the calculus, they made use of infinitesimals. A typical argument, computing the derivative of the function f(x) = x², might go as follows:
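In Leibniz's notation, writing dx for an infinitesimal increment of x, one such computation runs:

    f'(x) = (f(x + dx) - f(x)) / dx
          = ((x + dx)² - x²) / dx
          = (2x·dx + dx²) / dx
          = 2x + dx
          = 2x,

the final step being justified on the ground that dx, being infinitesimally small, may be neglected.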
It was not until the second half of the nineteenth century that the calculus was given a formal mathematical foundation by Karl Weierstrass and others using the notion of a limit, which obviates the need to use infinitesimals.
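In the limit formulation, dx is an ordinary nonzero real number rather than an infinitesimal, and the same derivative is obtained as

    f'(x) = lim_{dx → 0} (f(x + dx) - f(x)) / dx = lim_{dx → 0} (2x + dx) = 2x,

where the limit is made precise by the usual ε-δ definition; no step requires neglecting an infinitesimally small quantity.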
Nevertheless, the use of infinitesimals continues to be convenient for simplifying notation and calculation.
Infinitesimals are legitimate quantities in the non-standard analysis of Abraham Robinson. In this theory, the above computation of the derivative of f(x) = x² can be justified with a minor modification: we must speak of the standard part of the difference quotient, and for any real number x and any infinitesimal dx, the standard part of x + dx is x.
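Writing st for the standard part, the computation becomes, for a nonzero infinitesimal dx,

    f'(x) = st((f(x + dx) - f(x)) / dx) = st(2x + dx) = 2x,

so that the step which formerly neglected dx is replaced by the legitimate operation of taking the standard part.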
Alternatively, infinitesimals can be given a rigorous footing in synthetic differential geometry.
See also: