System of linear equations
In mathematics and linear algebra, a system of linear equations is a set of linear equations such as
- 3x1 + 2x2 - x3 = 1
- 2x1 - 2x2 + 4x3 = -2
- -x1 + 1/2 x2 - x3 = 0
The problem is to find those values for the unknowns x1, x2 and x3 which satisfy all three equations simultaneously.
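As an illustration, the example system can be solved numerically; the following minimal sketch uses NumPy's dense solver (the choice of tool is illustrative, as the text prescribes none). The solution x1 = 1, x2 = -2, x3 = -2 can be checked by substituting back into all three equations.

```python
import numpy as np

# Coefficient matrix and right-hand side of the example system above.
A = np.array([[ 3.0,  2.0, -1.0],
              [ 2.0, -2.0,  4.0],
              [-1.0,  0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

x = np.linalg.solve(A, b)
print(x)  # [ 1. -2. -2.]  i.e. x1 = 1, x2 = -2, x3 = -2
```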
Systems of linear equations are among the oldest problems in mathematics and have many applications, such as in digital signal processing, estimation and forecasting, and more generally in linear programming and in the approximation of non-linear problems in numerical analysis. An efficient way to solve systems of linear equations is given by Gauss-Jordan elimination or, when the coefficient matrix is symmetric and positive-definite, by the Cholesky decomposition.
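The text names Gauss-Jordan elimination without spelling it out; the following is a minimal sketch of the method (the function name, the partial-pivoting rule and the assumption of a unique solution are illustrative choices, not part of the method's definition):

```python
def gauss_jordan(A, b):
    """Solve A x = b by Gauss-Jordan elimination, assuming a unique solution.

    A is a list of m rows of n coefficients; b is a list of m right-hand sides.
    Only field operations (+, -, *, /) are used, so entries from any field work.
    """
    m, n = len(A), len(A[0])
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix [A | b]
    row = 0
    for col in range(n):
        # Partial pivoting: pick the row with the largest entry in this column.
        pivot = max(range(row, m), key=lambda r: abs(M[r][col]))
        if M[pivot][col] == 0:
            continue                                  # no pivot in this column
        M[row], M[pivot] = M[pivot], M[row]
        p = M[row][col]
        M[row] = [v / p for v in M[row]]              # scale pivot row to get a 1
        for r in range(m):                            # clear the column elsewhere
            if r != row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[row])]
        row += 1
        if row == m:
            break
    return [M[i][n] for i in range(n)]
```

Applied to the example above, gauss_jordan([[3, 2, -1], [2, -2, 4], [-1, 0.5, -1]], [1, -2, 0]) returns [1.0, -2.0, -2.0].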
In general, a system with m linear equations and n unknowns can be written as
- a11 x1 + a12 x2 + ... + a1n xn = b1
- a21 x1 + a22 x2 + ... + a2n xn = b2
-     :
- am1 x1 + am2 x2 + ... + amn xn = bm
where x1,..., xn are the unknowns and the numbers aij are the coefficients of the system. We collect the coefficients in a matrix as follows:

- A = [ a11  a12  ...  a1n ]
      [ a21  a22  ...  a2n ]
      [  :    :          :  ]
      [ am1  am2  ...  amn ]
This can also be represented as
- A x = b,
where A is the m-by-n matrix above, x is a column vector with n entries and b is a column vector with m entries. The above-mentioned Gauss-Jordan elimination applies to all these systems, even if the coefficients come from an arbitrary field.
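Because the gauss_jordan sketch above uses only field operations, it runs unchanged over other fields; for example, over the rationals with Python's exact fractions.Fraction (this reuses the earlier illustrative sketch, not a canonical implementation):

```python
from fractions import Fraction as F

A = [[F(3), F(2), F(-1)],
     [F(2), F(-2), F(4)],
     [F(-1), F(1, 2), F(-1)]]
b = [F(1), F(-2), F(0)]

print(gauss_jordan(A, b))  # [Fraction(1, 1), Fraction(-2, 1), Fraction(-2, 1)]
```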
If the field is infinite (as in the case of the real or complex numbers), then only the following three cases are possible for any given system of linear equations:
- the system has no solution
- the system has a single solution
- the system has infinitely many solutions
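Over the reals, which of the three cases holds can be decided by comparing matrix ranks (the Rouché-Capelli criterion, which the text does not state explicitly); a sketch:

```python
import numpy as np

def classify(A, b):
    """Classify the system A x = b by the ranks of A and [A | b]."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))
    if rank_A < rank_Ab:
        return "no solution"            # b is not in the column space of A
    if rank_A == A.shape[1]:
        return "a single solution"      # full column rank, no free variables
    return "infinitely many solutions"  # free variables remain

print(classify([[3, 2, -1], [2, -2, 4], [-1, 0.5, -1]], [1, -2, 0]))
# a single solution
```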
A system of the form
- A x = 0
is called a homogeneous system of linear equations. The set of all solutions of such a homogeneous system is always a linear subspace of Fn, where F is the field of coefficients.
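As an illustration of this solution subspace, the following sketch computes a basis of the null space of a rank-deficient real matrix via the singular value decomposition (the example matrix and the tolerance rule are illustrative choices):

```python
import numpy as np

# A rank-deficient matrix: the second row is twice the first,
# so A x = 0 has infinitely many solutions.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])

# Right singular vectors belonging to (numerically) zero singular
# values form a basis of the null space, i.e. of the solution subspace.
_, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))
print(Vt[rank:])  # one basis vector, proportional to (1, -2, 1) up to sign
```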
Especially in view of the above applications, several more efficient alternatives to Gauss-Jordan elimination have been developed for a wide variety of special cases. Many of these improved algorithms have complexity O(n²). Some of the most common special cases are:
- For problems of the form Ax = b, where A is a symmetric Toeplitz matrix, we can use Levinson recursion or one of its derivatives. A commonly used Levinson-like derivative is Schur recursion, which appears in many digital signal processing applications. (See the first sketch after this list.)
- For problems of the form Ax = b, where A is singular or nearly singular, the matrix A is decomposed into a product of three matrices in a process called singular-value decomposition: the left and right matrices contain the left and right singular vectors, and the middle matrix is diagonal and contains the singular values. The matrix can then be inverted by reversing the order of the three factors, transposing the singular-vector matrices, and taking the reciprocals of the diagonal elements of the middle matrix. If any singular value is too close to zero, so that the matrix is nearly singular, its reciprocal is replaced by zero in the inverse. (See the second sketch after this list.)
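For the Toeplitz case, SciPy exposes a solver based on Levinson recursion, scipy.linalg.solve_toeplitz; a sketch with an illustrative symmetric Toeplitz matrix:

```python
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

# A symmetric Toeplitz matrix is determined by its first column c.
c = np.array([4.0, 2.0, 1.0, 0.5])
b = np.array([1.0, 2.0, 3.0, 4.0])

# solve_toeplitz runs in O(n^2) time via Levinson recursion,
# versus O(n^3) for a general dense factorization.
x = solve_toeplitz(c, b)

assert np.allclose(toeplitz(c) @ x, b)  # agrees with the dense solve
```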
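The inversion recipe in the second item corresponds to a truncated pseudo-inverse; a sketch with NumPy, where the function name and the relative tolerance rcond are illustrative choices:

```python
import numpy as np

def pinv_solve(A, b, rcond=1e-12):
    """Solve A x = b via a truncated-SVD pseudo-inverse.

    A = U diag(s) Vt, so the pseudo-inverse is V diag(1/s) Ut,
    with 1/s replaced by 0 for singular values close to zero.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    keep = s > rcond * s[0]          # drop (nearly) zero singular values
    s_inv[keep] = 1.0 / s[keep]
    return Vt.T @ (s_inv * (U.T @ b))
```

For a nonsingular A this agrees with np.linalg.solve; for a singular A it returns the minimum-norm least-squares solution, which is what np.linalg.pinv(A) @ b computes.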