Please refer to the Glossary of matrix theory for definitions of the terms used throughout matrix theory.
History
The study of matrices goes back to ancient times: Latin squares and magic squares were among the first matrix-like arrays to be studied.
Leibniz, one of the two founders of calculus, used determinants in 1693, and in 1750 Cramer presented his determinant-based method for solving systems of linear equations, now known as Cramer's rule.
In the 1800s, Gauss–Jordan elimination was developed.
It was J. J. Sylvester who first coined the term "matrix", in 1848. Cayley, Hamilton, Grassmann, Frobenius and von Neumann were among the famous mathematicians who worked on matrix theory.
A matrix is a rectangular array of numbers. It can be identified with a linear transformation between two vector spaces, so matrix theory is usually considered a branch of linear algebra. Square matrices play a special role, because the n × n matrices for a fixed n have many closure properties: sums and products of n × n matrices are again n × n matrices.
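As a minimal illustration (the entries here are chosen arbitrarily for this sketch), a 2 × 2 matrix acts on vectors as a linear map, and the product of two 2 × 2 matrices is again 2 × 2:

\[
\begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
= \begin{pmatrix} x + 2y \\ 3y \end{pmatrix},
\qquad
\begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}
= \begin{pmatrix} 2 & 1 \\ 3 & 0 \end{pmatrix}.
\]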
In graph theory, each labelled graph corresponds to a unique nonnegative matrix, its adjacency matrix. A permutation matrix is the matrix representation of a permutation: a square matrix with entries 0 and 1, having exactly one 1 in each row and in each column. Both types of matrices are used in combinatorics.
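For a concrete sketch (the graph and the permutation are chosen here purely for illustration): the labelled triangle on the vertices 1, 2, 3 has the adjacency matrix A below, and the cyclic permutation sending 1 to 2, 2 to 3, and 3 to 1 is represented by the permutation matrix P, which indeed has exactly one 1 in each row and each column:

\[
A = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix},
\qquad
P = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix},
\qquad
P e_1 = e_2,\; P e_2 = e_3,\; P e_3 = e_1.
\]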
In statistics, stochastic matrices and doubly stochastic matrices are important tools for studying stochastic processes.
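As a sketch of the idea (the transition probabilities below are invented for illustration): a row-stochastic matrix has nonnegative entries with each row summing to 1, and multiplying a probability row vector by it gives the distribution after one step of a Markov chain:

\[
P = \begin{pmatrix} 0.9 & 0.1 \\ 0.4 & 0.6 \end{pmatrix},
\qquad
\begin{pmatrix} 0.5 & 0.5 \end{pmatrix} P
= \begin{pmatrix} 0.65 & 0.35 \end{pmatrix}.
\]

A doubly stochastic matrix additionally has every column summing to 1.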
Positive-definite matrices occur in the search for maxima and minima of real-valued functions of several variables.
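For example (a standard second-derivative test, with the function chosen for illustration): the critical point of f(x, y) = x² + y² at the origin is a local minimum because the Hessian matrix there is positive-definite:

\[
f(x, y) = x^2 + y^2,
\qquad
H = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix},
\qquad
v^{\mathsf{T}} H v = 2\,(v_1^2 + v_2^2) > 0 \quad \text{for } v \neq 0.
\]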
It is also important to have a theory of matrices over arbitrary rings. In particular, matrices over polynomial rings are used in control theory.
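A small sketch of such a matrix (the entries are chosen for illustration): in control theory one frequently works with matrices whose entries are polynomials in an indeterminate s, for example

\[
A(s) = \begin{pmatrix} s + 1 & 1 \\ 0 & s^2 - 2 \end{pmatrix} \in M_2\bigl(\mathbb{R}[s]\bigr),
\qquad
\det A(s) = (s + 1)(s^2 - 2),
\]

and notions such as the determinant still make sense because ℝ[s] is a commutative ring.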
Within pure mathematics, matrix rings can provide a rich field of counterexamples for mathematical conjectures, amongst other uses.
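One classical instance (a standard textbook example, not specific to this article): the ring of 2 × 2 real matrices is noncommutative and contains zero divisors. With

\[
A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix},
\qquad
B = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix},
\]

one has AB ≠ BA, and A² = 0 even though A ≠ 0, so A is a zero divisor.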
Elementary introduction
Some useful theorems
External links