# Gram-Schmidt orthogonalization

In mathematics, especially in linear algebra, Gram-Schmidt orthogonalization is a sequential procedure or algorithm for constructing a set of mutually orthogonal vectors from a given set of linearly independent vectors. Orthogonalization is important in diverse applications in mathematics and the applied sciences because it often simplifies calculations by making it possible, for instance, to carry them out recursively.

## The Gram-Schmidt orthogonalization algorithm

Let $X$ be an inner product space over a subfield $F$ of the real or complex numbers, with inner product $\langle \cdot ,\cdot \rangle$, and let $x_{1},x_{2},\ldots ,x_{n}$ be a collection of linearly independent elements of $X$. Recall that linear independence means that

$a_{1}x_{1}+a_{2}x_{2}+\cdots +a_{n}x_{n}=0$ for scalars $a_{1},a_{2},\ldots ,a_{n}\in F$ implies that $a_{1}=a_{2}=\cdots =a_{n}=0$. The Gram-Schmidt orthogonalization procedure constructs, in a sequential manner, a new sequence of vectors $y_{1},y_{2},\ldots ,y_{n}\in X$ such that:

$\langle y_{i},y_{j}\rangle =0\quad {\text{whenever }}i\neq j.\quad (1)$

The vectors $y_{1},y_{2},\ldots ,y_{n}\in X$ satisfying (1) are said to be orthogonal.

The Gram-Schmidt orthogonalization algorithm is actually quite simple and goes as follows:

1. Set $y_{1}=x_{1}$.
2. For $i=2$ to $n$, set
   $y_{i}=x_{i}-\sum _{j=1}^{i-1}{\frac {\langle x_{i},y_{j}\rangle }{\langle y_{j},y_{j}\rangle }}y_{j}.$
3. End.
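The loop above translates directly into code. The following is a minimal sketch in Python with NumPy, assuming real vectors in $\mathbb{R}^m$ with the standard dot product as the inner product; the function name `gram_schmidt` is illustrative, not from the source.

```python
import numpy as np

def gram_schmidt(xs):
    """Orthogonalize a list of linearly independent vectors.

    Each y_i is x_i minus its projections onto the previously
    constructed y_1, ..., y_{i-1}, exactly as in the algorithm above.
    """
    ys = []
    for x in xs:
        y = x.astype(float).copy()
        for yj in ys:
            # subtract the component of x along y_j:
            # <x, y_j> / <y_j, y_j> * y_j
            y -= (np.dot(x, yj) / np.dot(yj, yj)) * yj
        ys.append(y)
    return ys
```

For example, `gram_schmidt([np.array([1., 1., 0.]), np.array([1., 0., 1.])])` returns a pair of orthogonal vectors spanning the same plane as the inputs.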

It is easy to check that the sequence $y_{1},y_{2},\ldots ,y_{n}$ constructed in this way satisfies requirement (1): assuming inductively that $y_{1},\ldots ,y_{i-1}$ are pairwise orthogonal, taking the inner product of the formula for $y_{i}$ with any $y_{k}$, $k<i$, leaves $\langle x_{i},y_{k}\rangle -\langle x_{i},y_{k}\rangle =0$.
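This check can also be carried out numerically on a small concrete instance. The sketch below works one step of the algorithm by hand for two hypothetical vectors in $\mathbb{R}^3$ and confirms that the resulting pair satisfies (1):

```python
import numpy as np

# Two linearly independent vectors (an illustrative example, not from the source).
x1 = np.array([1.0, 1.0, 0.0])
x2 = np.array([1.0, 0.0, 1.0])

# Step 1: y1 = x1.
y1 = x1

# Step 2: y2 = x2 - (<x2, y1> / <y1, y1>) y1.
y2 = x2 - (np.dot(x2, y1) / np.dot(y1, y1)) * y1

print(y2)               # [ 0.5 -0.5  1. ]
print(np.dot(y1, y2))   # 0.0
```

Here $\langle x_2, y_1\rangle = 1$ and $\langle y_1, y_1\rangle = 2$, so $y_2 = x_2 - \tfrac{1}{2}y_1 = (\tfrac{1}{2}, -\tfrac{1}{2}, 1)$, and indeed $\langle y_1, y_2\rangle = \tfrac{1}{2} - \tfrac{1}{2} + 0 = 0$.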