
Rotations in $$\mathbb{R}^3$$
Consider a real 3×3 matrix R with columns r1, r2, r3, i.e.,

$$\mathbf{R} = \left(\mathbf{r}_1, \mathbf{r}_2 , \mathbf{r}_3 \right).$$ The matrix R is orthogonal if

$$\mathbf{r}_i \cdot \mathbf{r}_j = \delta_{ij}, \quad i,j = 1,2,3.$$ The matrix R is a proper rotation matrix if it is orthogonal and if r1, r2, r3 form a right-handed set, i.e.,

$$\mathbf{r}_i \times \mathbf{r}_j = \sum_{k=1}^3 \, \varepsilon_{ijk} \mathbf{r}_k.$$ Here the symbol × indicates a cross product and $$\varepsilon_{ijk}$$ is the antisymmetric Levi-Civita symbol,

$$\begin{align} \varepsilon_{123} &= \varepsilon_{312} = \varepsilon_{231} = 1 \\ \varepsilon_{213} &= \varepsilon_{321} = \varepsilon_{132} = -1 \end{align}$$ and $$\varepsilon_{ijk} = 0$$ if two or more indices are equal.
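The defining properties above can be checked numerically. The following is a minimal sketch, assuming NumPy is available; the function `levi_civita` and the example matrix (a 90° rotation about the z-axis) are illustrative choices, not part of the text:

```python
import numpy as np

def levi_civita(i, j, k):
    """Antisymmetric Levi-Civita symbol for 0-based indices in {0, 1, 2}."""
    return (i - j) * (j - k) * (k - i) // 2

# Example proper rotation: 90 degrees about the z-axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
r = [R[:, i] for i in range(3)]

# Orthonormality: r_i . r_j = delta_ij.
for i in range(3):
    for j in range(3):
        assert np.isclose(r[i] @ r[j], float(i == j))

# Right-handedness: r_i x r_j = sum_k eps_ijk r_k.
for i in range(3):
    for j in range(3):
        rhs = sum(levi_civita(i, j, k) * r[k] for k in range(3))
        assert np.allclose(np.cross(r[i], r[j]), rhs)
```

Note the 0-based indexing; with the text's 1-based convention, `levi_civita(0, 1, 2)` corresponds to $$\varepsilon_{123} = 1$$.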

The matrix R is an improper rotation matrix if it is orthogonal and its column vectors form a left-handed set, i.e.,

$$\mathbf{r}_i \times \mathbf{r}_j = - \sum_{k=1}^3 \, \varepsilon_{ijk} \mathbf{r}_k.$$ The last two equations can be condensed into one equation,

$$\mathbf{r}_i \times \mathbf{r}_j = \det(\mathbf{R}) \sum_{k=1}^3 \; \varepsilon_{ijk} \mathbf{r}_k,$$ by virtue of the fact that the determinant of a proper rotation matrix is 1 and of an improper rotation −1. This can be proved as follows: the determinant of a 3×3 matrix with column vectors a, b, and c can be written as the scalar triple product

$$\mathbf{a} \cdot (\mathbf{b}\times\mathbf{c}).$$ Remember that for a proper rotation the columns of R are orthonormal and satisfy

$$\mathbf{r}_1 \cdot (\mathbf{r}_2 \times \mathbf{r}_3 ) = \sum_{k=1}^3 \, \varepsilon_{23k} \, \mathbf{r}_1 \cdot \mathbf{r}_k = \varepsilon_{231} = 1.$$ Likewise the determinant is −1 for an improper rotation, which ends the proof.
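The identity det(R) = r1 · (r2 × r3) and the sign flip for an improper rotation can be illustrated numerically. A minimal sketch, assuming NumPy; the 30° rotation about y and the column flip are illustrative choices:

```python
import numpy as np

# A proper rotation (det = +1): rotation about the y-axis by 30 degrees.
t = np.deg2rad(30.0)
R = np.array([[np.cos(t),  0.0, np.sin(t)],
              [0.0,        1.0, 0.0],
              [-np.sin(t), 0.0, np.cos(t)]])

# det(R) equals the scalar triple product of the columns.
triple = R[:, 0] @ np.cross(R[:, 1], R[:, 2])
assert np.isclose(triple, np.linalg.det(R))
assert np.isclose(triple, 1.0)

# Flipping the sign of one column makes the set left-handed:
# the matrix is still orthogonal, but now an improper rotation.
S = R.copy()
S[:, 0] *= -1.0
assert np.allclose(S @ S.T, np.eye(3))
assert np.isclose(np.linalg.det(S), -1.0)
```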

Theorem
A proper rotation matrix R can be factorized thus

$$\mathbf{R} = \mathbf{R}_z (\omega_3 ) \; \mathbf{R}_y (\omega_2 ) \; \mathbf{R}_x (\omega_1 ),$$ which is referred to as the z-y-x parametrization, or as

$$\mathbf{R} = \mathbf{R}_z (\alpha) \; \mathbf{R}_y (\beta ) \; \mathbf{R}_z (\gamma ),$$ the z-y-z Euler parametrization.

Here

$$\mathbf{R}_z (\varphi ) \equiv \begin{pmatrix} \cos \varphi & -\sin \varphi & 0 \\ \sin \varphi & \cos \varphi & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad \mathbf{R}_y (\varphi ) \equiv \begin{pmatrix} \cos \varphi & 0 & \sin \varphi \\ 0 & 1 & 0 \\ -\sin \varphi & 0 & \cos \varphi \end{pmatrix}, \quad \mathbf{R}_x (\varphi ) \equiv \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos \varphi & -\sin \varphi \\ 0 & \sin \varphi & \cos \varphi \end{pmatrix}.$$
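These three generator matrices translate directly into code. A minimal sketch, assuming NumPy; the function names `Rz`, `Ry`, `Rx` mirror the notation of the text:

```python
import numpy as np

def Rz(phi):
    """Rotation about the z-axis by angle phi."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c,  -s,   0.0],
                     [s,   c,   0.0],
                     [0.0, 0.0, 1.0]])

def Ry(phi):
    """Rotation about the y-axis by angle phi."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c,   0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s,  0.0, c]])

def Rx(phi):
    """Rotation about the x-axis by angle phi."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c,  -s],
                     [0.0, s,   c]])

# Each factor is a proper rotation: orthogonal with determinant +1.
for f in (Rz, Ry, Rx):
    M = f(0.7)
    assert np.allclose(M @ M.T, np.eye(3))
    assert np.isclose(np.linalg.det(M), 1.0)
```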

Proof
First the z-y-x parametrization will be proved by describing an algorithm for the factorization of R. To that end, consider

$$\mathbf{R}_z (\omega_3 ) \, \mathbf{R}_y (\omega_2 ) = \begin{pmatrix} \cos \omega_3 \cos \omega_2 & -\sin \omega_3 & \cos \omega_3 \sin \omega_2 \\ \sin \omega_3 \cos \omega_2 & \cos \omega_3 & \sin \omega_3 \sin \omega_2 \\ -\sin \omega_2 & 0 & \cos \omega_2 \end{pmatrix} \equiv (\mathbf{a}_1, \mathbf{a}_2 , \mathbf{a}_3 ).$$ Note that multiplication by $$\mathbf{R}_x(\omega_1)$$ on the right does not affect the first column, so that r1 = a1. Solve $$\omega_2$$ and $$\omega_3$$ from the first column of R,

$$\mathbf{r}_1 = \begin{pmatrix} \cos \omega_3 \; \cos \omega_2 \\ \sin \omega_3 \; \cos \omega_2 \\ -\sin \omega_2 \end{pmatrix}.$$ This is possible. First solve $$\omega_2$$ for $$-\pi/2 \leq \omega_2 \leq \pi/2$$ from

$$\sin \omega_2 = - R_{31} \equiv - (\mathbf{r}_1 )_3.$$ Then solve $$\omega_3$$ for $$0 \leq \omega_3 \leq 2 \pi$$ from

$$\begin{align} \cos \omega_3 &= \frac{R_{11}}{\cos \omega_2} \\ \sin \omega_3 &= \frac{R_{21}}{\cos \omega_2}. \end{align}$$ This step requires $$\cos\omega_2 \neq 0$$; in the degenerate case $$\omega_2 = \pm\pi/2$$ the first column of R determines neither angle separately, and $$\omega_3$$ may be chosen freely (for instance $$\omega_3 = 0$$). The angles $$\omega_2$$ and $$\omega_3$$ determine the vectors a2 and a3.

Since a1, a2 and a3 are the columns of a proper rotation matrix they form an orthonormal right-handed system. The plane spanned by a2 and a3 is orthogonal to $$ \mathbf{a}_1 \equiv \mathbf{r}_1$$ and hence contains $$\mathbf{r}_2$$ and $$\mathbf{r}_3$$. Thus,

$$( \mathbf{r}_2, \mathbf{r}_3 ) = (\mathbf{a}_2 , \mathbf{a}_3 ) \begin{pmatrix} \cos \omega_1 & -\sin \omega_1 \\ \sin \omega_1 & \cos \omega_1 \end{pmatrix}.$$ Since $$\mathbf{r}_2, \mathbf{a}_2 , \mathbf{a}_3$$ are known unit vectors we can compute

$$\begin{align} \mathbf{a}_2 \cdot \mathbf{r}_2 &= \cos \omega_1 \\ \mathbf{a}_3 \cdot \mathbf{r}_2 &= \sin \omega_1. \end{align}$$ These equations give $$\omega_1$$ with $$0 \leq \omega_1 \leq 2 \pi$$. Augment the 2×2 rotation matrix to the 3×3 matrix $$\mathbf{R}_x(\omega_1)$$, then

$$\begin{align} \mathbf{R} \equiv ( \mathbf{r}_1, \mathbf{r}_2 , \mathbf{r}_3 ) &= ( \mathbf{r}_1 , \mathbf{a}_2 , \mathbf{a}_3 ) \, \mathbf{R}_x (\omega_1 ) \\ &= (\mathbf{a}_1, \mathbf{a}_2, \mathbf{a}_3)\, \mathbf{R}_x (\omega_1 ) = \mathbf{R}_z (\omega_3 ) \, \mathbf{R}_y (\omega_2 ) \, \mathbf{R}_x (\omega_1 ). \end{align}$$ This concludes the proof of the z-y-x parametrization.
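The factorization algorithm described in the proof can be sketched in code. A minimal implementation, assuming NumPy and assuming the non-degenerate case $$\cos\omega_2 \neq 0$$; the function name `factor_zyx` and the test angles are illustrative choices:

```python
import numpy as np

def Rz(p):
    c, s = np.cos(p), np.sin(p)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def Ry(p):
    c, s = np.cos(p), np.sin(p)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def Rx(p):
    c, s = np.cos(p), np.sin(p)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def factor_zyx(R):
    """Factor a proper rotation R as Rz(w3) @ Ry(w2) @ Rx(w1),
    following the algorithm in the proof (assumes cos(w2) != 0)."""
    # Step 1: sin w2 = -(r1)_3, with -pi/2 <= w2 <= pi/2.
    w2 = np.arcsin(-R[2, 0])
    # Step 2: cos w3 = R11 / cos w2, sin w3 = R21 / cos w2.
    c2 = np.cos(w2)
    w3 = np.arctan2(R[1, 0] / c2, R[0, 0] / c2)
    # Step 3: build A = Rz(w3) Ry(w2) with columns a1, a2, a3,
    # then cos w1 = a2 . r2 and sin w1 = a3 . r2.
    A = Rz(w3) @ Ry(w2)
    w1 = np.arctan2(A[:, 2] @ R[:, 1], A[:, 1] @ R[:, 1])
    return w3, w2, w1

# Round-trip check on an arbitrary proper rotation.
R = Rz(0.4) @ Ry(-0.9) @ Rx(1.3)
w3, w2, w1 = factor_zyx(R)
assert np.allclose(R, Rz(w3) @ Ry(w2) @ Rx(w1))
```

The round-trip check mirrors the structure of the proof: the first column fixes $$\omega_2$$ and $$\omega_3$$, and the remaining columns fix $$\omega_1$$.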