# Permutation Matrices


In another article we have discussed matrix elimination. However, we have not yet discussed what to do when we encounter a zero in the pivot position. In that case, we exchange rows to bring a non-zero entry into the pivot position. These row operations can be performed by matrices called permutation matrices $\mathbf{P}$. In three dimensions there exist six permutation matrices: \begin{align} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix},\begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix},\begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix},\begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix},\begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix},\begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \tag{1} \end{align} For an $n$-dimensional space, there are $n!$ permutation matrices. If we look at the permutation matrix \begin{align} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix} \end{align} and think about the row operations it performs, we see that the first row stays the same, while the second and third rows are exchanged. Using permutation matrices, we can write the LU decomposition in a more general form that allows row exchanges whenever a zero occurs in a pivot position: \begin{align} \mathbf{P}\mathbf{A}&=\mathbf{L}\mathbf{U}\\ \end{align} Permutation matrices have the special property that their transpose is also their inverse: \begin{align} \mathbf{P}^{-1}=\mathbf{P}^{T} \end{align}
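The row exchange and the transpose-is-inverse property can be checked numerically. A minimal sketch using NumPy (the matrix $\mathbf{A}$ below is just an arbitrary example, not taken from the text):

```python
import numpy as np

# Permutation matrix from above: keeps row 1, swaps rows 2 and 3
P = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]])

# An arbitrary example matrix
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Left-multiplying by P exchanges rows 2 and 3 of A
PA = P @ A

# The transpose of P is also its inverse: P^T P = I
identity_check = P.T @ P
```

Applying `P` on the left permutes rows; applying it on the right (`A @ P`) would permute columns instead.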

All permutation matrices of a given dimension (e.g. those in (1)) form a group: if we multiply two permutation matrices from (1), the result is again a matrix from (1). The identity matrix serves as the neutral element, and each matrix's inverse is its transpose, which also belongs to the set.
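The closure property can be verified exhaustively for the $3 \times 3$ case. A sketch that builds all $3! = 6$ permutation matrices by permuting the rows of the identity and checks that every pairwise product stays in the set:

```python
import itertools
import numpy as np

n = 3
# All n! permutation matrices: one per permutation of the rows of I
perms = [np.eye(n, dtype=int)[list(p)]
         for p in itertools.permutations(range(n))]

# Closure: the product of any two permutation matrices is again one of them
closed = all(
    any(np.array_equal(p @ q, r) for r in perms)
    for p in perms for q in perms
)
```

The same construction works for any $n$, though the check grows as $(n!)^2$ products.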