The different algorithms for computing the QR decomposition can be easily remembered according to this scheme:

Either one thinks of creating the Q matrix from the initial matrix A. The simplest approach is then to go through the columns and orthogonalize each against all previous ones (Gram–Schmidt). Because the first column of A is then a multiple of the first column of Q, the second column of A is a linear combination of the first two columns of Q, and so on, one obtains the R matrix as required.
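This column-by-column construction can be sketched as follows in NumPy (a minimal modified Gram–Schmidt; the function name is illustrative):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR via modified Gram-Schmidt: orthogonalize the columns of A
    left to right; the coefficients stripped off along the way form R."""
    m, n = A.shape
    Q = A.astype(float).copy()
    R = np.zeros((n, n))
    for k in range(n):
        R[k, k] = np.linalg.norm(Q[:, k])
        Q[:, k] /= R[k, k]
        # remove the component along q_k from all later columns;
        # the removed coefficients are exactly row k of R
        R[k, k+1:] = Q[:, k] @ Q[:, k+1:]
        Q[:, k+1:] -= np.outer(Q[:, k], R[k, k+1:])
    return Q, R
```

Since column k of A only ever receives contributions from the first k columns of Q, R comes out upper triangular automatically.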

Alternatively, one can consider creating the R matrix from the initial matrix A. In this case, if only orthogonal primitive operations are used, their product automatically yields the Q matrix. Two such primitive orthogonal operations are reflection (Householder) and rotation (Givens).
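A minimal sketch of the Householder variant (assuming for simplicity that no column is already zero below the diagonal): each reflection zeroes out one column of A below the diagonal, and accumulating the reflections gives Q.

```python
import numpy as np

def householder_qr(A):
    """QR by Householder reflections: R is built by zeroing A below the
    diagonal; the accumulated product of the reflections is Q."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(n):
        x = R[k:, k]
        v = x.copy()
        # reflect x onto a multiple of e_1; sign chosen to avoid cancellation
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        v /= np.linalg.norm(v)
        # apply H = I - 2 v v^T to the trailing submatrix of R,
        # and accumulate the same reflection into Q from the right
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R
```

Each Householder matrix is orthogonal and symmetric, so the accumulated Q is orthogonal by construction, and A = QR holds without any explicit orthogonalization of columns.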

If one doesn’t run through the columns left to right, but instead at each step takes the column with the largest contribution to the yet-untouched subspace (in the k-th step, the norm of the vector restricted to its lowest (n-k) components), one ends up with a rank-revealing QR decomposition that terminates as soon as all remaining columns are linearly dependent (and therefore have norm 0 in this sense).
