QR Decomposition Calculator

Orthonormal Q & upper triangular R

Enter a real matrix and this tool computes its QR decomposition using a Gram–Schmidt style algorithm. You get a matrix Q with orthonormal columns, an upper triangular matrix R, and diagnostic checks for A ≈ QR and QᵀQ ≈ I, suitable for teaching, homework and numerical linear algebra workflows.

Gram–Schmidt orthogonalization · least squares ready · orthogonality checks · matrix reconstruction error

Matrix input & decomposition options

Use spaces, tabs, commas or semicolons between entries. Each line is a row. All rows must have the same number of entries. For QR, it is typical (but not required) to have at least as many rows as columns (m ≥ n).
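As an illustration of this input format, a minimal Python parser along these lines might look like the sketch below (the function name parse_matrix is hypothetical and not part of the calculator):

```python
import re

def parse_matrix(text):
    """Parse a matrix where each line is a row and entries are separated by
    spaces, tabs, commas or semicolons (hypothetical helper)."""
    rows = []
    for line in text.strip().splitlines():
        entries = [e for e in re.split(r"[\s,;]+", line.strip()) if e]
        if entries:
            rows.append([float(e) for e in entries])
    if not rows:
        raise ValueError("empty matrix")
    if len({len(row) for row in rows}) != 1:
        raise ValueError("all rows must have the same number of entries")
    return rows

print(parse_matrix("1, 2\n3\t4\n5; 6"))  # [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
```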

The precision setting only affects how results are displayed, not the internal precision of the computation.

Matrix size limits

Recommended up to about 8×8 for didactic use. For larger matrices, use a dedicated numerical library (e.g. MATLAB, NumPy, LAPACK).


Definition of QR decomposition

Let \(A\) be an \(m \times n\) real matrix with \(m \ge n\). A QR decomposition of \(A\) is a factorisation \[ A = Q R, \] where:

  • \(Q\) is an \(m \times n\) matrix whose columns are orthonormal, i.e. \(Q^\top Q = I_n\);
  • \(R\) is an \(n \times n\) upper triangular matrix.
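This definition is straightforward to check numerically. The sketch below uses NumPy's np.linalg.qr purely for illustration; it is based on Householder reflections rather than Gram–Schmidt, so column signs may differ from a Gram–Schmidt result, but the defining properties are the same.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])                  # m = 3 rows, n = 2 columns

Q, R = np.linalg.qr(A, mode="reduced")      # Q is 3x2, R is 2x2

print(np.allclose(Q.T @ Q, np.eye(2)))      # True: columns of Q are orthonormal
print(np.allclose(R, np.triu(R)))           # True: R is upper triangular
print(np.allclose(A, Q @ R))                # True: A = QR up to rounding
```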

Column-space interpretation

If the columns of \(A\) are linearly independent, the columns of \(Q\) form an orthonormal basis for the same column space. The matrix \(R\) encodes how each original column of \(A\) is expressed as a linear combination of these orthonormal basis vectors.

Gram–Schmidt algorithm for QR

One conceptual way to obtain a QR decomposition is to apply the Gram–Schmidt orthogonalization process to the columns of \(A\). Denote the columns of \(A\) by \(\{a_1, a_2, \dots, a_n\}\). The goal is to construct an orthonormal set \(\{q_1, q_2, \dots, q_n\}\) and coefficients \(r_{ij}\) such that \[ a_j = \sum_{i=1}^j r_{ij} q_i, \quad j = 1,\dots,n. \]

Classical Gram–Schmidt

For each column \(a_j\):

  1. Set \(v_j = a_j\).
  2. For \(i = 1, \dots, j-1\), \[ r_{ij} = q_i^\top a_j,\quad v_j \leftarrow v_j - r_{ij} q_i. \]
  3. Set \(r_{jj} = \lVert v_j \rVert\) and \(q_j = v_j / r_{jj}\).

The matrix \(Q\) has columns \(q_j\) and \(R = [r_{ij}]\) is upper triangular.
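A direct transcription of these three steps into NumPy might look like the following sketch (illustrative only; the calculator itself uses the modified variant discussed below):

```python
import numpy as np

def classical_gram_schmidt(A):
    """QR of an m x n matrix A with linearly independent columns (hypothetical helper)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()                  # step 1: v_j = a_j
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]     # step 2: r_ij = q_i^T a_j
            v -= R[i, j] * Q[:, i]          #          v_j <- v_j - r_ij q_i
        R[j, j] = np.linalg.norm(v)         # step 3: r_jj = ||v_j||
        if R[j, j] == 0.0:
            raise ValueError("columns of A are linearly dependent")
        Q[:, j] = v / R[j, j]               #          q_j = v_j / r_jj
    return Q, R
```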

In floating-point arithmetic, modified Gram–Schmidt is usually preferred for better numerical stability. The implementation in this calculator follows a modified Gram–Schmidt style update that is more robust for teaching-size problems.
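For comparison, a modified Gram–Schmidt sketch differs from the classical version only in that each coefficient is computed against the partially orthogonalized vector \(v_j\) rather than the original column \(a_j\), which limits the loss of orthogonality in floating point. This is one common way to write the update, not the calculator's exact code.

```python
import numpy as np

def modified_gram_schmidt(A):
    """QR via modified Gram-Schmidt: coefficients use the running residual v (sketch)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ v           # key difference: project the updated v, not a_j
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R
```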

Applications of QR decomposition

  • Least squares problems — to solve an overdetermined system \(A x \approx b\), you can use \(A = Q R\) and solve the triangular system \(R x = Q^\top b\) (see the sketch after this list).
  • Eigenvalue algorithms — the QR algorithm repeatedly applies QR decompositions to drive a matrix towards an upper triangular (Schur) form whose diagonal reveals the eigenvalues.
  • Numerical stability — orthogonal transformations preserve norms, which helps control rounding errors in linear algebra computations.
  • Orthonormal bases — QR provides orthonormal bases for column spaces, useful in high-dimensional geometry and PCA pipelines.
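A minimal least-squares sketch along the lines of the first item above, assuming \(A\) has full column rank (so that \(R\) is invertible), could be:

```python
import numpy as np

# Hypothetical data: fit y = c0 + c1 * t to four noisy samples.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([np.ones_like(t), t])   # 4x2 design matrix (overdetermined)

Q, R = np.linalg.qr(A, mode="reduced")      # Q is 4x2, R is 2x2 upper triangular
x = np.linalg.solve(R, Q.T @ b)             # solve R x = Q^T b

print(x)  # least squares coefficients; matches np.linalg.lstsq(A, b, rcond=None)[0]
```

Solving the triangular system \(R x = Q^\top b\) avoids forming the normal equations \(A^\top A x = A^\top b\), whose condition number is the square of that of \(A\).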

Interpreting the diagnostic checks

  • Reconstruction error \(\lVert A - Q R \rVert_\infty\) — the maximum absolute entry of the residual matrix. Values close to machine precision (\(\approx 10^{-12}\) to \(10^{-14}\) in double precision) indicate a very accurate factorisation for moderate-sized matrices; see the sketch after this list for one way to compute these errors.
  • Orthogonality error \(\lVert Q^\top Q - I \rVert_\infty\) — measures how close the columns of \(Q\) are to perfectly orthonormal. Larger deviations suggest numerical issues or rank-deficient columns.
  • Small diagonal entries in \(R\) — very small values relative to the norm of \(A\) often indicate near linear dependence among columns, which implies an ill-conditioned least squares problem.
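These checks can be reproduced in a few lines of NumPy, assuming \(Q\) and \(R\) come from any QR routine (here np.linalg.qr on a random test matrix):

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((6, 4))  # random 6x4 test matrix
Q, R = np.linalg.qr(A, mode="reduced")

# Maximum absolute entry of the residual A - QR.
recon_err = np.max(np.abs(A - Q @ R))

# Maximum absolute entry of Q^T Q - I: deviation from orthonormal columns.
orth_err = np.max(np.abs(Q.T @ Q - np.eye(Q.shape[1])))

# Smallest |r_jj| relative to the norm of A: small values flag near-dependent columns.
rel_min_diag = np.min(np.abs(np.diag(R))) / np.linalg.norm(A)

print(recon_err, orth_err, rel_min_diag)
```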

Related linear algebra tools