Matrix Decompositions

Let’s dive into matrix decompositions using SageMath, along with explanations, examples, code, and visualizations.

Why Matrix Decompositions?

Matrix decompositions are like breaking down a complex object into simpler parts. This makes it easier to analyze, solve equations, and understand the underlying structure of the matrix. Each type of decomposition has unique properties and applications.

1. LU Decomposition

  • Concept: Expresses a matrix A as the product of a lower triangular matrix (L) and an upper triangular matrix (U): A = LU. In practice, rows are reordered for numerical stability (partial pivoting), giving A = PLU for a permutation matrix P, which is the form SageMath's LU() returns.

  • Applications:

    • Solving systems of linear equations efficiently (Gaussian elimination).

    • Calculating determinants.

    • Finding matrix inverses.

import numpy as np
from sage.all import *

A = matrix([[2, 1, 1],
           [4, 3, 3],
           [8, 7, 9]])

# LU decomposition
P, L, U = A.LU()

print("\nP:\n")
show(P)
print("\nL:\n")
show(L)
print("\nU:\n")
show(U)

# Verify A == P*L*U (Sage's LU() returns P, L, U with A = P*L*U)
print("\nVerification: A == P*L*U:", np.allclose(A, P*L*U))
P:
\(\displaystyle \left(\begin{array}{rrr} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{array}\right)\)
L:
\(\displaystyle \left(\begin{array}{rrr} 1 & 0 & 0 \\ \frac{1}{4} & 1 & 0 \\ \frac{1}{2} & \frac{2}{3} & 1 \end{array}\right)\)
U:
\(\displaystyle \left(\begin{array}{rrr} 8 & 7 & 9 \\ 0 & -\frac{3}{4} & -\frac{5}{4} \\ 0 & 0 & -\frac{2}{3} \end{array}\right)\)
Verification: A == P*L*U: True

Visualization (LU):

LU decomposition can be visualized as the steps of Gaussian elimination. Each step zeros out elements below the pivot, leading to an upper triangular matrix (U). The multipliers used for elimination are stored in the lower triangular matrix (L).
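
To make the mechanics concrete, here is a minimal sketch (reusing A, P, L, U from the example above; the right-hand side b is an arbitrary choice for illustration). Since A = P*L*U, solving A*x = b reduces to one forward and one backward triangular solve, and the determinant drops out of U's diagonal.

b = vector([1, 2, 3])                  # example right-hand side (arbitrary)

y = L.solve_right(P.transpose() * b)   # forward substitution:  L*y = P^T*b
x = U.solve_right(y)                   # back substitution:     U*x = y
print(A * x == b)                      # True

# det(A) = det(P)*det(L)*det(U); det(L) = 1, so it is
# det(P) (which is +1 or -1) times the product of U's diagonal
print(A.det() == P.det() * prod(U.diagonal()))   # True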

2. QR Decomposition

  • Concept: Expresses a matrix A as the product of an orthogonal matrix (Q), or a unitary one if A is complex, and an upper triangular matrix (R): A = QR.

  • Applications:

    • Solving linear least squares problems.

    • Finding orthonormal bases for subspaces.

    • Computing eigenvalues and eigenvectors using the QR algorithm.

A = matrix(CDF, [[2, 1, 1],
                 [4, 3, 3],
                 [8, 7, 9]])

# QR decomposition
Q, R = A.QR()

# Using the 'n()' method to
# display with 2 digits of precision
print("\nQ (rounded to 2dp):\n")
show(Q.n(digits=2))  
print("\nR (rounded to 2dp):\n")
show(R.n(digits=2))

# Verify A == QR
print("\nVerification: A == QR:", np.allclose(A, Q*R))  
Q (rounded to 2dp):
\(\displaystyle \left(\begin{array}{rrr} -0.22 & -0.82 & -0.53 \\ -0.44 & -0.41 & 0.80 \\ -0.87 & 0.41 & -0.27 \end{array}\right)\)
R (rounded to 2dp):
\(\displaystyle \left(\begin{array}{rrr} -9.2 & -7.6 & -9.4 \\ 0.00 & 0.82 & 1.6 \\ 0.00 & 0.00 & -0.53 \end{array}\right)\)
Verification: A == QR: True

Visualization (QR):

The columns of Q are orthonormal (mutually perpendicular and of unit length). The upper triangular matrix R records how each column of A is built as a linear combination of those orthonormal columns.
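
A minimal sketch of both properties, assuming the A, Q, R from the example above (b is an arbitrary right-hand side chosen for illustration): Q^H*Q should be numerically the identity, and a least-squares solution of A*x ≈ b comes from the triangular system R*x = Q^H*b.

I3 = identity_matrix(CDF, 3)
print(np.allclose(Q.conjugate_transpose() * Q, I3))   # True: columns of Q are orthonormal

b = vector(CDF, [1, 2, 3])                      # example right-hand side (arbitrary)
x = R.solve_right(Q.conjugate_transpose() * b)  # least-squares solution via R*x = Q^H*b
print((A * x - b).norm() < 1e-10)               # True (exact fit here, since A is square and invertible)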

3. Cholesky Decomposition

  • Concept: Expresses a symmetric positive definite matrix A as the product of a lower triangular matrix (L) and its transpose: A = LL^T

  • Applications:

    • Efficiently solving systems of linear equations with symmetric positive definite matrices.

    • Monte Carlo simulations.

    • Nonlinear optimization.

A = matrix([[4, 12, -16],
           [12, 37, -43],
           [-16, -43, 98]])

# Cholesky decomposition
L = A.cholesky()

print("\nL:\n")
show(L)
print("\nLT:\n")
show(L.transpose())

# Verify A == LL^T (Corrected)
print("\nVerification: A == LL^T:", np.allclose(A, L*L.transpose())) 
L:
\(\displaystyle \left(\begin{array}{rrr} 2 & 0 & 0 \\ 6 & 1 & 0 \\ -8 & 5 & 3 \end{array}\right)\)
LT:
\(\displaystyle \left(\begin{array}{rrr} 2 & 6 & -8 \\ 0 & 1 & 5 \\ 0 & 0 & 3 \end{array}\right)\)
Verification: A == LL^T: True

Visualization (Cholesky):

Similar to LU, but only the single triangular factor L has to be computed and stored, which takes roughly half the work and storage of a general LU decomposition.
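
As with LU, the factor is what you actually use. A minimal sketch (reusing A and L from the example above; b is an arbitrary right-hand side): since A = L*L^T, solving A*x = b again reduces to two triangular solves.

b = vector([1, 2, 3])                  # example right-hand side (arbitrary)

y = L.solve_right(b)                   # forward substitution:  L*y = b
x = L.transpose().solve_right(y)       # back substitution:     L^T*x = y
print(A * x == b)                      # True

# det(A) is the squared product of L's diagonal entries
print(A.det() == prod(L.diagonal())**2)   # True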

4. Eigendecomposition

  • Concept: Expresses a square matrix A as the product: A = QΛQ^(-1), where Q is a matrix of eigenvectors, and Λ is a diagonal matrix of eigenvalues.

  • Applications:

    • Understanding the long-term behavior of dynamical systems.

    • Dimensionality reduction (Principal Component Analysis).

    • Image compression.

import numpy as np
from sage.all import *

# Example matrix
A = matrix([[2, 1, 1],
            [4, 3, 3],
            [8, 7, 9]])

# Eigendecomposition: D is the diagonal matrix of eigenvalues and
# P holds the corresponding eigenvectors as columns, so that A*P == P*D
D, P = A.eigenmatrix_right()

# Change the ring to CDF (complex doubles) to accommodate complex eigenvalues
D = D.change_ring(CDF)
P = P.change_ring(CDF)

# Using the 'n()' method to
# display with 2 digits of precision
print("\nEigenvalues:\n")
show(D.n(digits=2))

print("\nEigenvectors:\n")
show(P.n(digits=2))

# Verify A*P == P*D
print("\nVerification: A*P == P*D:", np.allclose(A*P, P*D))
Eigenvalues:
\(\displaystyle \left(\begin{array}{rrr} 0.28 & 0.00 & 0.00 \\ 0.00 & 1.1 & 0.00 \\ 0.00 & 0.00 & 13. \end{array}\right)\)
Eigenvectors:
\(\displaystyle \left(\begin{array}{rrr} 1.0 & 1.0 & 1.0 \\ -4.1 & 1.2 & 2.8 \\ 2.3 & -2.1 & 7.8 \end{array}\right)\)
Verification: A*P == P*D: True
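
A minimal sketch of what the decomposition buys you (reusing A, D, P from the example above): A can be reconstructed as P*D*P^(-1), and matrix powers reduce to powers of the diagonal entries, since A^k = P*D^k*P^(-1).

A_cdf = A.change_ring(CDF)
print(np.allclose(P * D * P.inverse(), A_cdf))        # True: A == P*D*P^(-1)
print(np.allclose(P * D**3 * P.inverse(), A_cdf**3))  # True: A^3 via the decomposition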

Additional Notes:

  • These examples demonstrate basic decompositions. SageMath offers more advanced functions and options (e.g., pivoting strategies for LU, handling complex matrices).

  • Visualizations can be done using libraries like Matplotlib in conjunction with SageMath, as sketched below.
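
As one possibility (a sketch only, reusing the matrix from the LU example; the output file name lu_factors.png is arbitrary), heatmaps of the factors make their triangular structure easy to see:

import matplotlib.pyplot as plt
from sage.all import *

A = matrix([[2, 1, 1],
            [4, 3, 3],
            [8, 7, 9]])
P, L, U = A.LU()

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, (name, M) in zip(axes, [("L", L), ("U", U)]):
    im = ax.imshow(M.numpy(dtype='float64'), cmap="coolwarm")  # convert the Sage matrix to a NumPy array
    ax.set_title(name)
    fig.colorbar(im, ax=ax)
plt.savefig("lu_factors.png")   # or plt.show() in an interactive session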