Let's assume that you would like to construct a matrix $M \in \mathbb{R}^{3 \times 4}$. For example:
$$ M = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \\ \end{pmatrix} $$
Matrices with shapes known at run-time can be created from the values of their components, given in conventional mathematical notation, i.e. row by row, using the `matrix!` macro:
```rust
use mathru::matrix;
use mathru::algebra::linear::matrix::General;

// A 3x4 matrix
let m: General<f64> = matrix![1.0,  2.0,  3.0,  4.0;
                              5.0,  6.0,  7.0,  8.0;
                              9.0, 10.0, 11.0, 12.0];
```
Elements within a row are separated by commas; rows are separated by semicolons.
Operations between two matrices, such as addition, subtraction, and multiplication, require both matrices to have compatible shapes. In particular:
- Addition and subtraction require both matrices to have the same number of rows and the same number of columns.
- Multiplication requires the matrix on the left-hand-side to have as many columns as the number of rows of the matrix on the right-hand-side.
These restrictions are checked at run-time, and the operation panics in case of a shape mismatch:
```rust
use mathru::algebra::linear::matrix::General;

pub fn main()
{
    // 2x3 matrix
    let a: General<f64> = General::zero(2, 3);

    // 4x4 matrix
    let b: General<f64> = General::zero(4, 4);

    let _ = a * b; // Compiles fine, but panics at run-time
}
```
Matrices with special symmetries and structures often arise in linear algebra and are associated with various matrix factorizations. Mathru features a collection of special matrix types, which allow for fast computation with specialized routines for particular matrix types.
The following tables summarize the types of special matrices that have been implemented:
| Type | Description |
|---|---|
| UpperTriangular | Upper triangular matrix |
| UnitUpperTriangular | Upper triangular matrix with unit diagonal |
| LowerTriangular | Lower triangular matrix |
| UnitLowerTriangular | Lower triangular matrix with unit diagonal |
| UpperHessenberg | Upper Hessenberg matrix |
| Diagonal | Diagonal matrix |
Matrix decomposition is a factorization of a matrix into a product of matrices. These factors can allow more efficient operations, such as inversion or the solution of linear systems, and may provide insight into intrinsic properties of the data being analysed (e.g. by inspecting singular values, eigenvectors, etc.). For instance, when solving a system of linear equations $Ax=b$, the matrix $A$ can be decomposed via the LU decomposition, which factorizes $A$ into a lower triangular matrix $L$ and an upper triangular matrix $U$. The two triangular systems $Ly = b$ and $Ux = y$ require fewer additions and multiplications to solve than the original system $Ax=b$.
The following table summarizes the types of matrix decompositions that have been implemented:
| Type | Description |
|---|---|
| CholeskyDec | Cholesky factorization |
| LUDec | LU factorization |
| QRDec | QR factorization |
| HessenbergDec | Hessenberg decomposition |
| EigenDec | Spectral decomposition |
| SchurDec | Schur decomposition |
LU decomposition factorizes a matrix $A$ into the product of a lower triangular matrix $L$ and an upper triangular matrix $U$. For numerical stability, the factorization also includes a permutation matrix $P$:
$$PA = LU$$
Every square matrix can be factorized in this form.
The Cholesky decomposition of a square $n \times n$ symmetric positive-definite matrix $A$ is composed of an $n \times n$ lower triangular matrix $L$ such that $A = LL^{\ast}$, where $L^{\ast}$ denotes the conjugate transpose of $L$. If the input matrix $A$ does not fulfil these preconditions, such a decomposition does not exist and `dec_cholesky` returns `None`.
The QR decomposition of a general $m \times n$ matrix $A$ is composed of an $m \times \min(m, n)$ matrix $Q$ with orthonormal columns, and a $\min(m, n) \times n$ upper triangular matrix $R$ such that $A = QR$.
The Hessenberg decomposition of a square matrix $A$ is composed of an orthogonal matrix $Q$ and an upper Hessenberg matrix $H$ such that $A = QHQ^{\ast}$.
Given a system of linear equations in matrix form
$$Ax=b$$
we want to solve the equation for $x$, given $A$ and $b$, where $A$ is a square matrix and $b$ is a column vector. The solution $x$ is a column vector as well.