You don't need matrix inversion.
Kind of, sometimes anyway.
Speed up the inversion of definite matrices using the code in (6).
Limitations
This trick only works for definite matrices and you need to know the definiteness. However, I think this is commonly the case (think covariances etc.).
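Since the trick relies on definiteness, it helps to be able to check it up front. Here is a minimal sketch, assuming Eigen (which the snippets below appear to use): Eigen's LLT factorization reports a numerical issue when the matrix is not positive definite, so a failed factorization doubles as a definiteness test. The helper name isPositiveDefinite is mine, not from the post.

#include <Eigen/Dense>
#include <iostream>

// Attempt a Cholesky factorization; Eigen's LLT reports NumericalIssue
// when the matrix is not positive definite (it only reads one triangle,
// so M is assumed to be symmetric).
bool isPositiveDefinite(const Eigen::MatrixXd& M) {
    Eigen::LLT<Eigen::MatrixXd> llt(M);
    return llt.info() == Eigen::Success;
}

int main() {
    Eigen::MatrixXd M(2, 2);
    M << 4.0, 1.0,
         1.0, 3.0;                               // positive definite
    std::cout << isPositiveDefinite(M) << "\n";  // prints 1
    std::cout << isPositiveDefinite(-M) << "\n"; // negative definite, prints 0
}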
Generalization

Let's assume you need to invert a matrix $M$ (if $M$ is just a scalar, this is ordinary division) in order to calculate the result of some multiplication with another matrix $B$:

$A = M^{-1} B$ (1)
Closely related tasks are:
- Inverting a matrix: $M^{-1}$
- Calculating a matrix-vector product of the inverse: $M^{-1} b$
- Calculating a matrix-matrix product of the inverse, as in (1): $M^{-1} B$
A naive implementation of (1) might look like this:

A = M.inverse() * B;
(2)
Equations aren't instructions
A pitfall when implementing equations in code is that there is a difference between the assignment operator '=' and the mathematical equals sign '='. The former is an assignment; the latter tells us that its left side is equal to its right side. This allows us to transform (1) into

$M A = B$ (3)

Because $M$ is definite, it admits a Cholesky decomposition $M = L L^T$ with a lower-triangular factor $L$. Substituting this into (3) gives

$L L^T A = B$ (4)
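To make the decomposition in (4) concrete, here is a small sketch (again assuming Eigen, with made-up example values) that extracts the lower-triangular factor $L$ and verifies that $L L^T$ reproduces $M$ up to rounding error.

#include <Eigen/Dense>
#include <iostream>

int main() {
    Eigen::MatrixXd M(3, 3);
    M << 4.0, 2.0, 0.6,
         2.0, 3.0, 0.4,
         0.6, 0.4, 2.0;                     // symmetric positive definite example

    Eigen::LLT<Eigen::MatrixXd> llt(M);     // computes M = L * L^T
    Eigen::MatrixXd L = llt.matrixL();      // lower-triangular Cholesky factor

    // (4) relies on L * L^T reproducing M (up to rounding).
    std::cout << "reconstruction error: "
              << (L * L.transpose() - M).norm() << "\n";
}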
So I have made the problem worse
In addition to the decomposition, there are now more matrix multiplications and the number of inverses hasn't changed. Let's avoid any matrix inversion by applying the same logic from above: define $C := L^T A$ and solve

$L C = B$ (5)
Actually, no
The matrix $L$ has a triangular shape. This means (5) can be trivially solved for $C$ using forward substitution; the remaining system $L^T A = C$ is triangular as well and yields $A$ via backward substitution. An implementation might look like this:

LLT = M.llt();
C = LLT.matrixL().solve(B);
A = LLT.matrixL().transpose().solve(C);
(6)
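For completeness, here is a self-contained sketch of how (6) could be used end to end, assuming Eigen. The test matrix $M = X^T X + I$ is an arbitrary way to get something symmetric positive definite, and the result is checked against the naive version (2).

#include <Eigen/Dense>
#include <iostream>

int main() {
    const int n = 100;

    // Arbitrary symmetric positive definite test matrix: X^T * X + I.
    Eigen::MatrixXd X = Eigen::MatrixXd::Random(n, n);
    Eigen::MatrixXd M = X.transpose() * X + Eigen::MatrixXd::Identity(n, n);
    Eigen::MatrixXd B = Eigen::MatrixXd::Random(n, n);

    // Snippet (6): two triangular solves instead of an explicit inverse.
    auto LLT = M.llt();
    Eigen::MatrixXd C = LLT.matrixL().solve(B);              // forward substitution
    Eigen::MatrixXd A = LLT.matrixL().transpose().solve(C);  // backward substitution

    // Sanity check against the naive version (2).
    Eigen::MatrixXd A_naive = M.inverse() * B;
    std::cout << "max difference: " << (A - A_naive).cwiseAbs().maxCoeff() << "\n";
}

As a side note, Eigen's LLT object can also perform both triangular solves in one call via LLT.solve(B), which is usually the most convenient form in practice.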
Is this faster?
Before going into the benchmarks, keep in mind that classic matrix inversion and the technique involving the Cholesky decomposition both operate in $\mathcal{O}(n^3)$. I ran some benchmarks on these problems (a minimal timing sketch follows the list):
- Computing the inverse $M^{-1}$ (with (6), this means solving with $B = I$)
- Computing $M^{-1} b$ where $b$ is a vector
- Computing $M^{-1} B$ where $B$ is a matrix
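This is not the benchmark harness from the post, just a minimal timing sketch (assuming Eigen and std::chrono) of how the matrix-matrix case from the list above could be compared; matrix size and construction are arbitrary.

#include <Eigen/Dense>
#include <chrono>
#include <iostream>

// Run a callable once and return the elapsed time in seconds.
template <typename F>
double timeOnce(F&& f) {
    auto start = std::chrono::steady_clock::now();
    f();
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(stop - start).count();
}

int main() {
    const int n = 500;
    Eigen::MatrixXd X = Eigen::MatrixXd::Random(n, n);
    Eigen::MatrixXd M = X.transpose() * X + Eigen::MatrixXd::Identity(n, n);
    Eigen::MatrixXd B = Eigen::MatrixXd::Random(n, n);
    Eigen::MatrixXd A(n, n);

    double tInverse = timeOnce([&] { A = M.inverse() * B; });    // naive (2)
    double tCholesky = timeOnce([&] {                            // snippet (6)
        auto LLT = M.llt();
        Eigen::MatrixXd C = LLT.matrixL().solve(B);
        A = LLT.matrixL().transpose().solve(C);
    });

    std::cout << "inverse:  " << tInverse << " s\n"
              << "cholesky: " << tCholesky << " s\n";
}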