The helical coordinate
It is easy to reinvent the Cholesky factorization algorithm. To do so, simply write out all the components of a 2x2 upper-triangular matrix U and then explicitly multiply these elements by those of the transpose matrix U^T. You then find you have everything you need to recursively build the elements of U from the elements of the product A = U^T U. Likewise for a 3x3 matrix, and so on.
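The recursion described above can be sketched as follows. This is a minimal illustration, not a production routine; the function name `cholesky_upper` is my own, and the loop simply equates elements of U^T U to elements of A, working from the top-left corner down, exactly as one would by hand in the 2x2 case.

```python
import numpy as np

def cholesky_upper(A):
    """Upper-triangular factor U with A = U.T @ U, built recursively:
    each element of U follows from A and previously computed elements."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    U = np.zeros_like(A)
    for i in range(n):
        # Diagonal element: here the square root is unavoidable.
        U[i, i] = np.sqrt(A[i, i] - U[:i, i] @ U[:i, i])
        # Off-diagonal elements of row i follow by division.
        for j in range(i + 1, n):
            U[i, j] = (A[i, j] - U[:i, i] @ U[:i, j]) / U[i, i]
    return U

A = np.array([[4.0, 2.0],
              [2.0, 5.0]])
U = cholesky_upper(A)   # U.T @ U reproduces A
```

For the 2x2 example above, equating elements gives U[0,0] = 2, U[0,1] = 1, and U[1,1] = 2, and indeed U^T U = A.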
The 2x2 case shows that the Cholesky algorithm requires square roots. Matrix elements are not always numbers. Sometimes, matrix elements are polynomials, such as Z-transforms. To avoid square roots, there is a variation of the Cholesky method. In this variation, we factor A into A = L D L^T, where L is a unit lower-triangular matrix and D is a diagonal matrix.
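The square-root-free variation can be sketched in the same recursive style. Again this is only an illustrative sketch (the name `ldlt` is mine): note that the body contains only multiplications, subtractions, and divisions, which is what makes the method usable when matrix elements are polynomials rather than numbers.

```python
import numpy as np

def ldlt(A):
    """Square-root-free factorization A = L @ diag(d) @ L.T,
    with L unit lower triangular and d the diagonal of D."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    d = np.zeros(n)
    for j in range(n):
        # Diagonal entry of D: no square root needed.
        d[j] = A[j, j] - (L[j, :j] ** 2) @ d[:j]
        # Column j of L below the diagonal.
        for i in range(j + 1, n):
            L[i, j] = (A[i, j] - (L[i, :j] * L[j, :j]) @ d[:j]) / d[j]
    return L, d

A = np.array([[4.0, 2.0],
              [2.0, 5.0]])
L, d = ldlt(A)   # L @ diag(d) @ L.T reproduces A
```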
Once a matrix has been factored into upper and lower triangles, solving simultaneous equations is simply a matter of two back substitutions. (We looked at a special case of back substitution with Equation ().) For example, we often encounter simultaneous equations of the form A x = d. Suppose the positive-definite matrix A has been factored into triangle form A = U^T U. To find x, first backsolve U^T y = d for the vector y. Then, backsolve U x = y. When A happens to be a band matrix, the first back substitution is filtering down a helix, and the second is filtering back up it. Polynomial division is a special case of back substitution.
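The two back substitutions can be sketched directly. This is a minimal illustration with hypothetical helper names: the forward solve runs down the rows (filtering "down"), the back solve runs up them (filtering "back up"), and together they solve A x = d given the triangular factors.

```python
import numpy as np

def solve_lower(L, d):
    """Solve L y = d by forward substitution (filtering down)."""
    n = len(d)
    y = np.zeros(n)
    for i in range(n):
        y[i] = (d[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def solve_upper(U, y):
    """Solve U x = y by back substitution (filtering back up)."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# A banded positive-definite example; NumPy's Cholesky returns the
# lower factor L with A = L @ L.T, so U = L.T.
A = np.array([[4.0, 2.0, 0.0],
              [2.0, 5.0, 1.0],
              [0.0, 1.0, 3.0]])
d = np.array([1.0, 2.0, 3.0])
L = np.linalg.cholesky(A)
y = solve_lower(L, d)      # first back substitution
x = solve_upper(L.T, y)    # second back substitution; A @ x == d
```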
Poisson's equation requires boundary conditions, which we can honor when we filter starting from both ends. We cannot simply solve Poisson's equation as an initial-value problem: we could insert the Laplace operator into the polynomial-division program, but the solution would diverge.