Svd #1248
base: master
Conversation
method of Jordan-Gauss triangularisation of the matrix
check if all diagonal values are not null
get the matrix solution of the equation
remove comments and console.log
Using Jordan-Gauss elimination to solve the linear equation AX = B
Compute the determinant of a matrix
Compute the adjoint of a matrix
FEATURE: solve linear equation
using dot instead of mul
adding unit test
errors in algorithms as u*s*v is not close to m
solve the equation us * x = m to compute v
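These commits describe solving AX = B by Jordan-Gauss (Gauss-Jordan) elimination with a check that the diagonal stays non-null. As a rough sketch of that technique only (plain TypeScript on number arrays; the function name, partial pivoting, and error handling are illustrative assumptions, not the PR's code):

```ts
// Sketch: solve A x = b by Gauss-Jordan elimination with partial pivoting.
// `a` is an n x n matrix, `b` a length-n right-hand side. Illustrative only.
function gaussJordanSolve(a: number[][], b: number[]): number[] {
  const n = b.length;
  // Work on the augmented matrix [A | b] so row operations apply to both.
  const m = a.map((row, i) => [...row, b[i]]);
  for (let col = 0; col < n; col++) {
    // Pick the row with the largest absolute entry in this column (pivoting).
    let pivot = col;
    for (let r = col + 1; r < n; r++) {
      if (Math.abs(m[r][col]) > Math.abs(m[pivot][col])) pivot = r;
    }
    if (m[pivot][col] === 0) throw new Error('matrix is singular');
    [m[col], m[pivot]] = [m[pivot], m[col]];
    // Normalise the pivot row, then eliminate this column from all other rows.
    const p = m[col][col];
    for (let c = col; c <= n; c++) m[col][c] /= p;
    for (let r = 0; r < n; r++) {
      if (r === col) continue;
      const f = m[r][col];
      for (let c = col; c <= n; c++) m[r][c] -= f * m[col][c];
    }
  }
  // After full elimination the last column holds the solution x.
  return m.map(row => row[n]);
}
```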
Hi @kedevked, it's awesome that You've started the implementation of some of the most crucial Linear Algebra operations. I'm afraid, however, that the implementation of the QR […]
There is only one viable Schur Decomposition Algorithm for the non-symmetric Eigenvalue Problem that I could find: the Doubly-Shifted Francis QR Algorithm. There are two JavaScript implementations of the Francis Algorithm that I know of: Numeric.js and NDJS. The latter contains some literature references that I found very helpful while implementing it. Maybe they will help You as well. Alternatively, I can offer to try and incorporate my implementation from NDJS into TFJS.
@DirkToewe, tfjs already supports QR decomposition using the Householder transformation. I used that QR decomposition and implemented the algorithm described here: http://www.math.tamu.edu/~dallen/linear_algebra/chpt6.pdf. I don't really think that on top of that we need the Schur decomposition before computing the SVD.
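For reference, the identity behind that chapter's A·Aᵀ approach (a standard fact, not a quote from the thread):

$$A = U \Sigma V^{\mathsf T} \;\Rightarrow\; A A^{\mathsf T} = U \Sigma V^{\mathsf T} V \Sigma^{\mathsf T} U^{\mathsf T} = U \,(\Sigma \Sigma^{\mathsf T})\, U^{\mathsf T},$$

so the eigenvectors of AAᵀ give U, and the square roots of its eigenvalues are the singular values of A.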
So You're exporting […]
The PR is about creating a new operator tf.svd that will return the SVD decomposition of a given matrix. Therefore, I am not exporting the eigenvalues and eigenvectors only for testing.
@kedevked hey, thanks for the SVD implementation! I see there is a TSLint error, and I made a PR to your fork that fixes it. I wish to use this API soon in my project!
That's great, @JasonShin. Hopefully @caisq could review it or give us a glimpse of how to proceed to make it part of the API.
Sorry to bring this up again, but I am still wondering about a few points: […]
Python TensorFlow implements SVD as well, so why not have it in tfjs-core too?
QR decomposition, QR iteration, and the (Francis implicitly shifted) QR method are three distinct algorithms in my understanding (see: Linear Algebra Handbook, chap. 42, "Symmetric Matrix Eigenvalue Techniques"). If I understand @kedevked's code correctly, the QR iteration is used to solve the symmetric positive semidefinite eigenvalue problem AAᵀ, to then solve the SV problem of A with it. Only one of my points was addressed to the QR decomposition (the one about performance). The other points are all directed at the implementation of […] Here are the lines of code that make me believe QR iteration is used:

```ts
// ...
function svd_(m: Tensor2D): {u: Tensor, s: Tensor, v: Tensor} {
  const mT = m.dot(transpose(m));
  const u = eingen(mT).vectors;
  // ...
```

```ts
// ...
function eingen_(m: Tensor): {values: Tensor1D, vectors: Tensor} {
  let z;
  for (let i = 0; i < 5; i++) {
    const [x, y] = linalg.qr(m);
    m = y.dot(x);
    z = z ? z.dot(x) : x;
    y.dispose();
  }
  // ...
```
A few more points that came to my attention looking at the PR for the fourth time: […]
As always, I may very well be mistaken in all of this. P.S.: To my knowledge, nobody is opposed to having SVD in TFJS, certainly not me. I was hoping that my first comment made that clear.
@DirkToewe, I had a look at the article you provided for the shifting, but it does not provide a way to choose μ. Are you considering implementing the shifting? I could continue the implementation if you can suggest an article where I can find a concrete way to choose μ or to implement the shifting. Thanks!
This is a rabbit hole that will probably lead You to the Golub-Reinsch algorithm or the Francis QR algorithm for nonsymmetric eigen problems. The best shift would be an actual eigenvalue, because then we would have convergence in exactly n steps. Of course we don't know the eigenvalues; if we knew the eigenvalues we wouldn't need the QR iteration in the first place. So instead of the actual eigenvalue a good estimation of the eigenvalue is used. In order to get a good estimation of the eigenvalue of a matrix, it should be as close to triangular as possible (or diagonal in the symmetric case). The closest You can get to triangular with deterministic algorithms is the Hessenberg decomposition, which is usually done before the QR iteration. Given the Hessenberg matrix, the eigenvalues are usually estimated from the lowest, rightmost 2×2 matrix. Here's the corresponding quote from Golub and Reinsch:

"The shift parameter s is determined by an eigenvalue of the lower 2×2 minor of M. Wilkinson [t3] has shown that for this choice of s, the method converges globally and almost always cubically."
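To make the shift choice concrete, here is a minimal sketch in plain TypeScript (the function name and the assumption of a symmetric trailing 2×2 block [[a, b], [b, c]] are mine for illustration, not code from this PR):

```ts
// Sketch: Wilkinson-style shift taken from the trailing 2x2 block
// [[a, b], [b, c]] of a symmetric (tridiagonalised/Hessenberg) matrix.
// Returns the eigenvalue of that block closest to c, in a numerically
// stable form. Illustrative only.
function wilkinsonShift(a: number, b: number, c: number): number {
  const d = (a - c) / 2;
  // Eigenvalues of the block are c + d ± sqrt(d² + b²); pick the one near c.
  const sign = d >= 0 ? 1 : -1;
  return c - (sign * b * b) / (Math.abs(d) + Math.hypot(d, b));
}
```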
I will take the time to go through the suggested algorithms so as to come up with a sound implementation. Thanks!
For symmetric eigenvalue problems (like the SVD) there is also another, simpler algorithm: the (classic) Jacobi eigenvalue algorithm. It also takes O(n⁴) operations, and in my opinion it is much simpler to understand and implement than Francis QR or Golub-Reinsch. There are also O(n³) implementations of the Jacobi SVD, but they are really complicated.
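As a rough illustration of the classic Jacobi iteration mentioned above, here is a minimal sketch in plain TypeScript on number arrays (names and structure are assumptions for illustration, not code from NDJS or this PR):

```ts
// Sketch: one Jacobi rotation applied in place to a symmetric matrix `a`,
// zeroing the off-diagonal entry a[p][q]. Sweeping over all (p, q) pairs
// until the off-diagonal mass is small leaves the eigenvalues on the
// diagonal; accumulating the rotations would give the eigenvectors.
function jacobiRotate(a: number[][], p: number, q: number): void {
  const apq = a[p][q];
  if (apq === 0) return;
  // Rotation angle chosen so that the rotated a[p][q] becomes zero.
  const theta = 0.5 * Math.atan2(2 * apq, a[q][q] - a[p][p]);
  const c = Math.cos(theta);
  const s = Math.sin(theta);
  const n = a.length;
  // Apply the rotation from the right (columns p and q) ...
  for (let k = 0; k < n; k++) {
    const akp = a[k][p];
    const akq = a[k][q];
    a[k][p] = c * akp - s * akq;
    a[k][q] = s * akp + c * akq;
  }
  // ... then from the left (rows p and q): A <- Jᵀ A J.
  for (let k = 0; k < n; k++) {
    const apk = a[p][k];
    const aqk = a[q][k];
    a[p][k] = c * apk - s * aqk;
    a[q][k] = s * apk + c * aqk;
  }
}
```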
Description
Compute the SVD for a matrix m
Use the QR decomposition to compute eigenvalues and eigenvectors
Use the solve function to compute v as the solution of (us)X = m (see the note below)
Use a fixed number of iterations, which can be improved
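The solve step in this description follows directly from the factorisation (worked out here for context; whether the implementation transposes the solution is left to the code itself):

$$m = u\,s\,v^{\mathsf T} \;\Rightarrow\; (u\,s)\,v^{\mathsf T} = m,$$

so solving the linear system (us)X = m gives X = vᵀ, and v is recovered as its transpose.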
FEATURE: Singular-value decomposition (SVD) tensorflow/tfjs#110
For repository owners only:
Please remember to apply all applicable tags to your pull request.
Tags: FEATURE, BREAKING, BUG, PERF, DEV, DOC, SECURITY
For more info see: https://github.com/tensorflow/tfjs/blob/master/DEVELOPMENT.md