What's the conclusion of all this? It looks like, for your application, you really
want inv(VR), and VL alone is not enough. Even if your eigenvalues are distinct,
LAPACK will not guarantee you
VL^H * VR = I.
But (for distinct eigenvalues) VL^H * VR is a diagonal matrix with nonzero diagonal
elements, so you can compute, for each i = 1 to n:
VL(:,i)^H * VR(:,i) = alpha.
Since LAPACK normalizes both VL(:,i) and VR(:,i) to Euclidean norm 1, Cauchy-Schwarz
gives 0 < |alpha| <= 1, with |alpha| = 1 exactly when the left and right eigenvectors
coincide (i.e. for a normal matrix). If |alpha| is very close to 0, that eigenvalue is
ill conditioned (nearly multiple), and that case is harder to handle, so I'll skip it.
Then you just need to divide column i of VL by conj(alpha), or column i of VR
by alpha. (Either one or the other, but not both!) And that should do it.
After this scaling, VL^H * VR = I, so you can read inv(VR) off as VL^H.
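The recipe above can be sketched in Python via scipy.linalg.eig, which wraps LAPACK's ?geev and returns both VL and VR with unit-norm columns; the random 4x4 test matrix is just an illustrative assumption (a generic real matrix has distinct eigenvalues with probability 1):

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # generic matrix: distinct eigenvalues (almost surely)

# eig wraps LAPACK ?geev; vl and vr hold the left/right eigenvectors as
# columns, each normalized to Euclidean norm 1, so VL^H * VR is diagonal
# but generally not the identity.
w, vl, vr = eig(A, left=True, right=True)

# Rescale column i of VL by conj(alpha) so that VL(:,i)^H * VR(:,i) = 1.
for i in range(n):
    alpha = vl[:, i].conj() @ vr[:, i]
    vl[:, i] /= alpha.conj()

# After the scaling, VL^H * VR = I, i.e. VL^H is the inverse of VR.
print(np.allclose(vl.conj().T @ vr, np.eye(n)))
print(np.allclose(vl.conj().T, np.linalg.inv(vr)))
```

Scaling VL (rather than VR) keeps the right eigenvectors at unit norm, which is usually what downstream code expects.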