### A question about projection operator

Posted:

**Fri Oct 07, 2016 8:16 am**
by **leobart**

I have a general matrix which has a certain number of real and a certain number of complex eigenvalues.

I want to project the inverse of this matrix onto the subspace spanned by the eigenvectors corresponding to the real eigenvalues.

Is there some easy way to do this within LAPACK? For example, for the inversion of the matrix I use the dgetrf and dgetri routines, and for the diagonalization I use dgeev. Is there some info stored by these routines that can help me do this?

### Re: A question about projection operator

Posted:

**Mon Dec 12, 2016 10:23 am**
by **Julien Langou**

> I want to project the inverse of this matrix onto the subspace spanned by eigenvectors corresponding to real eigenvalues.

Then, in the basis of those eigenvectors, the resulting operator would simply be diagonal, with the inverses of the real eigenvalues on the diagonal.

I assume this is not the answer you are looking for, so what exactly are you trying to do?

Your question makes me think that you might want to reorder the Schur form so that the real eigenvalues come first.

So something like: DGEES (to get the Schur form), then DTRSEN (to reorder the Schur form). At this point you get Q, an orthonormal basis for the invariant subspace spanned by the eigenvectors corresponding to the real eigenvalues of your matrix.

(Note that Q is not a matrix of eigenvectors, since the eigenvectors are likely not orthogonal. Q is an orthonormal basis for the subspace spanned by the eigenvectors corresponding to the real eigenvalues, which is what you would need to `project on the subspace spanned by eigenvectors corresponding to real eigenvalues.`)
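A minimal sketch of this route in Python, using SciPy's wrappers (`scipy.linalg.schur` with a `sort` callable drives DGEES's eigenvalue selection, which plays the role of the DGEES + DTRSEN pair here; the matrix `A` below is just an illustrative example with known eigenvalues):

```python
import numpy as np
from scipy.linalg import schur, block_diag

# Example matrix: eigenvalues 2, 3 (real) and +/- i (a complex pair),
# hidden behind a random similarity transform.
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))
A = S @ block_diag([[0.0, 1.0], [-1.0, 0.0]], 2.0, 3.0) @ np.linalg.inv(S)

# Real Schur form A = Z T Z^T with the real eigenvalues sorted to the
# top-left; for output='real' the sort callable receives the real and
# imaginary parts of each eigenvalue.
T, Z, sdim = schur(A, output='real', sort=lambda re, im: im == 0.0)

Q = Z[:, :sdim]  # orthonormal basis for the real-eigenvalue invariant subspace
P = Q @ Q.T      # orthogonal projector onto that subspace

print(sdim)                                # number of real eigenvalues
print(np.allclose(Q.T @ Q, np.eye(sdim)))  # Q has orthonormal columns
print(np.allclose(P @ A @ Q, A @ Q))       # span(Q) is invariant under A
```

Since the reordered Schur factor satisfies A Q = Q T11 (with T11 the leading block of T), projecting A Q back onto span(Q) changes nothing, which is what the last check verifies.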

Julien.

### Re: A question about projection operator

Posted:

**Mon Dec 12, 2016 12:50 pm**
by **ilya-fursov**

leobart,

What do you mean by "projection"? Change of basis for the matrix, as Julien suggested, or employing a projection operator on top of your matrix of interest?

If it is the latter, then, denoting by M the inverse of your matrix and by V the desired subspace you want to project onto, you may want something like:

1) P*M (the output of M is projected by P onto V), or

2) P*M*P (the input is projected onto the subspace V, then M acts, then the output is projected again),

where P is the projection operator (matrix) onto V.

Strictly speaking, the input to your new operator should already come from the subspace V, so in case (2) the first projection is not needed and (2) reduces to (1). But an input from the subspace V should be expressed in its own basis, not in the default basis of R^n; this leads to the choice of such a basis and of new basis-transformation matrices. In the end, it all depends on your intentions.

So I'll only give an idea of what to do for case (1), i.e. projecting the output of an arbitrary matrix M onto the subspace V, where V is spanned by some linearly independent vectors v_i (in your case, a subset of the eigenvectors).

Denote by V the matrix whose columns are the vectors v_i.

Here, again, there are several options for how to do the projection:

- use orthogonal projection,

- or use a general (oblique) projection parallel to some subspace U, where the direct sum of U and V is the entire space R^n.

For simplicity I only consider the orthogonal projection.

Consider an arbitrary vector a in R^n. To find P*a, first write a = sum_i {alpha_i * v_i} + v*, where the alpha_i are some coefficients and v* is a vector orthogonal to V. Taking inner products of this equation with each v_j and doing some algebra, we finally get:

P*a = V * (V^H * V)^(-1) * V^H * a,

where V^H is the conjugate transpose of matrix V.
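As a quick numerical check of this formula (the subspace basis `V` and vector `a` below are arbitrary, purely for illustration; for real matrices V^H is just V^T):

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.standard_normal((5, 2))  # columns v_i spanning the subspace
a = rng.standard_normal(5)

# Orthogonal projector P = V (V^H V)^{-1} V^H (real case, so V^H = V^T).
P = V @ np.linalg.inv(V.T @ V) @ V.T

Pa = P @ a
print(np.allclose(P @ P, P))           # P is idempotent: a projector
print(np.allclose(P.T, P))             # and symmetric: an orthogonal projector
print(np.allclose(V.T @ (a - Pa), 0))  # the residual a - P*a is orthogonal to V
```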

Next, calculate P*M = V * (V^H * V)^(-1) * V^H * M. One option is to calculate each factor in this expression explicitly.

Alternatively, noting that (V^H * V)^(-1) * V^H * M is the solution of the normal equations, first find X = (V^H * V)^(-1) * V^H * M by solving the least-squares problem V*X = M (this equality may hold only approximately!), and then calculate V*X, which equals P*M.

There are a number of least-squares solvers in LAPACK, e.g. *GELS, *GELSD, *GELSS.
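A sketch of this least-squares route, using `scipy.linalg.lstsq` as a convenient wrapper around the LAPACK drivers mentioned above (here forced to *GELSD; `V` and `M` are arbitrary illustrative matrices):

```python
import numpy as np
from scipy.linalg import lstsq

rng = np.random.default_rng(2)
V = rng.standard_normal((6, 3))  # basis vectors of the subspace, as columns
M = rng.standard_normal((6, 6))  # the operator whose output we project

# X = (V^T V)^{-1} V^T M, computed by solving the least-squares problem
# V X ~ M (lstsq handles all the columns of M as multiple right-hand sides).
X, _, _, _ = lstsq(V, M, lapack_driver='gelsd')

PM_lstsq = V @ X                                     # P*M via least squares
PM_explicit = V @ np.linalg.inv(V.T @ V) @ V.T @ M   # P*M computed explicitly
print(np.allclose(PM_lstsq, PM_explicit))
```

The least-squares route avoids forming and inverting V^H * V explicitly, which is better behaved numerically when the columns of V are nearly dependent.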