4 posts
• Page **1** of **1**

I need to calculate only the lowest several eigenvalues for a Hermitian complex matrix. Is there a subroutine to do that? Thanks,

- bsmile
**Posts:** 8 • **Joined:** Sat Dec 04, 2010 2:40 pm

ZHEEV and ZHEEVD will compute all eigenvalues. With ZHEEVR and ZHEEVX, you can compute a subset of the eigenvalues (for example, the 10 smallest).

ZHEEVR and ZHEEVX are much faster than ZHEEV and ZHEEVD if (1) you want only a few of the eigenvalues and (2) you want the eigenvectors as well. "A few" is hard to quantify.

If you only want to compute the eigenvalues (and you do not want the eigenvectors), then all four of these routines will perform about the same. They all first reduce the matrix to tridiagonal form and then compute the eigenvalues of the tridiagonal matrix; computing either all or a few of the eigenvalues of a tridiagonal matrix takes negligible time compared to the reduction to tridiagonal form. So if you just want the eigenvalues (and not the eigenvectors), it does not really matter which of the subroutines (ZHEEV, ZHEEVX, ZHEEVR, ZHEEVD) you use.
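As a sketch of the subset computation (assuming SciPy is available): `scipy.linalg.eigh` with `driver="evr"` calls the `?heevr` LAPACK driver discussed above, and `subset_by_index` selects which eigenvalues to return.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 200
# A random Hermitian matrix: B + B^H is Hermitian by construction.
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B + B.conj().T

# The 10 smallest eigenvalues (ascending) and their eigenvectors,
# computed via the ZHEEVR-based driver.
w, v = eigh(A, subset_by_index=[0, 9], driver="evr")

# Eigenvalues only: as noted above, the driver choice then matters little.
w_only = eigh(A, eigvals_only=True, subset_by_index=[0, 9], driver="evr")
```

The returned eigenvalues are sorted in ascending order, so `[0, 9]` picks out the 10 smallest.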

Finally, one limitation of the LAPACK Hermitian eigensolvers is that they all first reduce the matrix to tridiagonal form. There are good reasons for doing so, but it means that the cost of the eigensolver is O(n^3). If you want a faster eigensolver (O(n^2) per eigenvalue) and only have a few eigenvalues to compute, it might be worth [originally "worse"; thanks bsmile for the correction] considering iterative methods, for example PRIMME, LOBPCG, or ARPACK. Software packages like Trilinos and PETSc also include complete iterative eigensolver suites.
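To illustrate the iterative route (a minimal sketch, assuming SciPy is available): `scipy.sparse.linalg.eigsh` wraps ARPACK and computes a few eigenvalues of a large Hermitian matrix without ever forming a dense matrix or reducing it to tridiagonal form.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

n = 1000
# 1-D discrete Laplacian: real symmetric (hence Hermitian), sparse,
# positive definite, with known eigenvalues 4*sin^2(k*pi/(2*(n+1))).
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

# The 5 eigenvalues nearest 0 via shift-invert, i.e. the 5 smallest
# of this positive definite matrix.
w, v = eigsh(A, k=5, sigma=0, which="LM")
w = np.sort(w)
```

The shift-invert option (`sigma=0`) requires a sparse factorization, but it converges much faster for the smallest eigenvalues of a positive definite matrix than asking ARPACK for `which="SA"` directly.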

Julien.

Last edited by Julien Langou on Fri Jan 31, 2014 6:12 pm, edited 1 time in total.

- Julien Langou
**Posts:** 786 • **Joined:** Thu Dec 09, 2004 12:32 pm • **Location:** Denver, CO, USA

Dear Julien,

Thank you very much for the detailed explanation; it really helps! I noticed that you wrote "worse" where you meant "worth", which confused me at first sight. Does this reflect your keen hope that we use the LAPACK subroutines, which are so well written, with industrial-strength robustness and speed? This is exactly why I turned to the forum for help: VASP uses the quick iterative Davidson method to get eigenvalues/eigenstates, but iterative solvers are not robust and can sometimes be a real headache.

- bsmile
**Posts:** 8 • **Joined:** Sat Dec 04, 2010 2:40 pm

Hi,

1) In the context of VASP, I believe the matrices are really large and you only need a few of the smallest eigenvalues, so using Davidson's method (which is not in LAPACK) makes sense, and the method has proven worthy in the past. Newer methods I am aware of are LOBPCG from Knyazev and PRIMME from Stathopoulos.

2) I would be surprised if VASP does not have a fallback to LAPACK to compute all eigenvalues and eigenvectors for small matrices, just for robustness' and comparison's sake. I do not know VASP that well. Maybe you can do this by setting the block size of the Davidson method to the size of the problem.

3) LAPACK routines are good (I should say: excellent! awesome!) for what they are intended for, that is, dense matrices. They are in general not appropriate for sparse matrices (or data-sparse matrices, such as the ones you see in VASP). Sparse (data-sparse) solvers do use LAPACK, but only inside the iterative solvers, for small dense subproblems.

4) [ but iterative solvers are not robust and sometimes can be a real headache ] => This is correct, but dense solvers most of the time will not even allow you to store the matrix, so working on the robustness of iterative solvers makes sense. Hey, these eigenproblems from VASP are tough problems! I am not sure I have an answer!
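For the LOBPCG method mentioned in point 1, a minimal sketch using SciPy's implementation (`scipy.sparse.linalg.lobpcg`), on a toy real symmetric matrix for simplicity; the complex Hermitian case is analogous.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

rng = np.random.default_rng(42)
n, k = 100, 3
# Toy symmetric positive definite matrix with known spectrum 1, 2, ..., n.
A = diags(np.arange(1.0, n + 1.0))

# Random initial block of k vectors; largest=False asks for the smallest
# end of the spectrum.
X = rng.standard_normal((n, k))
w, v = lobpcg(A, X, largest=False, tol=1e-8, maxiter=500)
w = np.sort(w)
```

In practice LOBPCG is most effective with a good preconditioner (the `M` argument), which is where the real work lies for matrices like those in VASP.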

Cheers,

Julien.

- Julien Langou
**Posts:** 786 • **Joined:** Thu Dec 09, 2004 12:32 pm • **Location:** Denver, CO, USA
