ScaLAPACK Archives

[Scalapack] Problem report on MPI BLACS 1.1p3

Thanks a lot for your remark. We will correct that problem in a future release.
Just for info, we just released a very nice BLACS/SCALAPACK Python-based installer.
It is available at
It makes it very easy to install all the libraries needed for SCALAPACK on a machine.
Feel free to try it and give us your feedback so that we can improve it.
Thanks again

On Jan 15, 2008, at 5:32 AM, Tamito KAJIYAMA wrote:

Dear BLACS developers,

I have been using MPI BLACS 1.1p3 together with ScaLAPACK 1.7.4
on several MPI-based computing environments.  First of all, thank you
for providing these valuable pieces of software.

I am writing to let you know about a problem I found in
BLACS/INSTALL/Makefile: the "clean" target (i.e. make clean)
does not remove a symbolic link to mpif.h.

I use several pairs of compilers and MPI libraries with which I build
BLACS from the same source tree.  After I build EXE/xfmpi_sane with
an MPI library, for example, a symbolic link to that library's mpif.h
is made in the INSTALL directory.  The link is left undeleted even if
I do "make clean" and switch to another MPI library, making the results
of MPI-related tests really confusing.
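
For reference, here is a minimal sketch of the kind of fix I have in
mind.  The file list is hypothetical and the actual "clean" target in
BLACS/INSTALL/Makefile removes other files as well; the point is only
the added "rm -f mpif.h" line:

    # Hypothetical excerpt of BLACS/INSTALL/Makefile.
    clean:
            rm -f *.o EXE/x*
            rm -f mpif.h   # also remove the symlink created during the build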

I hope this problem will be fixed in a future release of BLACS, which
would help new users of BLACS a lot.

Best regards,

KAJIYAMA, Tamito <kajiyama@Domain.Removed>

Scalapack mailing list

Julie Langou; Research Associate in Computer Science
Innovative Computing Laboratory;
University of Tennessee from Denver, Colorado ;-)



For additional information you may use the LAPACK/ScaLAPACK Forum or one of the mailing lists.