Optional: Testing the NetSolve 'sparse_iterative_solve' interface
The NetSolve 'sparse_iterative_solve' interface to
PETSc, Aztec, and ITPACK can only be tested if the
user has enabled sparse_iterative_solve in
the $NETSOLVE_ROOT/server_config file and has
configured NetSolve with the respective paths to the PETSc library,
Aztec library, and MPI library.
The PETSc, Aztec, and ITPACK libraries are not
included as default numerical software for the server,
and must be installed separately (as well as MPI).
Refer to the Section called Enabling Sparse Iterative Solvers (PETSc, Aztec, and ITPACK) in Chapter 13 for further details.
This interface can be tested
most effectively using sparse matrices from collections
such as the Harwell-Boeing test collection on the Matrix Market homepage.
Refer to the section of the webpage entitled Software,
where routines for reading the test matrices are available in C, Fortran, and Matlab. For
ease of testing, several of the test matrices from this collection are
included in the distribution of NetSolve.
After Matlab has been invoked,
the user can run the test scripts petsc_test.m,
aztec_test.m, and itpack_test.m in the $NETSOLVE_ROOT/src/Testing/matlab/ directory, by typing

>> petsc_test
>> aztec_test
>> itpack_test

These scripts invoke the PETSc, Aztec, and ITPACK interfaces and check the
validity of the computed solution.
Alternatively, the user can generate a series of Harwell-Boeing matrix
types (1-5) using the generate.m script. To
see the list of Harwell-Boeing matrix types that can be generated, type

>> help generate
and then call the functions petsc.m and aztec.m, for example:
>> [A,rhs] = generate(1);
>> [x1,its1] = petsc(A,rhs);
>> [x2,its2] = aztec(A,rhs);
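
The computed solutions can also be checked by hand. As a quick sanity
check (this snippet assumes the A, rhs, x1, and x2 variables produced
above, and uses Matlab's built-in norm function), the relative
residuals can be inspected:

>> norm(A*x1 - rhs)/norm(rhs)   % residual of the PETSc solution
>> norm(A*x2 - rhs)/norm(rhs)   % residual of the Aztec solution

Values near the solvers' convergence tolerance indicate a valid solution.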
Note that the user can query the calling sequence (the list of
arguments) of each routine by using the NetSolve tool routine.
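
As an illustration only (this assumes the NetSolve Matlab client is on
the Matlab path and that the interface is registered under the problem
name sparse_iterative_solve; the exact query syntax is described in the
chapter on the Matlab interface), such a query might look like:

>> netsolve('sparse_iterative_solve')

which prints the problem description, including its input and output arguments.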