Can routine getri: LU inverse use multiple GPUs?

Open discussion for MAGMA library (Matrix Algebra on GPU and Multicore Architectures)

Can routine getri: LU inverse use multiple GPUs?

Postby railgun » Mon Jan 09, 2017 4:31 pm

Hi,
I am new to MAGMA. I noticed that some routines have an _mgpu version while others do not. For example, there is no _mgpu variant of the LU inverse routines:
magma_cgetri_gpu, magma_dgetri_gpu, magma_sgetri_gpu, magma_zgetri_gpu.
There is also no input parameter to indicate the number of GPUs. Does this mean these functions cannot use multiple GPUs? If not, how can it be done?
Thank you very much!
railgun
 
Posts: 4
Joined: Mon Jan 09, 2017 2:47 pm

Re: Can routine getri: LU inverse use multiple GPUs?

Postby mgates3 » Thu Jan 12, 2017 3:07 pm

There are several possible interfaces in MAGMA, for instance:
Code:
    magma_zgetrf       # matrix A in CPU memory; single or multi-GPU.
    magma_zgetrf_m     # matrix A in CPU memory, internal multi-GPU, out-of-GPU-memory implementation.
    magma_zgetrf_gpu   # matrix dA in GPU memory.
    magma_zgetrf_mgpu  # matrix dA distributed over multiple GPU memories, multi-GPU implementation.

If environment variable $MAGMA_NUM_GPUS is set, magma_zgetrf will call magma_zgetrf_m. Also, if matrix A doesn't fit in GPU memory, magma_zgetrf will call magma_zgetrf_m for the out-of-GPU-memory capability. So usually, just call magma_zgetrf. Using magma_zgetrf_mgpu is more difficult because you have to distribute the matrix to multiple GPUs beforehand.
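As a minimal sketch of that usage (the matrix size, the use of pinned host memory, and the magma_v2.h header for MAGMA 2.x are illustrative assumptions, not requirements of the API):

Code:
    // Sketch: factor a CPU-resident matrix with the CPU-interface magma_dgetrf.
    // Multi-GPU use is controlled outside the call, e.g. by setting
    // MAGMA_NUM_GPUS=2 in the shell before launching the program.
    #include <stdio.h>
    #include "magma_v2.h"

    int main( void )
    {
        magma_init();

        magma_int_t n = 10000, lda = n, info = 0;   // size chosen only for illustration
        double *A;
        magma_int_t *ipiv;

        // Pinned host memory speeds up CPU<->GPU transfers; plain malloc also works.
        magma_dmalloc_pinned( &A, (size_t) lda * n );
        magma_imalloc_cpu( &ipiv, n );

        // ... fill A with your data here ...

        // LU factorization; MAGMA decides internally whether to use one GPU,
        // multiple GPUs (via $MAGMA_NUM_GPUS), or the out-of-GPU-memory path.
        magma_dgetrf( n, n, A, lda, ipiv, &info );
        if ( info != 0 ) {
            printf( "magma_dgetrf failed, info = %lld\n", (long long) info );
        }

        magma_free_pinned( A );
        magma_free_cpu( ipiv );
        magma_finalize();
        return 0;
    }

The same binary can then be run with MAGMA_NUM_GPUS=2 (for example) to use two GPUs for the factorization, without any code changes.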

There is currently no multi-GPU getri. Only the first step (getrf) is multi-GPU. (I just noticed there is no magma_zgetri CPU interface, either. Hopefully we can remedy that before the next release.)

Incidentally, why do you want getri? It is usually a bad idea to explicitly invert a matrix. If you are computing X = A^{-1}*B, which is the same as solving A*X = B, it is both faster and more accurate to use gesv, which is getrf + getrs, rather than getrf + getri + gemm.
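A sketch of that approach with the CPU-interface driver magma_dgesv (sizes, right-hand-side count, and data are made-up assumptions for illustration):

Code:
    // Sketch: compute X = A^{-1}*B by solving A*X = B with magma_dgesv
    // (getrf + getrs), instead of getrf + getri + gemm.
    #include <stdio.h>
    #include "magma_v2.h"

    int main( void )
    {
        magma_init();

        magma_int_t n = 5000, nrhs = 100, lda = n, ldb = n, info = 0;
        double *A, *B;            // A is n x n; B is n x nrhs and is overwritten by X
        magma_int_t *ipiv;

        magma_dmalloc_pinned( &A, (size_t) lda * n    );
        magma_dmalloc_pinned( &B, (size_t) ldb * nrhs );
        magma_imalloc_cpu( &ipiv, n );

        // ... fill A and B with your data here ...

        magma_dgesv( n, nrhs, A, lda, ipiv, B, ldb, &info );
        if ( info != 0 ) {
            printf( "magma_dgesv failed, info = %lld\n", (long long) info );
        }

        magma_free_pinned( A );
        magma_free_pinned( B );
        magma_free_cpu( ipiv );
        magma_finalize();
        return 0;
    }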

-mark
mgates3
 
Posts: 750
Joined: Fri Jan 06, 2012 2:13 pm

Re: Can routine getri: LU inverse use multiple GPUs?

Postby railgun » Fri Jan 13, 2017 12:57 pm

Hi Mark,

I see. I'll take your advice and give it a try.
Thank you very much!
railgun
 
Posts: 4
Joined: Mon Jan 09, 2017 2:47 pm

