magma-1.4.0-beta2: numerical problem with testing_sgetri_gpu

Open discussion for MAGMA library (Matrix Algebra on GPU and Multicore Architectures)

Postby waitzman » Wed Jul 17, 2013 3:30 pm

When I run
Code:
testing_sgetri_gpu --range 100:100:1 --niter 1000 -c

I occasionally get bad accuracy.
Here is a tally of the results: the left column is the number of occurrences and the right column is the reported error. The bad error values (nan, ±inf, and ±1.84e+19) were highlighted in bold in the original post.

32 -2.94e-39
33 2.94e-39
51 1.84e+19
59 5.42e-20
64 nan
71 -1.84e+19
77 -5.42e-20
81 1.00e+00
87 0.00e+00
96 -1.00e+00
106 -0.00e+00
111 -inf
132 inf

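For reference, a tally like the one above can be built by counting the per-iteration error values printed by the tester. This is a hedged sketch in Python, not part of the MAGMA test suite; the sample values below are illustrative placeholders, not real tester output, and how you extract the error column from `testing_sgetri_gpu`'s output is up to you.

```python
# Sketch: count occurrences of each distinct reported error value,
# then print "<count> <value>" pairs sorted by count, as in the list above.
# The `errors` list stands in for values scraped from the tester's output.
from collections import Counter

errors = ["2.94e-39", "nan", "inf", "2.94e-39", "0.00e+00", "inf"]

tally = Counter(errors)
for value, count in sorted(tally.items(), key=lambda kv: kv[1]):
    print(f"{count:4d} {value}")
```

Counting the distinct values this way makes the anomalies (nan, inf, huge magnitudes) easy to spot even across thousands of iterations.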
I sometimes get bad accuracy at other matrix sizes as well.

I am running on an iMac 27" 2012 (OS X 10.8.4) with its NVIDIA GeForce GTX 675MX and an Intel Core i5 at 3.2 GHz with 16GB of RAM.

I tried the getrf_m.tar patch from the topic "magma-1.4.0-beta2: numerical problems [dz]getrf", but it did not help (and it appears to target an older version of MAGMA).