I think the GPUArrays.jl implementation just uses the scalar multiplication `Base.:*(::AbstractFloat, ::Bool)`, so it should be consistent with the `rmul!` implementation in LinearAlgebra.jl.
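On the CPU this is easy to check (a quick sketch, with plain `Vector`s standing in for `CuArray`s):

```julia
using LinearAlgebra

# Generic rmul! scales elementwise with *, so a Bool scalar inherits
# the "strong zero" semantics of Base.:*(::AbstractFloat, ::Bool).
x = [NaN, 1.0]
rmul!(x, false)
# x == [0.0, 0.0], since NaN*false == 0.0
```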
In https://github.com/tam724/CUDA.jl/tree/fix_rmul! I went through the "libraries/cublas/level 1" testset and included random CuArrays with NaN entries. If a tested function accepts a scalar factor, it is additionally tested with `NaN`, `true`, and `false`.
This broke the tests for `rmul!` (fixed), `axpby!`, `rotate!`, and `reflect!`.
I think it's hard to say how this should be treated in general, because even `NaN*false = 0.0` appeared to be undocumented behavior in Julia (described only in a comment).
EDIT: it is documented: https://docs.julialang.org/en/v1/manual/mathematical-operations/#Arithmetic-Operators
Maybe the "strong zero" should not be relied on, because even in LinearAlgebra.jl something like this happens:
```julia
julia> using LinearAlgebra

julia> rotate!([NaN], [NaN], false, false) # rotate!(x, y, c, s) means: x = c*x + s*y and y = -conj(s)*x + c*y
([0.0], [NaN])
```
But a use case for `rotate!` with `false` arguments is hard to imagine.
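The asymmetric result above seems to come from the negation: `conj(false)` is still `false`, but `-false` promotes to the integer `0`, which is an ordinary zero rather than a strong one (my reading of the generic `rotate!` code; quick check):

```julia
julia> false * NaN          # Bool false is a strong zero
0.0

julia> -conj(false) * NaN   # -false promotes to Int 0, an ordinary zero
NaN
```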
See the following MWE:
In Julia Base, `false` is defined as a "strong zero", see https://github.com/JuliaLang/julia/blob/5e9a32e7af2837e677e60543d4a15faa8d3a7297/base/bool.jl#L178. Hence `NaN*false == 0.0` and `NaN*true === NaN`. For consistency, the following dispatch could be defined for `Bool`. This would bypass the fallback from `rmul!` to `scal!` defined in CUDA.jl/lib/cublas/linalg.jl, lines 9 to 14 at 792aec5.
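For illustration, a minimal sketch of what such a `Bool` dispatch could look like (the name `bool_rmul!` and handling the scalar via broadcast instead of forwarding to cuBLAS `scal!` are my assumptions; shown with a plain `AbstractArray` rather than a `CuArray`):

```julia
# Hypothetical sketch: for a Bool scalar, scale via broadcast so that
# Base.:*(::AbstractFloat, ::Bool) applies its strong-zero semantics,
# instead of promoting `false` to 0.0 and forwarding to cuBLAS scal!
# (where 0.0 * NaN == NaN).
function bool_rmul!(x::AbstractArray, s::Bool)
    x .= x .* s
    return x
end

bool_rmul!([NaN, 2.0], false)  # -> [0.0, 0.0]
bool_rmul!([NaN, 2.0], true)   # -> [NaN, 2.0]
```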
I'd be happy to create a PR + tests.