World-age failure in M * v with Colors #109

Closed
timholy opened this issue Feb 21, 2017 · 25 comments

@timholy
Member

timholy commented Feb 21, 2017

I noticed this while trying to get Images to pass tests on 0.6.

Suppressing depwarns from StaticArrays:

julia> using Colors, StaticArrays, ColorVectorSpace

julia> @noinline foo(M, v) = M * v
foo (generic function with 1 method)

julia> M = @SMatrix rand(3,3)
3×3 StaticArrays.SMatrix{3,3,Float64,9}:
 0.35674   0.0902514   0.804478
 0.645144  0.00201846  0.354793
 0.940136  0.176271    0.110509

julia> v = SVector{3}(rand(RGB{Float64}, 3))
3-element StaticArrays.SVector{3,ColorTypes.RGB{Float64}}:
 RGB{Float64}(0.0250308,0.813153,0.433093) 
 RGB{Float64}(0.612239,0.88945,0.304911)   
 RGB{Float64}(0.0285772,0.674416,0.0770198)

julia> foo(M, v)
ERROR: MethodError: no method matching zero(::Type{ColorTypes.RGB{Float64}})
The applicable method may be too new: running in world age 21469, while current world is 21474.
Closest candidates are:
  zero(::Type{C<:ColorTypes.AbstractRGB}) where C<:ColorTypes.AbstractRGB at /home/tim/.julia/v0.6/ColorVectorSpace/src/ColorVectorSpace.jl:172 (method too new to be called from this world context.)
  zero(::Type{Base.LibGit2.GitHash}) at libgit2/oid.jl:106
  zero(::Type{Base.Pkg.Resolve.VersionWeights.VWPreBuildItem}) at pkg/resolve/versionweight.jl:80
  ...
Stacktrace:
 [1] promote_matprod(::Type{Float64}, ::Type{ColorTypes.RGB{Float64}}) at /home/tim/.julia/v0.6/StaticArrays/src/matrix_multiply.jl:7
 [2] *(...) at /home/tim/.julia/v0.6/StaticArrays/src/matrix_multiply.jl:135
 [3] foo(::StaticArrays.SMatrix{3,3,Float64,9}, ::StaticArrays.SVector{3,ColorTypes.RGB{Float64}}) at ./REPL[2]:1

julia> zero(RGB{Float64})
RGB{Float64}(0.0,0.0,0.0)

May be related to ongoing discussions about @pure in #95? I'm on e6d7387. CC @vtjnash.

@andyferris
Member

andyferris commented Feb 21, 2017

Ahh... yeah... this is a big problem at the moment. Sorry Tim!

StaticArrays is totally broken on v0.6 for any types defined after using StaticArrays (including element types, or user-defined static arrays). It was never written to conform to what @vtjnash thought might be appropriate in a @pure function or (perhaps equivalently??) a @generated function's generator, and the rules of the game have changed since the fix to JuliaLang/julia#265 was introduced. #95 was just the prelude... The real issue became apparent in #106 (and is tangentially related to some of the discussion that occurred in JuliaLang/julia#20704).

The cause of the problem you are seeing is that our function generators call functions like size/Size, zero, one, and so on. The thing which confuses me about the current compiler behaviour is that the domain of the generated function expands to take in newly-defined types, but the generator does not gain access to methods defined for the new types (and in some cases that statement appears to be true of @pure functions, which was what I was exploring in JuliaLang/julia#20704).
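
A minimal sketch of that failure mode (a toy sumzero helper, not StaticArrays' actual code): the first definition calls zero(T) while the generator itself is running, so it can only see zero methods that existed when the generated function was defined; the second merely emits the call, so zero(T) is resolved in the caller's current world.

# Generator evaluates zero(T) at generation time: this is the world-age trap
# for element types (and zero methods) defined after this function.
@generated function sumzero(v::NTuple{N,T}) where {N,T}
    z = zero(T)
    ex = :( $z )
    for i in 1:N
        ex = :( $ex + v[$i] )
    end
    return ex
end

# Safe variant: the generator only builds the expression; zero(T) runs later,
# in the world of the caller.
@generated function sumzero_safe(v::NTuple{N,T}) where {N,T}
    ex = :( zero(T) )
    for i in 1:N
        ex = :( $ex + v[$i] )
    end
    return ex
end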

I can see two solutions:

  1. Convince Jameson that flexible @pure functions and flexible generators of @generated functions are useful 😛 And that it is worthwhile to let them have dependency edges and be recompiled just like a normal function, so that they can be as practically useful as they were in v0.5 (but more consistent, since they wouldn't be susceptible to automatic recompilation of dependent functions JuliaLang/julia#265).

  2. Rewrite most of this package. (There are currently 183 @generated methods, which might be excessive, but at least it worked...). Amongst other changes, each @generated method will require a thunk to deal with the size (broadcast() will be a bit challenging in this regard).

I haven't started work on option 2 yet because... it is a lot of work! And I have really been enjoying the flexibility of @generated and @pure, and I feel it is worthwhile fighting for them. There are not a huge number of people working on very low-level packages, so probably not many people have been making use of their utility in the way we do here (I suspect @timholy is another person who has played with this stuff).

OTOH option 1 will clearly be a fair amount of work for somebody who probably isn't me (I'm not super qualified to judge; also @vtjnash has even argued that it is impossible, but I can't yet see that this is true).

To make my intentions clear - I can accept rewriting this package if that is the consensus view. I'm going to ping a bunch of people in the hope of some friendly guidance on the way forward:

cc: @c42f @SimonDanisch @StefanKarpinski @JeffBezanson (and of course I'd very much appreciate hearing views from Jameson and Tim). (sorry for bothering everyone but sorting this out seems important to me).

@ajkeller34

I ran into a few @generated / world age problems with Unitful, or rather places where I could have used 0.5-style @generated functions for my purposes more easily. I don't think I'm quite as impacted as StaticArrays, but it's helpful to be able to define functions where the output depends only on the input types, and where the function can still be updated to use newly defined methods. This is more or less how I used @generated on 0.5.

While I'm here, let me just say that if support for @generated were ever totally dropped, Unitful would be dead in the water. In Julia 0.6 it is still absolutely critical for computing unit and dimension multiplication and I don't see how I could live without it. I don't think anyone is talking about dropping it but I hope it sticks around in a form that is at least as forgiving as it is in Julia 0.6, and doesn't become even more restrictive in its use cases.

@andyferris
Member

Bump. Any feedback?

@c42f
Member

c42f commented Feb 23, 2017

For what it's worth, I think the new behavior of generated (and pure) is inconsistent with other functions in a way that wasn't nearly as visible in 0.5, and I don't think it should be the job of packages to work around this.

That's not much help coming from me, I know :-) If I can find some time I'll help out with a rewrite if necessary.

@timholy
Member Author

timholy commented Feb 23, 2017

Given Jameson's reservations, I've put quite a bit of effort into expunging @generated and @pure from much of the code I write. I've found I could get rid of quite a lot of them, and when that works there's often a genuine improvement in clarity/usability. (@which is often a bit unsatisfying when it points to a generated function.)

But echoing @ajkeller34 and @andyferris, it's also the case that currently there isn't a way to get rid of them entirely and still be able to do everything we want to do. @vtjnash's work on improving efficiency for type-unstable code is a huge step in a promising direction (it reduces the importance of inferrability, which is the main remaining motivation for using @pure and @generated), but even if that works out brilliantly I suspect that we'll have a subset of cases where it will be unsatisfying to get anything less than the performance at the physical limit of the machine.

@andyferris
Member

I've found I could get rid of quite a lot of them, and when that works there's often a genuine improvement in clarity/usability.

I do admit that weeding down the generated functions in StaticArrays to a much smaller number could be quite good for code clarity. A package like this one really needs

  • Some way of creating, propagating and interrogating constants/traits like Size{(3,)}(). We've got this down to a minimal number of (truly) pure functions.
  • A few different expression unrolling patterns. At the moment just about every interesting method is unrolled (and also inlined) with its own generated function, but many of these share a similar pattern. (Both needs are sketched after this list.)
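
A minimal sketch of those two ingredients, with hypothetical names standing in for the real Size machinery and unrolled kernels:

using StaticArrays

# (1) A trait-style constant interrogated with a (truly) pure function;
#     mylength is a hypothetical stand-in for the package's Size machinery.
Base.@pure mylength(::Type{SVector{N,T}}) where {N,T} = N

# (2) A typical unrolling pattern: the generator splices one expression per
#     element instead of writing a runtime loop (assumes N >= 1).
@generated function unrolled_dot(a::SVector{N}, b::SVector{N}) where {N}
    ex = :( a[1] * b[1] )
    for i in 2:N
        ex = :( $ex + a[$i] * b[$i] )
    end
    return ex
end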

@c42f
Member

c42f commented Feb 23, 2017

@andyferris you beat me to it :-) I was just about to submit the following saying more or less the same thing:

@timholy absolutely agreed - @generated is something to be avoided unless you really need it.

I was just discussing this with @andyferris again, and we agreed that a useful step would be to identify the patterns of code generation which are used in this package, and attempt to centralize them into a minimal number of generated functions. After doing this we might be able to identify which minimum set of language extensions would be required to get rid of @generated from the package entirely.

If this worked out I think it would be a useful step regardless of what happens to @generated in the future.

@andyferris
Member

OK, preliminary results show that using a "central" generated function (a bit like ntuple) and passing closures to it from a standard function results in worse (slower) code than simply expanding the tuples/loops as we do now, in all but the simplest of cases.

If we can't improve that, I guess we'll need lots of thunks and lots of generated functions...
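
For concreteness, the shape being tested is roughly the following (hypothetical names, a sketch only; the actual experiment is in the gist linked further down): one central ntuple-like unroller that takes a closure, with ordinary methods layered on top of it.

using StaticArrays

# One central ntuple-like unroller that takes a closure, so ordinary methods
# need not carry their own @generated definition (hypothetical names).
@generated function unrolled_ntuple(f, ::Val{N}) where {N}
    exprs = [:( f($i) ) for i in 1:N]
    return :( tuple($(exprs...)) )
end

# A plain, non-generated method built on top of it; the closure captures a and b.
add(a::SVector{N}, b::SVector{N}) where {N} =
    SVector{N}(unrolled_ntuple(i -> a[i] + b[i], Val(N)))

The closure here is exactly the construction questioned further down as a possible source of the extra codegen.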

@timholy
Member Author

timholy commented Feb 23, 2017

Can you post what you tried somehow? (branch or gist) I doubt I'd see any improvements you didn't already try, but I'd be happy to peek.

@vtjnash

vtjnash commented Feb 23, 2017

A few different expression unrolling patterns. At the moment just about every interesting method is unrolled (and also inlined) with its own generated function, but many of these share a similar pattern.

This is something generated functions are really good at right now. I suspect we could define something in Base like an @tco function, which is a bit like your ntuple / closure approach and could cut down on some of the boilerplate of writing these functions.

Some way of creating, propagating and interrogating constants/traits

This is something generated functions were never intended for and are thus now really bad at. Traits and @pure provide a much more accurate model.

@andyferris
Member

Can you post what you tried somehow? (branch or gist) I doubt I'd see any improvements you didn't already try, but I'd be happy to peek.

The branch is called "julia-v0.6"... let's discuss on this gist. I'd really appreciate another set of eyes on this, so thanks for the offer :)

@andyferris
Member

This is something generated functions were never intended for and are thus now really bad at. Traits and @pure provide a much more accurate model.

Yep, we're definitely using more traits and @pure for these things ;) (My dot-points above more-or-less split into @pure and @generated functionalities).

@timholy
Member Author

timholy commented Feb 24, 2017

I get good codegen from this gist.

@andyferris
Member

andyferris commented Feb 25, 2017

@timholy I'd love to just use lispy recursion, and I think a great deal of StaticArrays is easily formatted in this way.

But if MAX_TUPLE_SPLAT or whatever is just 16, then we can't easily get "large" arrays to be efficient, e.g. matrices greater than 4x4... Making this configurable somehow (say, locally for a method) would make this viable.
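
Concretely, the lispy-recursion style under discussion looks something like this toy tmap (a sketch, not package code); the tail-splatting is what runs into the tuple-splat limits for longer tuples:

# "Lispy" recursion over a tuple: peel off the head and recurse on the tail.
# The compiler fully unrolls this for short tuples, but the splatting limits
# mentioned above stop that for longer ones.
tmap(f, t::Tuple{}) = ()
tmap(f, t::Tuple) = (f(first(t)), tmap(f, Base.tail(t))...)

# tmap(x -> 2x, (1, 2, 3)) returns (2, 4, 6)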

@timholy
Member Author

timholy commented Feb 25, 2017

Yes, that is a problem. We could work around this by nesting the tuples (too horrible to contemplate), and it probably wouldn't be too hard to add a @limit_tuple_type 100 function foo(...) mechanism based on Expr(:meta), though I suspect that would encounter objections. (@vtjnash?)

I updated my gist with a variant that seems to work OK (but it does use a generated function). The only difference from your variant is that a is not "baked in" to the function. I called it map2 because you're mapping over elements in the second "vector" argument.
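
A rough reconstruction of what a map2 with that description might look like (hypothetical code; the actual version is in the gist): the generator unrolls over the elements of the second argument b, while a is threaded through as an ordinary argument instead of being captured in a closure.

# Hypothetical reconstruction of a map2-style helper: the generator unrolls
# over the elements of the second argument b, while a is passed through as an
# ordinary argument rather than captured in a closure.
@generated function map2(f, a, b::NTuple{N,Any}) where {N}
    exprs = [:( f(a, b[$i]) ) for i in 1:N]
    return :( tuple($(exprs...)) )
end

# map2(*, 2.0, (1.0, 2.0, 3.0)) returns (2.0, 4.0, 6.0)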

@andyferris
Member

Yes, that is a problem. We could work around this by nesting the tuples (too horrible to contemplate)

I just started to play with this here but ran into this.

And yes, it could become a bit horrible...

it probably wouldn't be too hard to add a @limit_tuple_type 100 function foo(...) mechanism based on Expr(:meta), though I suspect that would encounter objections. (@vtjnash?)

This would certainly solve the problem. It would be interesting to see what you can't write (efficiently) without a generated function (I'm thinking broadcast could be tricky).

I updated my gist with a variant that seems to work OK (but it does use a generated function). The only difference from your variant is that a is not "baked in" to the function. I called it map2 because you're mapping over elements in the second "vector" argument.

I'm beginning to wonder if constructing the closure might have something to do with the extra stuff in the codegen that I was seeing?

@andyferris
Member

FYI related problem: Ferrite-FEM/Tensors.jl#7

@KristofferC
Contributor

So instead of:

@generated function f{N,T}(x::NTuple{N,T})
    exp = :(zero(T))
    return exp
end

You would do:

@generated function f{N,T}(x::NTuple{N,T})
    exp = :(o)
    return quote
        o = zero(T)
        $exp
    end
end

Have I understood that correctly?

@vtjnash

vtjnash commented Feb 27, 2017

those are both the same as: f{N, T}(x::NTuple{N, T}) = zero(T)

what is wrong is attempting to do: @generated function f{N,T}(x::NTuple{N,T}); return zero(T); end, since the set of method definitions for zero (or any method called from the generator) is a closed set.

@vtjnash

vtjnash commented Feb 27, 2017

Yes, that is a problem

Working around MAX_TUPLE_SPLAT isn't generically a better solution than @generated. I would guess that the generator is potentially easier to read; and the fix for #265 ensures that it won't have any action-at-a-distance (due to later method definitions) affecting how the loop body gets unrolled. The main problem with using lisp-y tuple splatting is that in lisp it is an O(1) operation, while in Julia it is O(n). That's fine for small n (like Array dimensions), but can be very bad for most operations (like counting, units, simulating general unrolling, static arrays, tuples, etc.)

@KristofferC
Contributor

KristofferC commented Feb 27, 2017

Thanks, I get it now! (famous last words)

@andyferris
Member

The main problem with using lisp-y tuple splatting is that in lisp it is an O(1) operation, while in Julia it is O(n). That's fine for small n (like Array dimensions), but can be very bad for most operations (like counting, units, simulating general unrolling, static arrays, tuples, etc.)

This is a very good point.

Of course, the real tool you want for StaticArrays is setindex! on a Ref{<:StaticArray}. This would let you behave like C and mutate an element of a statically (i.e. stack) allocated array as an O(1) operation (at the moment the package does everything with tuples and relies on compiler optimizations for performance). Along with (potentially automatic) loop unrolling, we'd have very readable and generic code.
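
For contrast, the tuple-only style the package relies on today looks roughly like the following (setelem is a hypothetical name, not the package's API): "setting" an element means rebuilding the whole value and trusting the compiler to elide the copies, whereas setindex! on a Ref{<:StaticArray} would be a genuine O(1) store.

using StaticArrays

# "Setting" an element without mutation: build a fresh SVector and rely on the
# optimizer to elide the intermediate copies (setelem is a hypothetical name).
setelem(v::SVector{N,T}, x, i::Int) where {N,T} =
    SVector{N,T}(ntuple(j -> j == i ? convert(T, x) : v[j], Val(N)))

# setelem(SVector(1.0, 2.0, 3.0), 9.9, 2) returns SVector(1.0, 9.9, 3.0)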

@vtjnash

vtjnash commented Feb 27, 2017

Yes, JuliaLang/julia#11902 and JuliaLang/julia#17115 seemed to be stalled looking for more analysis of how they'll interact with generic code and make it more/less error-prone/descriptive.

@andyferris
Member

Yeah, depending on how far you take it (e.g. interprocedurally), the escape/reference analysis problem seems like it could potentially become an entire, new "inference" system...

Do you think (at least a basic version of) these are reasonably expected in v1.0?

@andyferris
Member

I think this is fixed by #121
