Julia 1.11 introduces BFloat16 codegen support, so let's use this issue to track support for that.
Right now, it looks like we support the type, but somehow still emit conversions:
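Roughly, this can be checked with something like the following (a sketch assuming CUDA.jl and BFloat16s.jl; `bf16_kernel!` is just an illustrative name):

```julia
# Sketch: inspect the device LLVM IR of a trivial BFloat16 kernel to see whether
# the arithmetic stays in bfloat or gets widened through float conversions.
using CUDA, BFloat16s

function bf16_kernel!(a)
    i = threadIdx().x
    @inbounds a[i] += BFloat16(1)
    return
end

a = CUDA.zeros(BFloat16, 32)
@device_code_llvm @cuda threads=32 bf16_kernel!(a)
```

If conversions are still being emitted, the IR would show the addition widened to `float` (e.g. `fpext`/`fptrunc` pairs) rather than staying in `bfloat`.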
In addition, the logic in BFloat16s.jl isn't great, as we determine support based on the host processor. It's not clear if we can do better though; this looks a lot like the literal `Int` issue (where we can't make GPU code use `Int32` when the host is `Int64`).
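The problematic pattern is roughly the following (a hypothetical sketch with invented names, not BFloat16s.jl's actual code): support is decided once, at load time, from host properties, so device code inherits the host's answer even when the GPU's capabilities differ.

```julia
# Hypothetical sketch of host-based feature detection (not BFloat16s.jl's real code):
# the native-vs-emulated choice is made from the host toolchain at load time.
using BFloat16s

const USE_NATIVE_BF16 = Base.libllvm_version >= v"15"   # host-side check, fixed at load time

@inline function bf16_mul(a::BFloat16, b::BFloat16)
    if USE_NATIVE_BF16
        a * b                                # assume codegen lowers this to native bfloat ops
    else
        BFloat16(Float32(a) * Float32(b))    # emulate by widening through Float32
    end
end
```

As with literal `Int`, the decision would ideally be made per compilation target rather than from the host.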