
"Constant memory is stored..." in MLUtils.unsqueeze #1866

Closed
BioTurboNick opened this issue Sep 20, 2024 · 12 comments
Closed

"Constant memory is stored..." in MLUtils.unsqueeze #1866

BioTurboNick opened this issue Sep 20, 2024 · 12 comments

Comments

@BioTurboNick
Copy link

Not sure what to make of this. Why would this basic function in MLUtils be a problem for Enzyme?

So I'm opening an issue as the error message suggests. I can't tell whether Enzyme 0.13 fixes it, because Flux et al. don't yet support it.

ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Activity-of-temporary-storage).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set Enzyme.API.runtimeActivity!(true) immediately after loading Enzyme (which maintains correctness, but may slightly reduce performance).
Mismatched activity for:   ret {} addrspace(10)* %20, !dbg !481 const val:   %20 = call nonnull "enzyme_type"="{[-1]:Pointer}" {} addrspace(10)* @ijl_reshape_array({} addrspace(10)* noundef addrspacecast ({}* inttoptr (i64 280629280561936 to {}*) to {} addrspace(10)*), {} addrspace(10)* noundef nonnull %0, {} addrspace(10)* noundef nonnull %box) #381, !dbg !494
Type tree: {}
 llvalue=  %20 = call nonnull "enzyme_type"="{[-1]:Pointer}" {} addrspace(10)* @ijl_reshape_array({} addrspace(10)* noundef addrspacecast ({}* inttoptr (i64 280629280561936 to {}*) to {} addrspace(10)*), {} addrspace(10)* noundef nonnull %0, {} addrspace(10)* noundef nonnull %box) #381, !dbg !494

Stacktrace:
 [1] #unsqueeze#95
   @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:40

Stacktrace:
  [1] #unsqueeze#95
    @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:40
  [2] unsqueeze
    @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:37 [inlined]
  [3] unsqueeze
    @ ./deprecated.jl:105
  [4] ...omitted...
  [5] ...omitted...
  [6] _applyloss
    @ ~/.julia/packages/Flux/HBF2N/ext/FluxEnzymeExt/FluxEnzymeExt.jl:15 [inlined]
  [7] _applyloss
    @ ~/.julia/packages/Flux/HBF2N/ext/FluxEnzymeExt/FluxEnzymeExt.jl:0 [inlined]
  [8] augmented_julia__applyloss_6153_inner_1wrap
    @ ~/.julia/packages/Flux/HBF2N/ext/FluxEnzymeExt/FluxEnzymeExt.jl:0
  [9] macro expansion
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7187 [inlined]
 [10] enzyme_call
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6794 [inlined]
 [11] AugmentedForwardThunk
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6682 [inlined]
 [12] autodiff(::EnzymeCore.ReverseMode{…}, ::EnzymeCore.Const{…}, ::Type{…}, ::EnzymeCore.Const{…}, ::EnzymeCore.Duplicated{…}, ::EnzymeCore.Const{…}, ::EnzymeCore.Const{…})
    @ Enzyme ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:264
 [13] autodiff
    @ ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:332 [inlined]
 [14] macro expansion
    @ ~/.julia/packages/Flux/HBF2N/ext/FluxEnzymeExt/FluxEnzymeExt.jl:34 [inlined]
 [15] macro expansion
    @ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:328 [inlined]
 [16] train!(loss::Function, model::EnzymeCore.Duplicated{…}, data::Tuple{…}, opt::@NamedTuple{…}; cb::Nothing)
    @ FluxEnzymeExt ~/.julia/packages/Flux/HBF2N/ext/FluxEnzymeExt/FluxEnzymeExt.jl:30
 [17] train!(loss::Function, model::EnzymeCore.Duplicated{…}, data::Tuple{…}, opt::@NamedTuple{…})
    @ FluxEnzymeExt ~/.julia/packages/Flux/HBF2N/ext/FluxEnzymeExt/FluxEnzymeExt.jl:27
 [18] ...omitted...
 [19] top-level scope
    @ REPL[5]:1
Some type information was truncated. Use `show(err)` to see complete types.
Julia Version 1.10.5
Commit 6f3fdf7b362 (2024-08-27 14:19 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (aarch64-linux-gnu)
  CPU: 12 × unknown
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, generic)
Threads: 1 default, 0 interactive, 1 GC (on 12 virtual cores)
@wsmoses
Member

wsmoses commented Sep 20, 2024

Hm, can you post the stack trace and a reproducer?

@BioTurboNick
Author

BioTurboNick commented Sep 20, 2024

using Enzyme
using Flux
using MLUtils

function train_network()
    network = Dense(1 => 1)
    evaluation_data = rand(40, 40, 16), rand(2, 20, 16)
    model = Duplicated(network, make_zero(network))
    _applyloss(loss, model, d...) = loss(model, d...)
    loss(m, x, y) = MLUtils.unsqueeze(y, 2)
    Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, Const(loss), model, map(Const, evaluation_data)...)
    return network
end

ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Activity-of-temporary-storage).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set Enzyme.API.runtimeActivity!(true) immediately after loading Enzyme (which maintains correctness, but may slightly reduce performance).
Mismatched activity for:   ret {} addrspace(10)* %8, !dbg !48 const val:   %8 = call nonnull "enzyme_type"="{[-1]:Pointer}" {} addrspace(10)* @ijl_reshape_array({} addrspace(10)* noundef addrspacecast ({}* inttoptr (i64 280493283592016 to {}*) to {} addrspace(10)*), {} addrspace(10)* noundef nonnull %0, {} addrspace(10)* noundef nonnull %box) #26, !dbg !51
Type tree: {}
 llvalue=  %8 = call nonnull "enzyme_type"="{[-1]:Pointer}" {} addrspace(10)* @ijl_reshape_array({} addrspace(10)* noundef addrspacecast ({}* inttoptr (i64 280493283592016 to {}*) to {} addrspace(10)*), {} addrspace(10)* noundef nonnull %0, {} addrspace(10)* noundef nonnull %box) #26, !dbg !51

Stacktrace:
 [1] unsqueeze
   @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:37
 [2] unsqueeze
   @ ./deprecated.jl:105

Stacktrace:
  [1] unsqueeze
    @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:37 [inlined]
  [2] unsqueeze
    @ ./deprecated.jl:105
  [3] loss
    @ ~/Repos/DLEnzymeBug/src/train.jl:17 [inlined]
  [4] _applyloss
    @ ~/Repos/DLEnzymeBug/src/train.jl:16 [inlined]
  [5] _applyloss
    @ ~/Repos/DLEnzymeBug/src/train.jl:0 [inlined]
  [6] augmented_julia__applyloss_33928_inner_1wrap
    @ ~/Repos/DLEnzymeBug/src/train.jl:0
  [7] macro expansion
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:7187 [inlined]
  [8] enzyme_call
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6794 [inlined]
  [9] AugmentedForwardThunk
    @ ~/.julia/packages/Enzyme/TiboG/src/compiler.jl:6682 [inlined]
 [10] autodiff(::EnzymeCore.ReverseMode{…}, ::EnzymeCore.Const{…}, ::Type{…}, ::EnzymeCore.Const{…}, ::EnzymeCore.Duplicated{…}, ::EnzymeCore.Const{…}, ::EnzymeCore.Const{…})
    @ Enzyme ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:264
 [11] autodiff
    @ ~/.julia/packages/Enzyme/TiboG/src/Enzyme.jl:332 [inlined]
 [12] train_network()
    @ DeepLoco ~/Repos/DLEnzymeBug/src/train.jl:15
 [13] top-level scope
    @ REPL[16]:1
Some type information was truncated. Use `show(err)` to see complete types.

@wsmoses
Member

wsmoses commented Sep 28, 2024

So, in part, I think the issue here is that your case is oversimplified.

Specifically, the error complains that you are trying to differentiate with respect to a value which is proven to be non-differentiable (i.e. its derivative is provably 0). Also, relatedly, you are using reverse-mode autodiff on a function which returns an array, when it only supports functions which return a scalar.
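
In terms of the reproducer above, a rough sketch of those two changes (a Duplicated target instead of Const, and a scalar return value), reusing the names from train_network; this is illustrative rather than a drop-in fix:

loss(m, x, y) = MLUtils.unsqueeze(y, 2)[1]    # return a scalar, not an array
x, y = evaluation_data
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, Const(loss), model,
                Const(x),
                Duplicated(y, make_zero(y)))  # the target is now differentiable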

@wsmoses wsmoses closed this as completed Sep 28, 2024
@BioTurboNick
Author

Huh?

How can it be oversimplified? I reduced a very complex case to this MWE.

@BioTurboNick
Author

BioTurboNick commented Sep 28, 2024

Also, relatedly, you are using reverse-mode autodiff on a function which returns an array, when it only supports functions which return a scalar

Zygote does just fine differentiating over unsqueeze.

@BioTurboNick
Author

Specifically the error complains that you are trying to differentiate with respect to a value which is proven to be non-differentiable (aka 0).

Where does the error message say this? In fact it says the exact opposite, that it can't prove it to be non-differentiable.

@BioTurboNick
Author

Finally, if I change the loss function to return a scalar, with |> first, I still get the same error.

I have to say, it's extremely frustrating that I put effort into providing you information and you just dismiss it with a cursory glance.

@wsmoses
Member

wsmoses commented Sep 28, 2024

@BioTurboNick I did run it, and the things mentioned were all required to solve it (see below).

_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, model, Const(evaluation_data[1]), Const(evaluation_data[2]))


_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, model, Const(evaluation_data[1]), Duplicated(evaluation_data[2], make_zero(evaluation_data[2])))

# The value being differentiated is entirely a function of y, a known constant. Making y Duplicated (i.e. meaningful to differentiate) now yields:
#.     ERROR: Enzyme Mutability Error: Cannot add one in place to immutable value [0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 
0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0;;; 0.0; 0.0]
#.     Stacktrace:
#.      [1] error(s::String)
#.        @ Base ./error.jl:35
#.      [2] add_one_in_place
#.        @ ~/git/Enzyme.jl/src/compiler.jl:8785 [inlined]
#.      [3] augmented_julia__applyloss_4495_inner_1wrap
#.        @ ./REPL[33]:0
#.      [4] macro expansion
#.        @ ~/git/Enzyme.jl/src/compiler.jl:9227 [inlined]
#.      [5] enzyme_call
#.        @ ~/git/Enzyme.jl/src/compiler.jl:8793 [inlined]
#.      [6] AugmentedForwardThunk
#.        @ ~/git/Enzyme.jl/src/compiler.jl:8630 [inlined]
#.      [7] autodiff
#.        @ ~/git/Enzyme.jl/src/Enzyme.jl:384 [inlined]
#.      [8] autodiff(::ReverseMode{true, false, FFIABI, false, false}, ::typeof(_applyloss), ::Type{Active}, ::Duplicated{Dense{typeof(identity), Matrix{Float32}, Vector{Float32}}}, ::Const{Array{Float64, 3}}, ::Duplicated{Array{Float64, 3}})
#.        @ Enzyme ~/git/Enzyme.jl/src/Enzyme.jl:512
#.      [9] top-level scope
#.        @ REPL[34]:1



_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)[1]
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, model, Const(evaluation_data[1]), Duplicated(evaluation_data[2], make_zero(evaluation_data[2])))

# Fixing the loss to return a float rather than an array, it runs successfully
# ((nothing, nothing, nothing), 0.7901278956371048)

@wsmoses
Member

wsmoses commented Sep 28, 2024

Alternatively you can use runtime activity as specified in the error message:

_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, model, Const(evaluation_data[1]), Const(evaluation_data[2]))

ERROR: Constant memory is stored (or returned) to a differentiable variable.
As a result, Enzyme cannot provably ensure correctness and throws this error.
This might be due to the use of a constant variable as temporary storage for active memory (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity).
If Enzyme should be able to prove this use non-differentable, open an issue!
To work around this issue, either:
 a) rewrite this variable to not be conditionally active (fastest, but requires a code change), or
 b) set the Enzyme mode to turn on runtime activity (e.g. autodiff(set_runtime_activity(Reverse), ...) ). This will maintain correctness, but may slightly reduce performance.
Mismatched activity for:   ret {} addrspace(10)* %20, !dbg !56 const val:   %20 = call nonnull "enzyme_type"="{[-1]:Pointer}" {} addrspace(10)* @ijl_reshape_array({} addrspace(10)* noundef addrspacecast ({}* inttoptr (i64 4467185552 to {}*) to {} addrspace(10)*), {} addrspace(10)* noundef nonnull %0, {} addrspace(10)* noundef nonnull %box) #16, !dbg !69
Type tree: {}
 llvalue=  %20 = call nonnull "enzyme_type"="{[-1]:Pointer}" {} addrspace(10)* @ijl_reshape_array({} addrspace(10)* noundef addrspacecast ({}* inttoptr (i64 4467185552 to {}*) to {} addrspace(10)*), {} addrspace(10)* noundef nonnull %0, {} addrspace(10)* noundef nonnull %box) #16, !dbg !69

Stacktrace:
 [1] #unsqueeze#95
   @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:40

Stacktrace:
  [1] #unsqueeze#95
    @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:40
  [2] unsqueeze
    @ ~/.julia/packages/MLUtils/LmmaQ/src/utils.jl:37 [inlined]
  [3] unsqueeze
    @ ./deprecated.jl:105
  [4] _applyloss
    @ ./REPL[37]:1 [inlined]
  [5] _applyloss
    @ ./REPL[37]:0 [inlined]
  [6] augmented_julia__applyloss_4670_inner_1wrap
    @ ./REPL[37]:0
  [7] macro expansion
    @ ~/git/Enzyme.jl/src/compiler.jl:9227 [inlined]
  [8] enzyme_call
    @ ~/git/Enzyme.jl/src/compiler.jl:8793 [inlined]
  [9] AugmentedForwardThunk
    @ ~/git/Enzyme.jl/src/compiler.jl:8630 [inlined]
 [10] autodiff
    @ ~/git/Enzyme.jl/src/Enzyme.jl:384 [inlined]
 [11] autodiff(::ReverseMode{true, false, FFIABI, false, false}, ::typeof(_applyloss), ::Type{Active}, ::Duplicated{Dense{typeof(identity), Matrix{Float32}, Vector{Float32}}}, ::Const{Array{Float64, 3}}, ::Const{Array{Float64, 3}})
    @ Enzyme ~/git/Enzyme.jl/src/Enzyme.jl:512
 [12] top-level scope
    @ REPL[38]:1


_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)
Enzyme.autodiff(set_runtime_activity(ReverseWithPrimal), _applyloss, Active, model, Const(evaluation_data[1]), Const(evaluation_data[2]))


ERROR: Enzyme Mutability Error: Cannot add one in place to immutable value [0.7901278956371048; 0.5243828388610502;;; 0.05206431580565829; 0.042424944620979566;;; 0.9041969124869047; 0.8867368854209283;;; 0.9296222227818691; 0.1916885078737457;;; 0.3842032890417951; 0.8205206485112058;;; 0.48845789464859124; 0.7108888144573352;;; 0.5510313643934687; 0.6160853370701236;;; 0.7986656335263023; 0.6504117625009753;;; 0.7319334623791899; 0.8911413522768731;;; 0.7562918496386102; 0.8572838054099116;;; 0.11373884740758178; 0.9614040950173122;;; 0.1115953767574992; 0.9246809265409892;;; 0.7319682245574356; 0.35844381499101263;;; 0.6103653600915675; 0.9127965123437306;;; 0.06183358620807333; 0.7272896668996645;;; 0.29710672751744804; 0.5924773129686981;;; 0.0581417178487057; 0.003902614113168723;;; 0.44566858511932894; 0.7279402484452306;;; 0.8625474808271698; 0.1346296207425447;;; 0.693103566274063; 0.9131398696737539;;;; 0.7451303711184283; 0.588955793494022;;; 0.8820514947243479; 0.43310636186416795;;; 0.7370448072989001; 0.49054299688379954;;; 0.3426924680726934; 0.5837453515776806;;; 0.986537724834232; 0.8689986779912566;;; 0.7389496653156157; 0.46046468206802726;;; 0.2761668666413404; 0.6740989168394811;;; 0.8710895666085202; 0.07179792915930483;;; 0.7880963765336483; 0.07516475748396167;;; 0.32670382910803286; 0.7321731572589278;;; 0.3383513386592435; 0.15291712859330453;;; 0.9975186298037962; 0.880819093111514;;; 0.8391433469523678; 0.6118204441969567;;; 0.21929403215609378; 0.07616915593554208;;; 0.30613000631354137; 0.9308631116622921;;; 0.9961273500385266; 0.5401890107965044;;; 0.7854262357279438; 0.7591781534693708;;; 0.045348569913009396; 0.7076213452834544;;; 0.07471383280144617; 0.5731654747124457;;; 0.6595426319624617; 0.8747292582466809;;;; 0.775723804555619; 0.754545129292479;;; 0.6159040713423147; 0.8254254607703115;;; 0.9457104574206049; 0.33658652167692915;;; 0.5874312444483137; 0.9537502925772252;;; 0.10701270097070914; 0.31300733781204215;;; 0.8420168538433962; 0.8837481583326179;;; 0.8834733613651018; 0.25445977675991904;;; 0.19436919349384785; 0.536600351141031;;; 0.2655817832107039; 0.9447785164526284;;; 0.4117701781930765; 0.5353834402143219;;; 0.09902711166206335; 0.7410534131507615;;; 0.9623033662145312; 0.66746463100968;;; 0.0674760361522968; 0.7474310584131016;;; 0.5255125866215571; 0.11353516619581883;;; 0.6068506323557825; 0.01461364751090899;;; 0.6011169511780139; 0.7879984415126353;;; 0.34904924217516664; 0.500875830241042;;; 0.5047829879539872; 0.4407124101562021;;; 0.8554757082877474; 0.38632911607986486;;; 0.2893702410382585; 0.045367131998712784;;;; 0.42062245470538484; 0.8475726324293369;;; 0.15349663299822758; 0.19178411054535316;;; 0.12255869306064726; 0.012308952140697493;;; 0.15976400207884267; 0.8780693756163374;;; 0.5247464510640781; 0.1372407544564873;;; 0.6658957671327919; 0.1120732724740292;;; 0.7467444873644086; 0.996278167367711;;; 0.5168387179646184; 0.16539336915889724;;; 0.44122683032765064; 0.13938745325959545;;; 0.6611127022220211; 0.12863604168081544;;; 0.3671392587089263; 0.9587410789962366;;; 0.7851092619580454; 0.446652576425024;;; 0.38370282963687663; 0.507825563841686;;; 0.33740125861107473; 0.958402461847042;;; 0.6115724810054921; 0.06008021001117758;;; 0.9650812350462684; 0.07691910320294226;;; 0.8405557857223313; 0.15042692632428822;;; 0.10032656991078981; 0.29323482017444935;;; 0.16236386038938244; 0.2165040067803593;;; 0.5771485675118592; 0.25216499258707636;;;; 0.12464414648363464; 0.1282805903496682;;; 0.8835309069333261; 
0.9056963500237283;;; 0.39046079737666484; 0.289899463901215;;; 0.45701861554801126; 0.5799224614159303;;; 0.45812059660466575; 0.05080687851571597;;; 0.20054023654699893; 0.48262636270729176;;; 0.08437762372122826; 0.08626388059971846;;; 0.7314507189815865; 0.09732689610135592;;; 0.1723888803749073; 0.784825009954895;;; 0.533007890611331; 0.2479948188378832;;; 0.6971035652170204; 0.4387543393482787;;; 0.8058588619991118; 0.8729636193656478;;; 0.4269397997594223; 0.7556584713004968;;; 0.5806854626650837; 0.7987394828005392;;; 0.43751081898989375; 0.7943483012013131;;; 0.14082656983736863; 0.12098250902105356;;; 0.9462154506854126; 0.036955127348105354;;; 0.4363176075872659; 0.6384982219378392;;; 0.1966389380739254; 0.8595247476416722;;; 0.4684678463153047; 0.8907893994079299;;;; 0.15108960850841124; 0.6526254644627042;;; 0.6770792892469655; 0.8035514273908744;;; 0.5349862483871712; 0.09586957952983521;;; 0.796634083902328; 0.5551805990111403;;; 0.5784880448156338; 0.9362292417870834;;; 0.8219918304049835; 0.3161337781705831;;; 0.7467532273002657; 0.21428944929557614;;; 0.11208302001407788; 0.21630551603060355;;; 0.35294527373210616; 0.45919191300982476;;; 0.03207242761607276; 0.7577303887310568;;; 0.22916010022019806; 0.3048225980511653;;; 0.44745003909684955; 0.6823683204273099;;; 0.35698166275626764; 0.4836763059198108;;; 0.7600926595261142; 0.48386073513738026;;; 0.977020835615542; 0.03123806184010669;;; 0.6044779055045146; 0.1606262976049012;;; 0.42403815292186553; 0.6871422041989429;;; 0.2909439251067425; 0.3666796635768651;;; 0.8526856890289601; 0.2735647650345726;;; 0.9452028731624023; 0.4312229276058708;;;; 0.9758338580021279; 0.037128821103399545;;; 0.5162437580805757; 0.6991542624125234;;; 0.9053866913955211; 0.418039448185169;;; 0.7940687103770604; 0.07131778995416838;;; 0.9662386680305264; 0.323604483874839;;; 0.9327399740867668; 0.6803566318758221;;; 0.11286777284423022; 0.4911416990655427;;; 0.20786594872728092; 0.3599072478643196;;; 0.8082969191673391; 0.9611570657185031;;; 0.08866919178519705; 0.947352290253721;;; 0.33062960721339807; 0.5440087713111814;;; 0.286250636537553; 0.18111465582010133;;; 0.8738698490409756; 0.17193232509844858;;; 0.601766640833718; 0.9260354174360329;;; 0.6662877150843629; 0.6381459568348317;;; 0.8226680972543661; 0.9918228116341582;;; 0.24831504551832562; 0.6238380175231256;;; 0.7829746502685208; 0.5045771504224875;;; 0.11095792722855702; 0.34113596535946733;;; 0.986835612996101; 0.44450829929702385;;;; 0.5591883817115665; 0.9613178992958055;;; 0.959671120388717; 0.20021054330269472;;; 0.5155653143440803; 0.3826910622452385;;; 0.388714296309009; 0.7530705082523761;;; 0.7851200909659769; 0.8352112923080135;;; 0.0972360501334193; 0.6563499817528663;;; 0.8622363098002122; 0.8477976634356362;;; 0.8239573138120928; 0.06825434375051753;;; 0.3146057277339074; 0.35653248507982316;;; 0.5464804075137767; 0.7438178729357515;;; 0.007264051506956859; 0.8189514577970046;;; 0.15364545978131994; 0.7089098849706591;;; 0.30796969822599074; 0.6972801722172022;;; 0.5115029193925821; 0.2979882420385306;;; 0.27377495867139856; 0.7291783285617869;;; 0.45456835485678126; 0.6366029550499981;;; 0.12265740401415359; 0.2566227526041852;;; 0.07564033249380675; 0.6941651635824645;;; 0.9262397839327383; 0.7782480804941648;;; 0.2711272272270119; 0.9434743054354069;;;; 0.9966357112890526; 0.5157856479131359;;; 0.23499308917317752; 0.3615394208135425;;; 0.2928450075464878; 0.11559860392430665;;; 0.2820358219064689; 0.22573439121841066;;; 0.24505059364111204; 0.8904281543389335;;; 
0.7622729657271802; 0.8108994773895168;;; 0.15178521059052885; 0.5084049612214808;;; 0.26297852862174353; 0.33176415426684824;;; 0.9964454740812851; 0.5352604737506225;;; 0.63053185435975; 0.6078761539915171;;; 0.7657809959474151; 0.9633167110514528;;; 0.7797999534189867; 0.2187339209753536;;; 0.4739700493171073; 0.3607880898305762;;; 0.46456725080409267; 0.16038599392647634;;; 0.07393873853912214; 0.23228868583144624;;; 0.7842473074503128; 0.2434139597427043;;; 0.5830684398197841; 0.545714775563;;; 0.22548562689867369; 0.9133625862461482;;; 0.1044675895064896; 0.13992570627466694;;; 0.5603786453172073; 0.6920283137324378;;;; 0.9036285333635722; 0.6485835319439448;;; 0.5530179519543973; 0.7877942236867495;;; 0.09086019588907268; 0.227243327955148;;; 0.22645966536554074; 0.16686912947610444;;; 0.37722964348650345; 0.9970872401788294;;; 0.8505936279594136; 0.7810217554378794;;; 0.6978993228662076; 0.371236463798733;;; 0.36928037697049954; 0.5044572607066411;;; 0.3661250105118964; 0.6168443097940579;;; 0.6278559788729429; 0.20250369321824047;;; 0.02474476601258968; 0.8034819194869126;;; 0.21313601587236963; 0.25103407522383525;;; 0.05764067066010037; 0.9514735506499645;;; 0.7866692417245342; 0.2215295277829865;;; 0.37663293219930005; 0.027662828182963506;;; 0.033186295332082394; 0.5726022402493752;;; 0.18522449572584354; 0.5564862486968513;;; 0.7425518122047915; 0.8051152467344617;;; 0.8802033204372185; 0.7874675562842501;;; 0.6451193039137645; 0.7848295493183577;;;; 0.6750311369376193; 0.011290897217101303;;; 0.5768974017805718; 0.34310572726105193;;; 0.41444409523223014; 0.09330411968916008;;; 0.6133581239094767; 0.7807230547543523;;; 0.7054425659080862; 0.5227323642661871;;; 0.6210747262137837; 0.16351357435823788;;; 0.5539457239604254; 0.04129232819758577;;; 0.9571388509392259; 0.7282201084758019;;; 0.46486606707746614; 0.49707720059439475;;; 0.5059058915045033; 0.8534432222519022;;; 0.5738062558136338; 0.8042512753269297;;; 0.6817895336678493; 0.0905187371501428;;; 0.8161469761413177; 0.5055752204982046;;; 0.736853851002642; 0.3804765500963534;;; 0.244652520238282; 0.5585043159930319;;; 0.2719532065332704; 0.3384217195406597;;; 0.591258871460201; 0.6410851890502542;;; 0.09801515000623495; 0.864688392959612;;; 0.9583817421110592; 0.28700112236185993;;; 0.7375461508412647; 0.3128552571412804;;;; 0.9353977266531259; 0.04617165803985701;;; 0.7317086458772264; 0.7669190454383958;;; 0.51356971588423; 0.7740077841897177;;; 0.3614634001228956; 0.6731943913704299;;; 0.10725637794619669; 0.684145097084503;;; 0.2452935107104668; 0.24608080252941567;;; 0.9535679056264544; 0.9515567881923659;;; 0.6595956992502129; 0.9034943776336494;;; 0.7654065387987141; 0.48781042441658073;;; 0.2298246354566743; 0.19392280751400748;;; 0.547375324121975; 0.9675474550946606;;; 0.8004861140100529; 0.33565748332480616;;; 0.5738977837560054; 0.7071721403721838;;; 0.6017774600192565; 0.8263262307553648;;; 0.09930612247494419; 0.8256093993993212;;; 0.8687394842818498; 0.7499091586602629;;; 0.42208893694323646; 0.7967831396481705;;; 0.9466752568160212; 0.5078366019951532;;; 0.38839053772427157; 0.3918163630618555;;; 0.5850582838758792; 0.13906809613454874;;;; 0.9537930704413619; 0.23330627196377607;;; 0.17474454479385737; 0.018783989838793258;;; 0.2016702036706446; 0.38741013063122576;;; 0.035100409695755785; 0.8757891815520756;;; 0.539297867800286; 0.6594257128762506;;; 0.8382343646717909; 0.9956973449032754;;; 0.1182294888540959; 0.40705681318288367;;; 0.9760801789273651; 0.9280312337417336;;; 0.456587411587174; 
0.8531091995880546;;; 0.7192793760234742; 0.8445084331472154;;; 0.30737749220389443; 0.25857252852078527;;; 0.136043576110508; 0.6892927973441653;;; 0.7594539880355677; 0.10657743815456111;;; 0.1968832280406595; 0.9251188123943426;;; 0.24360126421702177; 0.9511389046106895;;; 0.4007078333022569; 0.6360809030534454;;; 0.7648957733583945; 0.4512559091272119;;; 0.5342203068119074; 0.6019817294422262;;; 0.24663927623020565; 0.1438739931875208;;; 0.6813344195840108; 0.7065665238920538;;;; 0.7992307046607268; 0.8740681279618276;;; 0.6975793672222598; 0.08715055988914899;;; 0.735092484724648; 0.48204328798024443;;; 0.14029312903453162; 0.4415159573196803;;; 0.7614288220341079; 0.9431626643267224;;; 0.4204542247299138; 0.2484101855146592;;; 0.9533234207717678; 0.40981127161573105;;; 0.7962528560495624; 0.22560052346342796;;; 0.05707780161664866; 0.12104900065289836;;; 0.1864714903875636; 0.5541584944425325;;; 0.4073753193624795; 0.08290510758415481;;; 0.38712214159056857; 0.7149161762509139;;; 0.3531026888806744; 0.4720525390347766;;; 0.17122340714388573; 0.9717950288187717;;; 0.147318199179569; 0.3296602895292392;;; 0.16783813531356817; 0.25018587484155697;;; 0.8510279938968152; 0.11731670849558373;;; 0.5344241632250079; 0.8095357323036023;;; 0.8727984719197671; 0.28490394574907585;;; 0.3050187163599771; 0.8885343461623553;;;; 0.7470316886762396; 0.012115625511481598;;; 0.9261021022613314; 0.4223176445831198;;; 0.1542555196452926; 0.016666160294160903;;; 0.48307355322413403; 0.4299154066595645;;; 0.47055786821224055; 0.4566034851477221;;; 0.7167785337531817; 0.45326917917278986;;; 0.9573589672987911; 0.06982097895977957;;; 0.45952326623632245; 0.4824940636297972;;; 0.020975089438319383; 0.8179070789471249;;; 0.013449681767100818; 0.150258646716662;;; 0.19447722628788677; 0.7984570812936764;;; 0.1311538458063164; 0.30389256999842484;;; 0.48103537519875683; 0.2569399990973542;;; 0.7248114704617331; 0.48443423934945173;;; 0.7624140523039958; 0.4460770604407165;;; 0.9818662078064444; 0.5570615807616569;;; 0.27929253734217185; 0.3698706071864135;;; 0.05971240790858401; 0.6731806272327034;;; 0.7298937287320632; 0.2195268965161311;;; 0.44271538793319065; 0.14477914608464637;;;; 0.7424721984278198; 0.8278986591780595;;; 0.34213232849637665; 0.1320744578597809;;; 0.9699010446975913; 0.3744656046579842;;; 0.24552829825169709; 0.15801412866514197;;; 0.3599723157587008; 0.46253484921820587;;; 0.8102360393427793; 0.057947629096336994;;; 0.43392720461591816; 0.09264547373153253;;; 0.06865722887809611; 0.06472184400976144;;; 0.07367460345228871; 0.757961025975886;;; 0.48575549266127405; 0.8929860276608905;;; 0.5809070680932149; 0.44161851112434214;;; 0.7870521187143621; 0.5836232532604528;;; 0.051417956104615414; 0.8993823970858013;;; 0.7091544083993706; 0.9918584101231409;;; 0.710986687289085; 0.10305370726935259;;; 0.11946392944619277; 0.16130623760752782;;; 0.6544169050248001; 0.8995025687688125;;; 0.11280953879757527; 0.26765216195960884;;; 0.7770424946456892; 0.6370790686033622;;; 0.5337764553322083; 0.29709197966232237]
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] add_one_in_place
   @ ~/git/Enzyme.jl/src/compiler.jl:8785 [inlined]
 [3] augmented_julia__applyloss_4704_inner_1wrap
   @ ./REPL[37]:0
 [4] macro expansion
   @ ~/git/Enzyme.jl/src/compiler.jl:9227 [inlined]
 [5] enzyme_call
   @ ~/git/Enzyme.jl/src/compiler.jl:8793 [inlined]
 [6] AugmentedForwardThunk
   @ ~/git/Enzyme.jl/src/compiler.jl:8630 [inlined]
 [7] autodiff
   @ ~/git/Enzyme.jl/src/Enzyme.jl:384 [inlined]
 [8] autodiff(::ReverseMode{true, true, FFIABI, false, false}, ::typeof(_applyloss), ::Type{Active}, ::Duplicated{Dense{typeof(identity), Matrix{Float32}, Vector{Float32}}}, ::Const{Array{Float64, 3}}, ::Const{Array{Float64, 3}})
   @ Enzyme ~/git/Enzyme.jl/src/Enzyme.jl:512
 [9] top-level scope
   @ REPL[39]:1


_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)[1]
Enzyme.autodiff(set_runtime_activity(ReverseWithPrimal), _applyloss, Active, model, Const(evaluation_data[1]), Const(evaluation_data[2]))
# Again, returning a float is required; with that, it is resolved
# ((nothing, nothing, nothing), 0.7901278956371048)

@wsmoses
Member

wsmoses commented Sep 28, 2024

The docs linked in the error message should explain pretty much this exact situation (https://enzyme.mit.edu/julia/stable/faq/#Runtime-Activity)

And if they're not clear, let us know!
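
For reference, applied to the original MWE's call (and keeping the scalar-returning loss from above), the mode-level switch would look roughly like this sketch:

loss(m, x, y) = MLUtils.unsqueeze(y, 2)[1]   # scalar return, as above
Enzyme.autodiff(set_runtime_activity(ReverseWithPrimal), _applyloss, Active,
                Const(loss), model, map(Const, evaluation_data)...)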

@BioTurboNick
Author

Thank you, I appreciate the additional information. The other issues you identified were ancillary - I was just removing lines of code until the original error disappeared.

set_runtime_activity seems like a workaround, based on what the documentation says about it. What I'm having trouble with is that the error is inside MLUtils.unsqueeze, which AFAICT is a common function in the ecosystem, not something originating in my code. So shouldn't Enzyme be able to deal with it without my intervention as a user of the ecosystem?

@BioTurboNick
Author

BioTurboNick commented Oct 4, 2024

_applyloss(model, x, y) = MLUtils.unsqueeze(y, 2)[1]
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, model, Const(evaluation_data[1]), Duplicated(evaluation_data[2], make_zero(evaluation_data[2])))

# Fixing to return a float rather than an array, it runs successfully
# ((nothing, nothing, nothing), 0.7901278956371048)

@wsmoses Flux.train!'s Enzyme extension makes both the training input and the training target data Const. Should the Flux extension actually be making the training target data Duplicated instead? It looks like you originally committed it to make them all Const.

https://github.com/FluxML/Flux.jl/blob/eece505219d1f19b08254203d6414ba03b02d0a6/ext/FluxEnzymeExt/FluxEnzymeExt.jl#L34-L35

EDIT: Or perhaps both need to be Duplicated; that fixed it for me.
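
For illustration, the "both Duplicated" variant of the call from the earlier comment would look roughly like this sketch (names reused from the MWE; this is not the Flux extension's actual code):

x, y = evaluation_data
Enzyme.autodiff(ReverseWithPrimal, _applyloss, Active, model,
                Duplicated(x, make_zero(x)),
                Duplicated(y, make_zero(y)))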
