Correctly initializes a `SparseConnectivityTracer.GradientTracer` cache #117
Also, what AD backends is this meant to work with?
It's meant to be used with ForwardDiff; it essentially provides caches that swap between Dual numbers (which carry the value and the partial derivatives with them) and "normal" numbers. That was at least the initial intent :)
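For context, here is a minimal sketch of that intended usage (the `cache` name and sizes are illustrative, not from this thread): `get_tmp` returns a plain `Float64` buffer for ordinary inputs and a `Dual`-valued buffer when ForwardDiff is differentiating.

```julia
using PreallocationTools, ForwardDiff

cache = DiffCache(zeros(3))

buf = get_tmp(cache, 1.0)          # Float64 input -> the plain Float64 buffer
eltype(buf)                        # Float64

d = ForwardDiff.Dual(1.0, 1.0)
dbuf = get_tmp(cache, d)           # Dual input -> a Dual-valued buffer
eltype(dbuf) <: ForwardDiff.Dual   # true
```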
@ChrisRackauckas mentioned on Discourse that it should work with any differentiation tool that is based on types. My understanding is that this includes SparseConnectivityTracer's tracer types. Knowing that, I'm wondering if there is a way to avoid the uninitialized values here.
Well, the issue here is a user issue. The function being tested doesn't make sense:

```julia
function f!(y, x::Vector{T}) where T
    y .= get_tmp(cache, T)
    println(y)
    return y
end
```

This violates the whole purpose of a cache, since there isn't anything being cached. Because of this, you get an uninitialized memory error: nothing has been written to the memory that you're using. You might've intended:

```julia
function f!(y, x::Vector{T}) where T
    tmp = get_tmp(cache, T)
    tmp .= 0
    y .= tmp
    println(y)
    return y
end
```

or

```julia
function f!(y, x::Vector{T}) where T
    tmp = get_tmp(cache, T)
    tmp .= x
    y .= tmp
    println(y)
    return y
end
```

either of which is fine.
That's not very professional phrasing. Maybe you should reread your SciML Community Standards, but let's put this aside. That was only an MRE. I am well aware that the function being differentiated is useless. My application is memoization. Here's a more realistic MRE:

```julia
using DifferentiationInterface, SparseConnectivityTracer, SparseMatrixColorings
using PreallocationTools

use_sparse = false # works with `false`, crashes with `true`
backend = if use_sparse
    AutoSparse(
        AutoForwardDiff();
        sparsity_detector = TracerSparsityDetector(),
        coloring_algorithm = GreedyColoringAlgorithm()
    )
else
    AutoForwardDiff()
end

i_counter, n = 0, 3
last_x_cache, last_y_cache = DiffCache(zeros(n)), DiffCache(zeros(n))

function f!(y, x::Vector{T}) where T
    last_x, last_y = get_tmp(last_x_cache, T), get_tmp(last_y_cache, T)
    if last_x != x
        # some heavy computations here:
        y .= x.^2 .+ 3x
        global i_counter += 1
        last_x .= x
        last_y .= y
    else
        y .= last_y
    end
    return y
end

x, y, J = ones(n), zeros(n), zeros(n, n)
map(i -> jacobian!(f!, y, J, backend, x), 1:1000)
display(J) # J = diag{5, 5, 5}
@show i_counter; # executed only once, memoization works
```

If you set `use_sparse = true`, it crashes.

edit: maybe you should mention in the doc that we cannot assume that the initial value of the cache is defined.
I'm not entirely sure what was read into that, but there was no ill intent. I just meant that the function does not have a sensical definition because the caches are never written into, so the outputs are random with an undefined probability distribution. That would be considered a user error here, and the error message is correctly saying that an uninitialized value is accessed. It's the same issue as reading any other uninitialized memory.
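For concreteness, here is what that kind of access failure looks like on a plain `Vector{Any}` (a standalone REPL sketch, not the exact error from this issue):

```julia
julia> v = Vector{Any}(undef, 2)
2-element Vector{Any}:
 #undef
 #undef

julia> v[1] == 0.0
ERROR: UndefRefError: access to undefined reference
```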
No, that is incorrect. There is a fallback `Any` cache that is used for the other cases, which is how it supports sparsity tracing with Symbolics: https://github.com/adrhill/SparseConnectivityTracer.jl/blob/main/src/tracers.jl#L15. With `AbstractTracer` defined as a subtype of `Real`, it should work exactly the same way Symbolics does.
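A quick way to check that claim (a sketch; `AbstractTracer` is a non-exported internal, so this may vary across SparseConnectivityTracer versions):

```julia
using SparseConnectivityTracer

# Tracers subtype Real, so they flow through generic numeric code,
# just like Symbolics' symbolic number types do.
SparseConnectivityTracer.AbstractTracer <: Real  # true
```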
This again is a user error, not a package bug, but it's a bit more subtle. The issue is that the fallback `Any` cache is allocated `undef`, so its elements start out unassigned, and comparing an unassigned element (as `last_x != x` does on the first call) throws. To make your function robust to unassigned elements, check `isassigned` first:

```julia
julia> x = Vector{Any}(undef, 1)
1-element Vector{Any}:
 #undef

julia> isassigned(x, 1)
false
```

In other words, here's the form of the function that would correctly be memoizing without the incorrect edge cases:

```julia
function f!(y, x::Vector{T}) where T
    last_x, last_y = get_tmp(last_x_cache, T), get_tmp(last_y_cache, T)
    @assert !isempty(last_x) # need to make sure there is at least one element
    if !isassigned(last_x, 1) || last_x != x # short-circuiting || is required to avoid an undef access
        # some heavy computations here:
        y .= x.^2 .+ 3x
        global i_counter += 1
        last_x .= x
        last_y .= y
    else
        y .= last_y
    end
    return y
end
```

And that's a version that should work fine with sparsity detection, since it will not read the uninitialized contents of the cache on the first call.

Now again, in "most" scenarios your code would look like it was okay, so it might not seem like a user error, but you do have to be careful about uninitialized memory, since assuming that it will be zero is incorrect, as can be verified by something like this:

```julia
julia> for i in 1:10
           x = Vector{UInt8}(undef, 1)
           @show @inbounds x[1]
       end
#= REPL[54]:3 =# @inbounds(x[1]) = 0xf8
#= REPL[54]:3 =# @inbounds(x[1]) = 0xff
#= REPL[54]:3 =# @inbounds(x[1]) = 0x00
#= REPL[54]:3 =# @inbounds(x[1]) = 0x10
#= REPL[54]:3 =# @inbounds(x[1]) = 0x00
#= REPL[54]:3 =# @inbounds(x[1]) = 0x10
#= REPL[54]:3 =# @inbounds(x[1]) = 0xff
#= REPL[54]:3 =# @inbounds(x[1]) = 0xd0
#= REPL[54]:3 =# @inbounds(x[1]) = 0x40
#= REPL[54]:3 =# @inbounds(x[1]) = 0x00
```

Did that elaboration help?
Yes, thanks for the elaboration, it is much clearer now. In short, the memory is never initialized with the values supplied at construction.

So the issue is in the documentation, IMO. It should be mentioned in https://docs.sciml.ai/PreallocationTools/stable/#DiffCache or in https://docs.sciml.ai/PreallocationTools/stable/preallocationtools/#PreallocationTools.DiffCache that this memory is never initialized, and the values supplied at construction are simply ignored. For example, I thought that:

```julia
last_x_cache, last_y_cache = DiffCache(fill(NaN, n)), DiffCache(zeros(n))
```

would be a simple fix, but with your clarification I now know that it would not be.
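A sketch of why that sentinel would not help, assuming the fallback behavior described above (the `big(1.0)` input stands in for a tracer type, since any non-`Float64`, non-`Dual` `Real` should hit the same path):

```julia
using PreallocationTools

cache = DiffCache(fill(NaN, 3))

get_tmp(cache, 1.0)             # [NaN, NaN, NaN]: the Float64 buffer keeps the values
buf = get_tmp(cache, big(1.0))  # other Real eltypes fall back to the Any cache
isassigned(buf, 1)              # false: still unassigned, the NaN sentinel is lost
```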
**Describe the bug 🐞**

Correctly initializes a `SparseConnectivityTracer.GradientTracer` cache at the value specified at construction.

**Expected behavior**

When differentiating sparse Jacobians with DifferentiationInterface.jl and SparseConnectivityTracer, the floats and duals in a `DiffCache` are correctly initialized with the value specified at construction, but not the `SparseConnectivityTracer.GradientTracer`s (they start undefined).

**Minimal Reproducible Example 👇**

**Error & Stacktrace ⚠️**

**Environment (please complete the following information):**

- Output of `using Pkg; Pkg.status()`
- Output of `using Pkg; Pkg.status(; mode = PKGMODE_MANIFEST)`
- Output of `versioninfo()`

**Additional context**

It's for the integration of DifferentiationInterface.jl in `ModelPredictiveControl.jl`.