Wrap HyperHessians.jl as an AD backend #32

Open
Vaibhavdixit02 opened this issue Dec 24, 2021 · 4 comments
Comments

@Vaibhavdixit02
Member

Though it only focuses on Hessians, combining a gradient call from another package with the Hessian from https://github.com/KristofferC/HyperHessians.jl would be interesting for second-order methods.
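
A minimal sketch of that idea, combining a ForwardDiff gradient with a HyperHessians Hessian in one Newton step. The `HyperHessians.hessian(f, x)` call is assumed here; check HyperHessians.jl for its actual entry points.

```julia
# Sketch only: gradient from ForwardDiff, Hessian from HyperHessians,
# combined in a single damped Newton step.
using LinearAlgebra
import ForwardDiff, HyperHessians

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

x0 = [0.5, 0.5]
g  = ForwardDiff.gradient(rosenbrock, x0)    # gradient from another package
H  = HyperHessians.hessian(rosenbrock, x0)   # assumed HyperHessians API

xnew = x0 - (H + 1e-8 * I) \ g               # one damped Newton step
```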

@DanielVandH
Member

DanielVandH commented Jul 8, 2022

Were there any more thoughts on how this might best be implemented? I was thinking about how to fit this into the interface, since I need to use some of these methods. Maybe as an extra field in e.g. AutoForwardDiff to allow for a different Hessian method (e.g. hyper::Bool), though that might be a bit awkward in the code. Perhaps once your comment here SciML/Optimization.jl#314 (comment) (and Chris' following comment) is resolved down the line, this would be very convenient to add.
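
A rough sketch of what such an extra field could look like; the struct below is purely illustrative and is not the actual AutoForwardDiff definition.

```julia
# Illustrative only: a ForwardDiff-style backend type carrying a flag that
# switches Hessian computation over to HyperHessians.
struct AutoForwardDiffWithHyper
    chunksize::Union{Nothing,Int}
    hyper::Bool   # use HyperHessians for Hessians when true
end
AutoForwardDiffWithHyper(; chunksize = nothing, hyper = false) =
    AutoForwardDiffWithHyper(chunksize, hyper)
```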

@ChrisRackauckas
Member

Since it has limitations, it would be nice to have it as an option in AutoForwardDiff, like fasthes = true. We'd also have to set up the manual seeding for sparsity.
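
For illustration, the Hessian construction could then branch on such a flag. The `fasthes` field and `make_hessian` helper below are made up for this sketch, and the HyperHessians call is assumed:

```julia
import ForwardDiff, HyperHessians

# Hypothetical helper: pick the Hessian backend based on a fasthes-style flag.
function make_hessian(f, adtype)
    if adtype.fasthes
        return x -> HyperHessians.hessian(f, x)   # assumed HyperHessians entry point
    else
        return x -> ForwardDiff.hessian(f, x)
    end
end
```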

@DanielVandH
Member

> manual seeding for sparsity

What do you mean by this?

@ChrisRackauckas
Member

It refers to the way the sparse coloring is done; see https://book.sciml.ai/notes/09/.
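
Roughly, the idea from the linked notes: columns of the sparse Hessian that never share a nonzero row can be grouped under one color, and each color group gets a single seed vector, so the number of directional derivatives drops from the dimension to the number of colors. A small sketch using SparseDiffTools' coloring; the seed construction here is only for illustration:

```julia
using SparseArrays, SparseDiffTools

# Tridiagonal sparsity pattern for a 6-variable Hessian.
n = 6
pattern = spdiagm(-1 => ones(n - 1), 0 => ones(n), 1 => ones(n - 1))

colors = matrix_colors(pattern)    # e.g. [1, 2, 3, 1, 2, 3]

# One seed vector per color: the sum of the basis vectors in that color group,
# so each directional derivative recovers several Hessian columns at once.
seeds = [Float64.(colors .== c) for c in 1:maximum(colors)]
```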

@Vaibhavdixit02 Vaibhavdixit02 transferred this issue from SciML/Optimization.jl Apr 7, 2024