
Conversation


@yitan1 commented Dec 4, 2025

This closes #227 with the following changes:

  1. Add @non_differentiable for Base.promote_op
  2. Fix tensorscalar to avoid an in-place function
  3. Remove an unnecessary return from the rrule
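The in-place fix in item 2 can be illustrated with a minimal sketch (the function names here are hypothetical and not the PR's actual diff): reverse-mode AD tools such as Zygote cannot differentiate through array mutation, so an out-of-place formulation is preferred.

```julia
# Hedged sketch with hypothetical names, not the PR's actual code.
# Reverse-mode AD (e.g. Zygote) fails on setindex!-style mutation,
# so the scalar extraction is written out-of-place.

# Mutating version: writing into `dst` breaks the reverse pass.
tensorscalar_mutating!(dst, t) = (dst[] = first(t); dst)

# Out-of-place version: simply returns the single element, AD-friendly.
tensorscalar_sketch(t) = first(t)
```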

@yitan1 changed the title from "Add support for higher-order derivatives #227" to "Add support for higher-order derivatives" on Dec 4, 2025

github-actions bot commented Dec 4, 2025

Your PR no longer requires formatting changes. Thank you for your contribution!


codecov bot commented Dec 4, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.

Files with missing lines                    Coverage Δ
ext/TensorOperationsChainRulesCoreExt.jl    87.71% <100.00%> (+87.71%) ⬆️

... and 27 files with indirect coverage changes


@lkdvos (Member) left a comment


I have only one minor comment; otherwise I'm happy to merge this.

@non_differentiable TensorOperations.tensorcontract_structure(args...)
@non_differentiable TensorOperations.tensorcontract_type(args...)
@non_differentiable TensorOperations.tensoralloc_contract(args...)
@non_differentiable Base.promote_op(args...)
Member commented:

Do you know if your example also works if this isn't included? We can't really do this here since that is type piracy, so we would have to find an alternative way around this...

Suggested change
@non_differentiable Base.promote_op(args...)

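One piracy-free alternative (a sketch only, with an assumed helper name not taken from the PR) is to route the type computation through a function the package itself owns and mark that one non-differentiable, leaving Base.promote_op untouched:

```julia
using ChainRulesCore

# Hypothetical package-owned wrapper (name assumed for illustration):
# it forwards to Base.promote_op, but because the package owns this
# function, marking it @non_differentiable is not type piracy.
promote_contract_type(args...) = Base.promote_op(args...)

ChainRulesCore.@non_differentiable promote_contract_type(args...)
```

The trade-off is that all internal call sites must go through the wrapper rather than calling Base.promote_op directly.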


Development

Successfully merging this pull request may close these issues.

Support for second-order differentiation (grad of grad)
