@HyunggyuJang
Summary

Fixes a bug in `_backward_idempotent` where the same gradient was added twice to each variable's gradient set.

Problem

Lines 365-366 repeated the gradient addition that the preceding, metadata-aware addition had already performed. The duplicate entries wasted memory and risked incorrect gradient aggregation, since each variable could accumulate twice the intended gradient.

Solution

Remove the redundant gradient addition while preserving correct metadata propagation.

Fix duplicate gradient creation in the `_backward_idempotent` function, which caused memory waste and potential gradient aggregation issues.

- Remove redundant gradient addition at lines 365-366
- Preserve existing gradient addition with proper metadata handling
- Ensure single gradient per variable in idempotent operations
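The pattern behind the bug can be sketched in a toy accumulator. This is a minimal hypothetical illustration, not the project's actual code: the function names, the list-of-gradients representation, and the `upstream_grad` parameter are all assumptions made for the example.

```python
def backward_idempotent_buggy(variables, upstream_grad):
    """Sketch of the buggy shape: the same gradient is appended twice,
    so later aggregation (e.g. summing) would double every contribution."""
    grads = {}
    for var in variables:
        # First addition: stands in for the correct, metadata-aware addition.
        grads.setdefault(var, []).append(upstream_grad)
        # Second addition: the redundant duplicate this PR removes.
        grads[var].append(upstream_grad)
    return grads


def backward_idempotent_fixed(variables, upstream_grad):
    """Fixed shape: exactly one gradient entry per variable."""
    grads = {}
    for var in variables:
        grads.setdefault(var, []).append(upstream_grad)
    return grads


buggy = backward_idempotent_buggy(["x", "y"], 1.0)
fixed = backward_idempotent_fixed(["x", "y"], 1.0)
print(len(buggy["x"]), len(fixed["x"]))  # 2 1
```

Summing the buggy gradient set would yield 2.0 instead of 1.0 for each variable, which is the "gradient aggregation issue" the fix guards against.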