(See `Liger-Kernel/src/liger_kernel/ops/swiglu.py`, lines 59–60 at commit 58fd2bc.)
Let's take a custom autograd function:
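The original snippet did not survive extraction. A minimal sketch, following the standard PyTorch custom-`Function` pattern for an exponential (the key detail being that it saves its *output*, not its input, for backward):

```python
import torch


class Exponential(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        result = x.exp()
        # The *output* is saved for backward, not the input.
        ctx.save_for_backward(result)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        (result,) = ctx.saved_tensors
        # d/dx exp(x) = exp(x), so the saved output is exactly the local gradient.
        return grad_output * result


x = torch.tensor([0.0, 1.0], requires_grad=True)
Exponential.apply(x).sum().backward()
# x.grad equals exp(x), computed from the saved output
```

Saving the output here is a deliberate optimization: recomputing `x.exp()` in backward would be wasted work. But it means correctness depends on that saved output tensor not being mutated before backward runs.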
and if we have an op like swiglu that modifies its inputs in its backward pass:
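To make the failure mode concrete, here is a hypothetical pure-PyTorch sketch. `MutatingMul` stands in for the swiglu kernel's backward, which reuses its saved input buffers as scratch space; the name and the `.data` mutation are illustrative, not code from the repo (a Triton kernel writing to the same pointer has the same effect of bypassing autograd's version counter):

```python
import torch


class Exponential(torch.autograd.Function):
    # Same standard pattern as above: saves its *output* for backward.
    @staticmethod
    def forward(ctx, x):
        result = x.exp()
        ctx.save_for_backward(result)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        (result,) = ctx.saved_tensors
        return grad_output * result


class MutatingMul(torch.autograd.Function):
    # Hypothetical stand-in for the swiglu op: its backward overwrites
    # one of its input buffers in place.
    @staticmethod
    def forward(ctx, a, b):
        ctx.save_for_backward(a, b)
        return a * b

    @staticmethod
    def backward(ctx, grad_output):
        a, b = ctx.saved_tensors
        grad_a = grad_output * b
        grad_b = grad_output * a
        # In-place overwrite of the input buffer via .data, so autograd's
        # version counter never notices the mutation.
        a.data.zero_()
        return grad_a, grad_b


x = torch.tensor([1.0], requires_grad=True)
w = torch.tensor([2.0], requires_grad=True)
out = MutatingMul.apply(Exponential.apply(x), w)
out.backward()
# MutatingMul.backward runs first and zeroes the very tensor Exponential
# saved, so x.grad comes out as 0 instead of w * exp(x) -- silently wrong.
print(x.grad)
```

No error is raised: because the mutation sidesteps the version counter, PyTorch cannot detect that `Exponential`'s saved output was clobbered, and the wrong gradient propagates silently.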
Now, during backprop, we would see incorrect behaviour, right? Because the custom autograd function `Exponential` saves the output for backprop here instead of saving the input, and the swiglu backward mutates that tensor before `Exponential`'s backward gets to use it.