
Conversation

samanklesaria
Collaborator

Now that the deprecated functionality has been removed, we can lift the restrictions placed on the test suite when it runs during CI. This PR depends on the functionality in the pseudo_main branch.


pytorch-bot bot commented Aug 20, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/audio/4061

Note: Links to docs will display an error until the docs builds have been completed.

❌ 8 New Failures

As of commit e45fe5e with merge base a645da6:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed label Aug 20, 2025
@samanklesaria
Collaborator Author

samanklesaria commented Aug 21, 2025

The TorchScript consistency tests don't seem to work with manual autograd in Python (torch.autograd.Function overloads). Pretty much all of the TorchScript consistency checks hit this issue. The only ways forward are to either:
A) investigate further whether torch.autograd.Function can be made to work with TorchScript, or
B) delete the TorchScript consistency checks.
As TorchScript is essentially deprecated anyway, I propose option (B).
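For reference, here is a minimal sketch of the kind of manual autograd overload at issue. The `Square` class and `square` function are illustrative only, not torchaudio code:

```python
import torch

# A "manual autograd" overload via torch.autograd.Function: a custom
# forward together with a hand-written backward.
class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # d/dx (x^2) = 2x
        return 2 * x * grad_out

def square(x: torch.Tensor) -> torch.Tensor:
    return Square.apply(x)

# Eager mode works, including the custom backward:
x = torch.tensor(3.0, requires_grad=True)
square(x).backward()  # x.grad is now 6.0
# Per the discussion above, torch.jit.script(square) is where things
# break: the consistency tests fail on functions that route through
# autograd.Function.apply.
```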

@pearu
Collaborator

pearu commented Aug 28, 2025

It is possible to fix the TorchScript consistency tests by avoiding autograd functions under JIT scripting. For instance, the rnnt_loss tests pass when the following patch is applied:

diff --git a/src/torchaudio/functional/functional.py b/src/torchaudio/functional/functional.py
index 5d4284d5..a72c6091 100644
--- a/src/torchaudio/functional/functional.py
+++ b/src/torchaudio/functional/functional.py
@@ -1821,7 +1821,13 @@ def _rnnt_loss(
     if blank < 0:  # reinterpret blank index if blank < 0.
         blank = logits.shape[-1] + blank
 
-    costs = RnntLoss.apply(
+    if torch.jit.is_scripting():
+        # TorchScript does not support autograd functions
+        func = torch.ops.torchaudio.rnnt_loss_forward
+    else:
+        func = RnntLoss.apply
+
+    costs = func(
         logits,
         targets,
         logit_lengths,
@@ -1831,6 +1837,9 @@ def _rnnt_loss(
         fused_log_softmax
     )
 
+    if torch.jit.is_scripting():
+        costs = costs[0]
+
     if reduction == "mean":
         return costs.mean()
     elif reduction == "sum":
