Commit c141e0a

Update attention_processor.py
1 parent 2ff4275 commit c141e0a

File tree

1 file changed: +1, -0 lines changed


src/diffusers/models/attention_processor.py

Lines changed: 1 addition & 0 deletions
@@ -26,6 +26,7 @@
 
 if is_torch_xla_available():
     from torch_xla.experimental.custom_kernel import flash_attention
+    import torch_xla.distributed.spmd as xs
     import torch_xla.runtime as xr
     XLA_AVAILABLE = True
 else:
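
Note: the commit itself only adds the torch_xla.distributed.spmd import; the diff does not show where the xs alias is used later in the file. Below is a minimal, hypothetical sketch of the standard torch_xla SPMD workflow such an import enables (device mesh construction and tensor sharding). The tensor shape, mesh layout, and axis names are illustrative assumptions, not taken from this commit.

# Hypothetical usage sketch, not part of the commit: typical use of
# torch_xla's SPMD API (imported above as `xs`) to shard a tensor.
import numpy as np
import torch
import torch_xla.core.xla_model as xm
import torch_xla.distributed.spmd as xs
import torch_xla.runtime as xr

xr.use_spmd()  # switch the XLA runtime into SPMD mode

# Build a 1D device mesh over all available XLA devices; the axis name is illustrative.
num_devices = xr.global_runtime_device_count()
mesh = xs.Mesh(np.arange(num_devices), (num_devices,), ("data",))

# Example (batch, sequence, hidden) tensor; the shape is made up for illustration.
hidden_states = torch.randn(8, 77, 768).to(xm.xla_device())

# Shard the batch dimension across the "data" mesh axis; other dims stay replicated.
xs.mark_sharding(hidden_states, mesh, ("data", None, None))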

0 commit comments
