Issues: linkedin/Liger-Kernel
Open issues:
#250: AutoLigerKernelForCausalLM.from_pretrained discards hub_kwargs_names (opened Sep 16, 2024 by uris-opti)
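The title of #250 describes a common wrapper pitfall: a `from_pretrained` override that separates Hub download kwargs (e.g. `cache_dir`, `revision`) from model kwargs but then fails to forward the Hub subset. A minimal, hypothetical illustration of the splitting pattern follows; the names mirror the issue title and are not Liger-Kernel's actual internals.

```python
# Hypothetical sketch of the pattern behind #250: kwargs passed to a
# from_pretrained wrapper are split into Hub kwargs and model kwargs.
# If the Hub subset is discarded instead of forwarded, options like
# `revision` silently stop working. Names here are illustrative only.
HUB_KWARGS_NAMES = ["cache_dir", "force_download", "revision", "token"]

def split_kwargs(kwargs):
    """Pop Hub-specific kwargs out of `kwargs`; both dicts must be forwarded."""
    hub_kwargs = {k: kwargs.pop(k) for k in list(kwargs) if k in HUB_KWARGS_NAMES}
    return hub_kwargs, kwargs

hub, model = split_kwargs({"revision": "main", "torch_dtype": "bfloat16"})
# hub   -> {"revision": "main"}
# model -> {"torch_dtype": "bfloat16"}
```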
#236: Benchmarking phi3 on a single A100 40GB GPU: unable to reproduce benchmark results (opened Sep 9, 2024 by cosmicBboy)
#177: Division by zero when total_n_ignore = 0 [bug, p0] (opened Aug 30, 2024 by 44670)
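The failure mode named in #177 typically arises in a mean-reduced loss: the sum of per-token losses is divided by the count of non-ignored tokens, which is zero when every target equals `ignore_index`. A minimal sketch of the bug and a guard, assuming a cross-entropy-style loss (the function and variable names here are illustrative, not Liger-Kernel's actual code):

```python
# Illustrative sketch of the #177 failure mode: dividing a summed loss by
# the non-ignored-token count, which can be zero if the whole batch is
# ignored. Guarding the denominator keeps the loss finite.
import torch
import torch.nn.functional as F

def mean_cross_entropy(logits, targets, ignore_index=-100):
    total = F.cross_entropy(
        logits, targets, ignore_index=ignore_index, reduction="sum"
    )
    n_non_ignore = (targets != ignore_index).sum()
    # Guard: clamp the denominator to at least 1 so an all-ignored batch
    # yields a 0.0 loss instead of a NaN/inf from dividing by zero.
    return total / torch.clamp(n_non_ignore, min=1)

logits = torch.randn(4, 10)
targets = torch.full((4,), -100, dtype=torch.long)  # every token ignored
loss = mean_cross_entropy(logits, targets)  # finite (0.0), not NaN
```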
#131: [feat] Add support for encoder-only transformers (e.g. BERT) [feature] (opened Aug 27, 2024 by OxxoCodes)
#129: [feat] Support for DeepseekV2 [feature, help wanted, huggingface] (opened Aug 27, 2024 by tmm1)
#126: [AMD] Implement Flash Attention in Triton so transformers can run with Flash Attention on AMD GPUs [AMD, feature] (opened Aug 27, 2024 by ByronHsu)