
[PyTorch] Enable generic QK norm support (+ RMSNorm/LayerNorm) #1966


Merged: 10 commits merged into NVIDIA:main on Jul 25, 2025

Conversation

@negvet (Collaborator) commented Jul 18, 2025

Description

For training stabilization purposes, QK tensors may be normalized.

  • Enables generic QK norm support.
  • Enables RMSNorm/LayerNorm as normalization types (in addition to L2Normalization).
  • Enables QK normalization either before or after RoPE (covering both the Qwen and Llama approaches); see the sketches below.

Extension of #1864
Fixes #1958
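
For illustration, here is a minimal PyTorch sketch of what "generic QK norm" means: the query/key tensors pass through a configurable per-head normalization before attention. The factory below (`make_qk_norm`, `L2Norm`) is a hypothetical stand-in, not the API added by this PR:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class L2Norm(nn.Module):
    """Parameter-free L2 normalization over the head dimension."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(x, p=2.0, dim=-1)

def make_qk_norm(kind: str, head_dim: int) -> nn.Module:
    # Hypothetical factory, not this PR's API: builds one of the three
    # normalization types the PR enables for Q/K tensors.
    if kind == "RMSNorm":
        return nn.RMSNorm(head_dim)   # requires torch >= 2.4
    if kind == "LayerNorm":
        return nn.LayerNorm(head_dim)
    if kind == "L2Normalization":
        return L2Norm()
    raise ValueError(f"unknown QK norm type: {kind!r}")

# q, k: [batch, num_heads, seq_len, head_dim]
q = torch.randn(2, 8, 16, 64)
k = torch.randn(2, 8, 16, 64)
qk_norm = make_qk_norm("RMSNorm", head_dim=64)
q, k = qk_norm(q), qk_norm(k)  # normalized over head_dim, per head
```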

Type of change

  • Documentation change (change only to the documentation, either a fix or a new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactoring

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@Marks101 (Contributor) left a comment

Great, thanks for looking into this so quickly. This all looks good, but:

I have to admit that there is one more thing I just noticed: the Qwen3 models apply QK normalization before RoPE (see here), in contrast to this implementation, which is based on Llama4. I was not aware that there are two different formulations for this. Sorry for that.
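
For reference, the two formulations differ only in where the normalization sits relative to the rotary embedding. A minimal sketch, with `qk_norm` and `apply_rope` as generic stand-ins (not this repo's functions):

```python
# Qwen3-style: normalize first, then apply rotary embeddings.
q, k = qk_norm(q), qk_norm(k)
q, k = apply_rope(q, k, rope_freqs)

# Llama4-style: apply rotary embeddings first, then normalize.
q, k = apply_rope(q, k, rope_freqs)
q, k = qk_norm(q), qk_norm(k)
```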

@negvet changed the title from "Support RMSNorm for QK" to "[PyTorch] Enable generic QK norm support (+ RMSNorm/LayerNorm)" on Jul 22, 2025
@negvet (Collaborator, Author) commented Jul 22, 2025

> I have to admit that there is one more thing I just noticed: the Qwen3 models apply QK normalization before RoPE (see here), in contrast to this implementation, which is based on Llama4. I was not aware that there are two different formulations for this. Sorry for that.

This flexibility might be worth it, so I added support for it.
It brings some complexity and an additional argument, but it is manageable, I guess.
Let's give it a review.
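
To make the extra argument concrete, here is a hypothetical call site; the keyword names below are illustrative guesses, so check the merged diff for the actual API:

```python
from transformer_engine.pytorch import MultiheadAttention

# Hypothetical keyword arguments, not the confirmed API of this PR:
attn = MultiheadAttention(
    hidden_size=4096,
    num_attention_heads=32,
    qk_norm_type="RMSNorm",      # or "LayerNorm" / "L2Normalization"
    qk_norm_before_rope=True,    # True = Qwen3-style, False = Llama4-style
)
```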

@negvet (Collaborator, Author) commented Jul 22, 2025

/te-ci pytorch

@negvet marked this pull request as ready for review on July 22, 2025, 09:05
@negvet requested a review from Marks101 on July 22, 2025, 09:05
@Marks101 (Contributor) left a comment

Great, that was super quick! This looks very good to me 🥳

@negvet (Collaborator, Author) commented Jul 24, 2025

/te-ci pytorch

@timmoon10 (Collaborator) left a comment

LGTM

@negvet merged commit 374849e into NVIDIA:main on Jul 25, 2025
10 of 12 checks passed