
Conversation

Conarnar (Contributor)

Summary

Added JavaScript bindings for the tokenizer library so that tokenizers can be used when running LLMs in a web browser.

Test plan

I will add end-to-end tests later.


pytorch-bot (bot) commented Aug 21, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/13566

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit 9a8993c with merge base bbc281f:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla bot added the CLA Signed label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) Aug 21, 2025

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example:
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@Conarnar requested a review from JacobSzwejbka August 21, 2025 00:35
@@ -871,6 +871,10 @@ if(EXECUTORCH_BUILD_WASM)
add_subdirectory(${CMAKE_CURRENT_SOURCE_DIR}/extension/wasm)
endif()

if(EXECUTORCH_BUILD_TOKENIZERS_WASM)
Contributor


I wonder if we have too many cmake options lmao.

@JacobSzwejbka (Contributor) left a comment


I know you've tested this in the demo repo and the tokenizer unit tests are a bit of a pain to set up because they are a submodule.


EMSCRIPTEN_BINDINGS(TokenizerModule) {
#define JS_BIND_TOKENIZER(NAME) \
class_<JsTokenizer<NAME>>(#NAME) \
Contributor


Can you measure the code size of this lib? I'm curious how bad it is having to individually bind each template instantiation.

@Conarnar (Contributor, Author) Aug 25, 2025


102K js, 1.4M wasm.
Changing JsTokenizer to not be a template, so that only one class needs to be bound, reduces the js to 101K.
Removing everything except SentencePiece leaves 99K js and 807K wasm, so it seems most of the code size comes from including each of the different tokenizer types.
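
For context, a minimal sketch of the non-template approach measured above: one type-erased JsTokenizer class wraps whichever concrete tokenizer was constructed, so Embind only has to bind a single class instead of one per template instantiation. The ITokenizer interface and its method signatures are illustrative assumptions, not the actual tokenizers-library API; only the Embind calls (class_, function, register_vector) are real.

// Sketch only: ITokenizer is a hypothetical stand-in for whatever common
// interface the concrete tokenizer types share.
#include <emscripten/bind.h>

#include <cstdint>
#include <memory>
#include <string>
#include <vector>

using namespace emscripten;

// Hypothetical common interface over the concrete tokenizer types
// (SentencePiece, Tiktoken, HFTokenizer, ...).
struct ITokenizer {
  virtual ~ITokenizer() = default;
  virtual std::vector<uint32_t> encode(const std::string& text) = 0;
  virtual std::string decode(const std::vector<uint32_t>& tokens) = 0;
};

// One non-template wrapper: only this class needs Embind bindings,
// regardless of how many tokenizer types are compiled in.
class JsTokenizer {
 public:
  explicit JsTokenizer(std::unique_ptr<ITokenizer> impl)
      : impl_(std::move(impl)) {}
  std::vector<uint32_t> encode(const std::string& text) {
    return impl_->encode(text);
  }
  std::string decode(const std::vector<uint32_t>& tokens) {
    return impl_->decode(tokens);
  }

 private:
  std::unique_ptr<ITokenizer> impl_;
};

EMSCRIPTEN_BINDINGS(TokenizerModule) {
  // Expose std::vector<uint32_t> so encode()'s result is usable from JS.
  register_vector<uint32_t>("TokenVector");
  class_<JsTokenizer>("Tokenizer")
      .function("encode", &JsTokenizer::encode)
      .function("decode", &JsTokenizer::decode);
}

Consistent with the numbers above, collapsing to one bound class mainly trims the JS glue; the wasm stays large because the tokenizer implementations themselves are still compiled in.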

@mergennachin (Contributor)

What are the pros and cons of putting this here vs. in the https://github.com/meta-pytorch/tokenizers library directly?

@Conarnar (Contributor, Author)

> What are the pros and cons of putting this here vs. in the https://github.com/meta-pytorch/tokenizers library directly?

Pros:

  • Puts all the Wasm modules together.

Cons:

  • Separated from the rest of the Tokenizers libraries, tests, resources, etc.

@JacobSzwejbka (Contributor)

> What are the pros and cons of putting this here vs. in the https://github.com/meta-pytorch/tokenizers library directly?

I have a strong preference for keeping all the wasm code colocated right now, since it's in a more exploratory phase of development.

@Conarnar merged commit 5f2b1d3 into pytorch:main on Aug 25, 2025
103 of 104 checks passed