# Tokenizer

This is an example tokenizer that demonstrates how to tokenize text using a GPT-2-style byte-pair encoding (BPE) approach.
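To illustrate the idea, here is a minimal sketch of BPE-style merging: text starts as individual characters (GPT-2 actually operates on bytes) and adjacent pairs are merged greedily according to a learned priority table. The `merges` table below is a toy example, not GPT-2's real merge list.

```python
def get_pairs(tokens):
    """Return the set of adjacent token pairs in the sequence."""
    return set(zip(tokens, tokens[1:]))

def bpe_encode(text, merges):
    """Greedily apply merge rules in priority order, as BPE does.

    `merges` maps a (left, right) pair to its rank; lower rank merges first.
    """
    tokens = list(text)  # start from single characters (GPT-2 uses bytes)
    while True:
        candidates = [p for p in get_pairs(tokens) if p in merges]
        if not candidates:
            break  # no mergeable pair left
        best = min(candidates, key=lambda p: merges[p])
        merged, i = [], 0
        while i < len(tokens):
            # fuse every occurrence of the chosen pair in one pass
            if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == best:
                merged.append(tokens[i] + tokens[i + 1])
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens

# Toy merge table for demonstration only
merges = {("l", "o"): 0, ("lo", "w"): 1, ("e", "r"): 2}
print(bpe_encode("lower", merges))  # ['low', 'er']
```

Note how the subword `low` emerges from two successive merges, which is how BPE builds a vocabulary of frequent fragments from character-level input.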

## Installation

To use this tokenizer, you need to have Python installed. You can install the required dependencies by running:

```
pip install -r requirements.txt
```