Tokenizer

This is an example tokenizer that demonstrates how to tokenize text using a GPT-2-like (byte-level BPE) approach.
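
Example

Since the repository's API is not documented here, the sketch below only illustrates the byte-level BPE idea behind GPT-2's tokenizer in plain Python. The function names (train_bpe, encode) are illustrative assumptions, not this repository's actual interface, and the merge application in encode is a simplification of GPT-2's rank-based merging.

# Minimal sketch of GPT-2-style byte-level BPE (illustrative only;
# train_bpe/encode are assumed names, not this repo's API).
from collections import Counter

def train_bpe(text: str, num_merges: int):
    """Learn up to `num_merges` byte-pair merges from `text`."""
    ids = list(text.encode("utf-8"))  # start from raw UTF-8 bytes, as GPT-2 does
    merges = {}                       # (id_a, id_b) -> new token id
    next_id = 256                     # ids 0..255 are reserved for raw bytes
    for _ in range(num_merges):
        pairs = Counter(zip(ids, ids[1:]))
        if not pairs:
            break
        pair = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges[pair] = next_id
        # Replace every occurrence of the pair with the new token id.
        merged, i = [], 0
        while i < len(ids):
            if i + 1 < len(ids) and (ids[i], ids[i + 1]) == pair:
                merged.append(next_id)
                i += 2
            else:
                merged.append(ids[i])
                i += 1
        ids = merged
        next_id += 1
    return merges

def encode(text: str, merges: dict):
    """Tokenize text by applying the learned merges in the order they were learned
    (a simplification of GPT-2's lowest-rank-first merging)."""
    ids = list(text.encode("utf-8"))
    for pair, new_id in merges.items():
        merged, i = [], 0
        while i < len(ids):
            if i + 1 < len(ids) and (ids[i], ids[i + 1]) == pair:
                merged.append(new_id)
                i += 2
            else:
                merged.append(ids[i])
                i += 1
        ids = merged
    return ids

if __name__ == "__main__":
    merges = train_bpe("low lower lowest", num_merges=10)
    print(encode("low", merges))  # prints a short list of token ids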

Installation

To use this tokenizer, you need Python installed. Install the required dependencies by running:

pip install -r requirements.txt
