# llm-jp-eval-inference

[ English | 日本語 ]

This repository primarily provides implementations of fast batch inference for llm-jp-eval using the following libraries. For installation and instructions on running inference, please refer to the README.md within each module.

In addition, a tool for run management using Weights & Biases is published in wandb_run_management.

## Running Inference and Evaluation

Please refer to the Inference Execution Method and Evaluation Method sections in llm-jp-eval.