The following environment variables let you customize LLMLean.
Example for a cloud API (Together.ai or OpenAI); a sample shell setup is shown after the list:
LLMLEAN_API
: `together` to use the Together.ai API
: `openai` to use the OpenAI API
LLMLEAN_API_KEY
: API key
  - E.g. Together API key, or OpenAI API key
LLMLEAN_ENDPOINT
: API endpoint
  - E.g. `https://api.together.xyz/v1/completions` for Together API
LLMLEAN_PROMPT
: `fewshot` for base models
: `instruction` for instruction-tuned models
LLMLEAN_MODEL
: Model name
  - Example for Together API: `mistralai/Mixtral-8x7B-Instruct-v0.1`
  - Example for OpenAI: `gpt-4o`
LLMLEAN_NUMSAMPLES
: Number of samples
  - Example: `10`
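
A minimal sketch of how the cloud settings above might be exported in a shell before starting the editor or build; the API key value is a placeholder, and `instruction` is chosen here only because Mixtral-8x7B-Instruct is an instruction-tuned model:

```bash
# Illustrative cloud setup for Together.ai; the API key is a placeholder.
export LLMLEAN_API="together"
export LLMLEAN_API_KEY="your-together-api-key"
export LLMLEAN_ENDPOINT="https://api.together.xyz/v1/completions"
export LLMLEAN_PROMPT="instruction"
export LLMLEAN_MODEL="mistralai/Mixtral-8x7B-Instruct-v0.1"
export LLMLEAN_NUMSAMPLES="10"
```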
Example for ollama (the default); a sample shell setup is shown after the list:

LLMLEAN_API
: `ollama` to use ollama (default)
LLMLEAN_ENDPOINT
: With ollama it is `http://localhost:11434/api/generate`
LLMLEAN_PROMPT
: `fewshot` for base models
: `instruction` for instruction-tuned models
LLMLEAN_MODEL
: Model name
  - Example: `solobsd/llemma-7b`
LLMLEAN_NUMSAMPLES
: Number of samples
  - Example: `10`
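
Similarly, a minimal sketch of the ollama settings above as shell exports; since ollama is the default API, these lines mainly make the settings explicit, and `fewshot` is chosen because `solobsd/llemma-7b` is a base model:

```bash
# Illustrative local setup using ollama (the default API).
export LLMLEAN_API="ollama"
export LLMLEAN_ENDPOINT="http://localhost:11434/api/generate"
export LLMLEAN_PROMPT="fewshot"
export LLMLEAN_MODEL="solobsd/llemma-7b"
export LLMLEAN_NUMSAMPLES="10"
```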