uclh-criu/ari-llm

ari-llm

Simple local scripts to:

  • read prompt .txt files from prompts/
  • run them with either Hugging Face or Ollama
  • append results to a CSV
  • print saved results in the terminal

Files

  • script.py: main runner (generation + CSV logging)
  • results.py: prints the prompt_file and output columns from the CSV
  • requirements.txt: Python dependencies
  • prompts/: input prompt text files
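
A minimal sketch of the runner loop in script.py, under stated assumptions: the generate() helper here is a hypothetical stand-in for the Hugging Face / Ollama call, and the CSV columns prompt_file and output match what results.py reads.

```python
import csv
from pathlib import Path

PROMPTS_DIR = Path("prompts")
OUTPUT_CSV = "ari_llm_outputs.csv"

def generate(prompt: str) -> str:
    # Hypothetical placeholder for the provider call
    # (Hugging Face pipeline or Ollama HTTP API).
    return "example output"

def run_all() -> None:
    rows = []
    for prompt_file in sorted(PROMPTS_DIR.glob("*.txt")):
        prompt = prompt_file.read_text(encoding="utf-8")
        rows.append({"prompt_file": prompt_file.name,
                     "output": generate(prompt)})
    # Append rather than overwrite, so repeated runs accumulate in one CSV.
    write_header = not Path(OUTPUT_CSV).exists()
    with open(OUTPUT_CSV, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["prompt_file", "output"])
        if write_header:
            writer.writeheader()
        writer.writerows(rows)
```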

Install

pip install -r requirements.txt

Configure

Edit the variables at the top of script.py:

  • PROVIDER: "hf" or "ollama"
  • MODEL_ID: model name for the selected provider
  • USE_STRUCTURED_OUTPUT: True/False (used only by the Ollama path)
  • PROMPTS_DIR: input directory (default: prompts)
  • OUTPUT_CSV: output file (default: ari_llm_outputs.csv)
  • MAX_NEW_TOKENS: generation budget
  • TEMPERATURE: sampling temperature (forced to 0.0 when structured output is enabled on the Ollama path)
  • OLLAMA_URL: Ollama server URL
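
For orientation, the configuration block at the top of script.py looks roughly like this; the values shown are illustrative, not the repository's actual defaults.

```python
# Illustrative configuration sketch; edit the real values in script.py.
PROVIDER = "ollama"            # "hf" or "ollama"
MODEL_ID = "llama3"            # model name for the selected provider
USE_STRUCTURED_OUTPUT = True   # only honoured on the Ollama path
PROMPTS_DIR = "prompts"
OUTPUT_CSV = "ari_llm_outputs.csv"
MAX_NEW_TOKENS = 256
TEMPERATURE = 0.7              # forced to 0.0 for structured Ollama output
OLLAMA_URL = "http://localhost:11434"
```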

Run generation

python script.py

View results

python results.py
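
The core of results.py can be sketched as follows, assuming the CSV columns written by script.py are prompt_file and output:

```python
import csv

def print_results(csv_path: str = "ari_llm_outputs.csv") -> None:
    # Print each saved result, labelled by the prompt file that produced it.
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            print(f"--- {row['prompt_file']} ---")
            print(row["output"])
```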

About

Simple scripts to push prompts through a local LLM in the TRE
