Headlong is a framework for human users to create and curate high-quality chain-of-thought datasets and use them in AI agents.
The webapp frontend is in packages/webapp; it's a Vite TypeScript project.
The webapp depends on a thought server (found in packages/thought_server), which is written in Python and wraps LLMs for thought generation.
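For illustration only, here is a rough sketch of how a TypeScript client (e.g. the webapp) might request a generated thought over HTTP; the port, route, and payload shape below are assumptions, not the thought server's actual API:

```ts
// Hypothetical sketch only: the URL, route, and JSON shapes are assumptions,
// not the real thought_server API -- see packages/thought_server for the actual routes.
const THOUGHT_SERVER_URL = "http://localhost:8000"; // assumed local port

async function generateThought(history: string[]): Promise<string> {
  const res = await fetch(`${THOUGHT_SERVER_URL}/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ thoughts: history }),
  });
  if (!res.ok) throw new Error(`thought server returned ${res.status}`);
  const data = await res.json();
  return data.thought; // assumed response field
}
```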
The environment is in packages/env; it's a Node daemon written in TypeScript. You should run it in a Docker container or on an EC2 instance.
The environment uses GPT-4 function calling to use tools, including a terminalServer that itself wraps ht (a headless terminal).
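The env daemon's real tool definitions live in packages/env; purely as a sketch of the pattern, here is minimal GPT-4 function calling with the OpenAI Node SDK, where the `runTerminalCommand` tool name and schema are made-up stand-ins for the actual terminalServer tool:

```ts
import OpenAI from "openai";

const openai = new OpenAI(); // expects OPENAI_API_KEY in the environment

// Made-up stand-in for the real terminalServer tool exposed by packages/env.
const tools: OpenAI.Chat.Completions.ChatCompletionTool[] = [
  {
    type: "function",
    function: {
      name: "runTerminalCommand",
      description: "Run a shell command in the headless terminal (ht)",
      parameters: {
        type: "object",
        properties: { command: { type: "string" } },
        required: ["command"],
      },
    },
  },
];

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "List the files in the current directory." }],
    tools,
  });

  // When the model decides to use a tool, its arguments arrive as a JSON string.
  const call = completion.choices[0].message.tool_calls?.[0];
  if (call) {
    const { command } = JSON.parse(call.function.arguments);
    console.log("model asked to run:", command); // the env daemon would forward this to the terminalServer
  }
}

main();
```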
The webapp communicates with the environment via a Supabase thoughts table and Supabase's realtime system.
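As a minimal sketch of that wiring (assuming the thoughts table lives in the public schema and a standard supabase-js v2 client), a listener on either side can subscribe to new rows like this:

```ts
import { createClient } from "@supabase/supabase-js";

// Placeholder credentials -- substitute your own Supabase project URL and anon key.
const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR_ANON_KEY");

// React whenever the other side inserts a new thought row.
supabase
  .channel("thoughts-inserts")
  .on(
    "postgres_changes",
    { event: "INSERT", schema: "public", table: "thoughts" },
    (payload) => {
      console.log("new thought row:", payload.new);
    }
  )
  .subscribe();
```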
### == ht ==
# Download the latest ht binary from https://github.com/andyk/ht/releases/latest
# and make sure it is on your PATH.
### == thought server ==
# in a new terminal
cd packages/thought_server
# You need Python >= 3.10 since we use the `match` syntax.
virtualenv venv
. ./venv/bin/activate
pip install -r requirements.txt
# Make sure you create or obtain a copy of thinkers.yaml and put it in ./
. ./launch.sh
### == headlong UI webapp ==
cd packages/webapp
npm install
npm run dev
## By default, your webapp will connect to the main env running on EC2
## via Supabase realtime. If you want to override that and use a local
## env, you'll need to run the terminalServer and env daemon locally.
## We strongly recommend running these in a Docker container.
### == terminal server ==
# in a new terminal
cd packages/env
npm install
npm run terminalServer
### == env daemon ==
# in a new terminal
cd packages/env
npm run env