Commit d0c308f (1 parent 8d9f089)

add configure documentation

File tree: 1 file changed (+132, −0)

CONFIGURE.md

# Configuring Durable-LLM

## Introduction

Durable-LLM supports multiple LLM providers and can be configured using environment variables or a configuration block. This document outlines the various configuration options available.

## General Configuration

You can configure Durable-LLM using a configuration block:

```ruby
Durable::Llm.configure do |config|
  # Configuration options go here
end
```
## Provider-specific Configuration

### OpenAI

To configure the OpenAI provider, you can set the following environment variables:

- `OPENAI_API_KEY`: Your OpenAI API key
- `OPENAI_ORGANIZATION`: (Optional) Your OpenAI organization ID

Alternatively, you can configure it in the configuration block:

```ruby
Durable::Llm.configure do |config|
  config.openai.api_key = 'your-api-key'
  config.openai.organization = 'your-organization-id' # Optional
end
```
### Anthropic

To configure the Anthropic provider, you can set the following environment variable:

- `ANTHROPIC_API_KEY`: Your Anthropic API key

Alternatively, you can configure it in the configuration block:

```ruby
Durable::Llm.configure do |config|
  config.anthropic.api_key = 'your-api-key'
end
```
### Hugging Face

To configure the Hugging Face provider, you can set the following environment variable:

- `HUGGINGFACE_API_KEY`: Your Hugging Face API key

Alternatively, you can configure it in the configuration block:

```ruby
Durable::Llm.configure do |config|
  config.huggingface.api_key = 'your-api-key'
end
```
### Groq

To configure the Groq provider, you can set the following environment variable:

- `GROQ_API_KEY`: Your Groq API key

Alternatively, you can configure it in the configuration block:

```ruby
Durable::Llm.configure do |config|
  config.groq.api_key = 'your-api-key'
end
```
## Using Environment Variables

You can also use environment variables to configure any provider. The format is:

```
DLLM__PROVIDER__SETTING
```

For example:

```
DLLM__OPENAI__API_KEY=your-openai-api-key
DLLM__ANTHROPIC__API_KEY=your-anthropic-api-key
```
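In a POSIX shell, these variables can be exported before starting your application (the values below are placeholders, not real keys):

```shell
# Placeholders only: substitute your actual API keys.
export DLLM__OPENAI__API_KEY=your-openai-api-key
export DLLM__ANTHROPIC__API_KEY=your-anthropic-api-key

# Verify the variable is set and visible to child processes:
echo "$DLLM__OPENAI__API_KEY"
```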
## Loading Configuration from Datasette

Durable-LLM can load configuration from an io.datasette.llm configuration file located at `~/.config/io.datasette.llm/keys.json`. If this file exists, it will be parsed and used to set API keys for the supported providers.
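As a rough sketch of the idea (not the library's actual loader; the helper name `load_datasette_keys` is hypothetical), reading such a file might look like:

```ruby
require 'json'

# Hypothetical helper, for illustration only: parse an io.datasette.llm
# keys file into a Hash mapping provider names to API keys. Returns an
# empty Hash when the file does not exist. Durable-LLM's real loader
# may behave differently.
def load_datasette_keys(path = File.expand_path('~/.config/io.datasette.llm/keys.json'))
  return {} unless File.exist?(path)
  JSON.parse(File.read(path))
end
```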
## Default Provider

You can set a default provider in the configuration:

```ruby
Durable::Llm.configure do |config|
  config.default_provider = 'openai'
end
```

The default provider is set to 'openai' if not specified.
## Supported Models

Each provider supports a set of models. You can get the list of supported models for a provider using the `models` method:

```ruby
Durable::Llm::Providers::OpenAI.models
Durable::Llm::Providers::Anthropic.models
Durable::Llm::Providers::Huggingface.models
Durable::Llm::Providers::Groq.models
```

Note that some services (Anthropic, for example) don't offer a models endpoint, so their model lists are hardcoded; others (Huggingface) have an inordinately long list, so they also use a hardcoded list, at least for now.
## Streaming Support

Some providers support streaming responses. You can check if a provider supports streaming:

```ruby
Durable::Llm::Providers::OpenAI.stream?
```
## Conclusion

By properly configuring Durable-LLM, you can easily switch between different LLM providers and models in your application. Remember to keep your API keys secure and never commit them to version control.
