`async-openai/README.md`: 81 additions & 39 deletions
@@ -17,40 +17,33 @@ project is and will be done manually when `async-openai` releases a new version.
with `async-openai` releases, which means when `async-openai` releases `x.y.z`, `async-openai-wasm` also releases
a `x.y.z` version.

-`async-openai-wasm` is an unofficial Rust library for OpenAI.
-
-- It's based on [OpenAI OpenAPI spec](https://github.com/openai/openai-openapi)
-- Current features:
-  - [x] Administration (partially implemented)
-  - [x] Assistants (beta)
-  - [x] Audio
-  - [x] Batch
-  - [x] Chat
-  - [x] ChatKit (beta)
-  - [x] Completions (legacy)
-  - [x] Conversations
-  - [x] Containers
-  - [x] Embeddings
-  - [x] Evals
-  - [x] Files
-  - [x] Fine-Tuning
-  - [x] Images
-  - [x] Models
-  - [x] Moderations
-  - [x] Realtime (partially implemented)
-  - [x] Responses
-  - [x] Uploads
-  - [x] Vector Stores
-  - [x] Videos
-  - [x] Webhooks
-  - [x] **WASM support**
-  - [x] Reasoning Model Support: support models like DeepSeek R1 via broader support for OpenAI-compatible endpoints, see `examples/reasoning`
+`async-openai-wasm` is an unofficial Rust library for OpenAI, based on [OpenAI OpenAPI spec](https://github.com/openai/openai-openapi). It implements all APIs from the spec:
+|**Administration**| Administration, Admin API Keys, Invites, Users, Projects, Project users, Project service accounts, Project API keys, Project rate limits, Audit logs, Usage, Certificates |
+|**Legacy**| Completions |
+
Features that make `async-openai` unique:
- Bring your own custom types for Request or Response objects.
- SSE streaming on available APIs
- Ergonomic builder pattern for all request objects.
- Microsoft Azure OpenAI Service (only for APIs matching OpenAI spec)

+More on `async-openai-wasm`:
+- **WASM support**
+- Reasoning Model Support: support models like DeepSeek R1 via broader support for OpenAI-compatible endpoints, see `examples/reasoning`
+
**Note on Azure OpenAI Service (AOS)**: `async-openai-wasm` primarily implements OpenAI spec, and doesn't try to
maintain parity with spec of AOS. Just like `async-openai`.
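
For illustration, the builder pattern and custom-config features listed in the hunk above typically look like the minimal sketch below. It assumes the API surface mirrors upstream `async-openai` (crate imported as `async_openai_wasm`); the module paths, type names, and model id are illustrative and can differ between released versions.

```rust
// Sketch only: paths and type names follow upstream `async-openai` conventions
// and may differ between versions (newer releases group types into submodules).
use async_openai_wasm::{
    config::OpenAIConfig,
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};

async fn say_hello(api_key: &str) -> Result<String, Box<dyn std::error::Error>> {
    // Pass the key explicitly: browser WASM has no process environment
    // to read OPENAI_API_KEY from.
    let client = Client::with_config(OpenAIConfig::new().with_api_key(api_key));

    // Every request object is assembled through a derived `...Args` builder.
    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-4o-mini")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Say hello from WASM")
            .build()?
            .into()])
        .build()?;

    let response = client.chat().create(request).await?;
    Ok(response
        .choices
        .first()
        .and_then(|choice| choice.message.content.clone())
        .unwrap_or_default())
}
```

Streaming variants of the same calls (for example a `create_stream`-style method on the chat API) return a `futures` stream of chunks instead of a single response; the repository's examples show the exact, version-accurate signatures.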
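The Reasoning Model Support bullet above comes down to pointing the client at an OpenAI-compatible base URL. A hedged sketch follows: the config builder mirrors upstream `async-openai`, and the DeepSeek URL is only an example endpoint, with `examples/reasoning` in the repository being the authoritative reference.

```rust
// Sketch only: `with_api_base`/`with_api_key` follow upstream `async-openai`'s
// OpenAIConfig builder; the base URL below is just an example endpoint.
use async_openai_wasm::{config::OpenAIConfig, Client};

fn reasoning_client(api_key: &str) -> Client<OpenAIConfig> {
    let config = OpenAIConfig::new()
        .with_api_base("https://api.deepseek.com/v1") // any OpenAI-compatible endpoint
        .with_api_key(api_key);
    Client::with_config(config)
}
```

Requests made through such a client then simply name the reasoning model in the `model` field.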
@@ -88,9 +81,9 @@ $Env:OPENAI_API_KEY='sk-...'
and [WASM examples](https://github.com/ifsheldon/async-openai-wasm/tree/main/examples) in `async-openai-wasm`.
- Visit [docs.rs/async-openai](https://docs.rs/async-openai) for docs.

-## Realtime API
+## Realtime

-Only types for Realtime API are implemented, and can be enabled with feature flag `realtime`.
+Realtime types and APIs can be enabled with feature flag `realtime`.

Again, the types are not tied to a specific WS implementation. You need to convert a client event into a WS message yourself, which is as simple as `your_ws_impl::Message::Text(some_client_event.into_text())`.
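
A concrete version of that conversion might look like the sketch below. The `ClientEvent` name and its module path are assumptions tied to the `realtime` feature, and `gloo-net` merely stands in for whichever WebSocket library you actually use.

```rust
// Sketch only: the realtime event type and path are assumptions; swap
// `gloo_net` for your own WebSocket library.
use async_openai_wasm::types::realtime::ClientEvent;
use gloo_net::websocket::Message;

fn to_ws_message(event: ClientEvent) -> Message {
    // The crate ships only the event types; serializing a client event to
    // text and wrapping it in your WS library's message type is up to you.
    Message::Text(event.into_text())
}
```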
This project adheres to [Rust Code of Conduct](https://www.rust-lang.org/policies/code-of-conduct)

+Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in async-openai by you, shall be licensed as MIT, without any additional terms or conditions.
+
## Why `async-openai-wasm`
Because I wanted to develop and release a crate that depends on the wasm feature in `experiments` branch
of [async-openai](https://github.com/64bit/async-openai), but the pace of stabilizing the wasm feature is different
from what I expected.
+- [openai-func-enums](https://github.com/frankfralick/openai-func-enums) macros for working with function/tool calls.