Releases: leafo/lua-openai
OpenRouter compatibility client
Responses API, Gemini compatibility, simplified streaming
Breaking Changes:
- `ChatSession` is no longer exported from the main module. Use `require("openai.chat_completions").ChatSession` instead.
- Default model changed from `gpt-3.5-turbo` to `gpt-4.1`.
- The `chat()` streaming callback now receives `(delta, raw)` instead of `(chunk)`. Use `create_chat_completion()` for the previous raw chunk behavior.
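A minimal migration sketch for the changes above (the `openai.new` constructor and the `chat(messages, opts, callback)` signature come from the project README; the exact shape of `raw` is assumed):

```lua
local openai = require("openai")

-- ChatSession now lives in the chat_completions submodule
local ChatSession = require("openai.chat_completions").ChatSession

local client = openai.new(os.getenv("OPENAI_API_KEY"))

-- The streaming callback now receives (delta, raw) instead of a raw chunk:
-- delta is the incremental text content, raw the parsed event it came from
client:chat({
  {role = "user", content = "Say hello"}
}, {stream = true}, function(delta, raw)
  io.write(delta)
end)
```

Code that needs the old per-chunk behavior can switch to `create_chat_completion()` instead of updating its callback.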
New Features:
- Added Responses API support:
  - `create_response(params)`: Create a new response
  - `response(response_id)`: Get a response by ID
  - `delete_response(response_id)`: Delete a response
  - `cancel_response(response_id)`: Cancel a response
- Added `ResponsesChatSession` for conversational state with the Responses API
- Added a Gemini compatibility client, `require("openai.compat.gemini")`, for the Google Gemini API
- Added `create_chat_completion()` method for raw streaming behavior
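A sketch of the new Responses API methods listed above. The method names come from these notes; the `status, result` return convention is assumed to match `chat()` from the README, and the request parameter shape follows OpenAI's Responses API:

```lua
local openai = require("openai")
local client = openai.new(os.getenv("OPENAI_API_KEY"))

-- Create a response (params shape assumed from OpenAI's Responses API)
local status, res = client:create_response({
  model = "gpt-4.1",
  input = "Write a haiku about Lua"
})

if status == 200 then
  -- Fetch the stored response by ID, then delete it
  local _, fetched = client:response(res.id)
  client:delete_response(res.id)
end
```

For multi-turn conversations, `ResponsesChatSession` plays the role that `ChatSession` does for chat completions.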
Improvements:
- SSE parsing moved to a dedicated `openai.sse` module
- Removed the `lpeg` dependency in favor of simpler line-based SSE parsing
v1.4.3
Improved support for images & assistants
New Features:
- Added support for the image generation endpoint
  - New method: `image_generation(params)`
- Added new endpoints for the Assistants API:
  - `assistants()`: Get list of assistants
  - `threads()`: Get list of threads
  - `thread_messages(thread_id)`: Get messages for a specific thread
  - `delete_thread(thread_id)`: Delete a specific thread
- Added new endpoints for managing files:
  - `files()`: Get list of files
  - `file(file_id)`: Get details of a specific file
  - `delete_file(file_id)`: Delete a specific file
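A sketch of the file-management endpoints above. The method names are from these notes; the `status, result` return convention and the `data` list shape are assumptions (matching `chat()` in the README and OpenAI's list responses, respectively):

```lua
local openai = require("openai")
local client = openai.new(os.getenv("OPENAI_API_KEY"))

-- List uploaded files, then inspect and delete the first one
local status, list = client:files()
if status == 200 and list.data and list.data[1] then
  local id = list.data[1].id
  local _, details = client:file(id)   -- full metadata for one file
  client:delete_file(id)
end
```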
Improvements:
- Updated the content format to support `image_url` input for GPT Vision
- Made authorization optional, allowing usage with local models that don't require an API key
v1.2.0 - Support for chat with functions
OpenAI allows sending a list of function declarations that the LLM can decide to call based on the prompt. The function calling interface must be used with chat completions and the `gpt-4-0613` or `gpt-3.5-turbo-0613` models or later.
This update adds support for providing a function schema when initializing the chat session object.
Function calls are then detected when parsing the response to a chat message.
See https://github.com/leafo/lua-openai/blob/main/examples/example5.lua or the README for a complete example.
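A condensed sketch of the flow, based on the README's chat-session interface (the `add` function schema is illustrative; the exact shape of the parsed function-call result may differ from what is assumed here):

```lua
local openai = require("openai")
local client = openai.new(os.getenv("OPENAI_API_KEY"))

-- Provide the function schema when initializing the chat session
local chat = client:new_chat_session({
  model = "gpt-3.5-turbo-0613",
  functions = {
    {
      name = "add",
      description = "Add two numbers together",
      parameters = {
        type = "object",
        properties = {
          a = {type = "number"},
          b = {type = "number"}
        }
      }
    }
  }
})

local res = chat:send("What is 2923 + 20839?")

-- When the model chooses to call a function, the parsed response carries
-- the function name and its JSON-encoded arguments instead of plain text
if type(res) == "table" and res.function_call then
  print(res.function_call.name, res.function_call.arguments)
end
```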
v1.1.0 - Improved error message output for bad-status errors
Returned error messages will include the message from the server if available.
Full Changelog: v1.0.0...v1.1.0
v1.0.0
Initial release
`luarocks install lua-openai`
Full Changelog: https://github.com/leafo/lua-openai/commits/v1.0.0