# llm-php-ext

High-level PHP extension for interacting with Large Language Models, built on the octolib crate. It provides structured output and tool-calling support behind a clean, object-oriented API.

## Features
- ✅ Multi-provider support: OpenAI, Anthropic, OpenRouter, Google Vertex AI, Amazon Bedrock, Cloudflare Workers AI, DeepSeek, Z.ai
- ✅ Structured output: JSON and JSON Schema validation
- ✅ Tool calling: Function definitions with auto-execution
- ✅ Fluent interface: Chainable methods for elegant code
- ✅ Builder pattern: Separate builders for complex operations
- ✅ Type safety: Strong typing throughout
- ✅ Static compilation: Can be embedded in PHP binary
- ✅ Comprehensive error handling: Custom exception hierarchy
- ✅ IDE support: Full PHPDoc stubs
## Requirements

- PHP 8.1 or later
- Rust 1.70 or later
- Clang 5.0 or later
- cargo-php (install via `cargo install cargo-php`)
## Installation

Install `cargo-php`:

```bash
cargo install cargo-php --locked
```

Build and install the extension:

```bash
cd llm-php-ext

# Build extension
cargo build --release

# Install to PHP
cargo php install

# Generate IDE stubs
cargo php stubs --stdout > php/llm.php
```

Platform-specific release builds:

```bash
# Linux
make build-linux-release

# macOS
make build-macos-release

# Windows
make build-windows-release
```

## Quick Start

```php
<?php

$llm = new LLM('openai:gpt-4o');

$messages = MessageCollection::fromArray([
    Message::user('What is PHP?')
]);

$response = $llm->complete($messages);

echo $response->getContent();
echo "Tokens used: " . $response->getUsage()->getTotalTokens();
```
## Structured Output

```php
<?php

$schema = json_encode([
    'type' => 'object',
    'properties' => [
        'name' => ['type' => 'string'],
        'age' => ['type' => 'number'],
        'skills' => ['type' => 'array', 'items' => ['type' => 'string']]
    ]
]);

$llm = new LLM('openai:gpt-4o');
$response = $llm->structured($schema)
    ->complete([Message::user('Tell me about a software engineer')]);

print_r($response->getStructured());
```
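A sketch of consuming the parsed result. The field names follow the schema above; whether `getStructured()` returns an associative array or an object is not documented here, so the array-style access is an assumption:

```php
<?php

// Sketch only: assumes $response from the example above and
// array-shaped structured output matching the schema.
$person = $response->getStructured();

echo $person['name'] . ' (' . $person['age'] . ')' . PHP_EOL;
echo 'Skills: ' . implode(', ', $person['skills']) . PHP_EOL;
```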
## Tool Calling

```php
<?php

$weatherTool = new Tool(
    'get_weather',
    'Get current weather for a location',
    [
        'type' => 'object',
        'properties' => [
            'location' => ['type' => 'string']
        ]
    ]
);

$llm = new LLM('openai:gpt-4o');
$response = $llm->withTools([$weatherTool])
    ->complete([Message::user("What's the weather in Tokyo?")]);

if ($response->hasToolCalls()) {
    foreach ($response->getToolCalls() as $call) {
        $result = getWeather($call->getArguments()['location']);
        // Continue conversation with tool result
    }
}
```
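The loop above leaves the follow-up step as a comment. One way to close the loop, sketched under two assumptions: `Message::tool()` takes the call ID and result content (as shown in the Message API below), and `getWeather()` is your own function, not part of the extension. Some providers also expect the assistant message that requested the calls to be replayed in the follow-up; check your provider's requirements.

```php
<?php

// Sketch: feed each tool result back to the model and ask it to finish.
// Assumes $llm, $weatherTool, and $response from the example above.
$followUp = [Message::user("What's the weather in Tokyo?")];

foreach ($response->getToolCalls() as $call) {
    $result = getWeather($call->getArguments()['location']); // your own function
    // Echo the result back under the originating call's ID
    $followUp[] = Message::tool($call->getId(), json_encode($result));
}

$final = $llm->withTools([$weatherTool])->complete($followUp);
echo $final->getContent();
```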
## Fluent Interface

```php
<?php

$response = (new LLM('openai:gpt-4o'))
    ->setTemperature(0.8)
    ->setMaxTokens(500)
    ->setTopP(0.9)
    ->complete([
        Message::user('Write a haiku about Rust')
    ]);

echo $response->getContent();
```

## API Reference

### LLM

Main class for interacting with language models.
#### Constructor

```php
__construct(string $model, array $options = [])
```

- `$model`: Model identifier (e.g., `'openai:gpt-4o'`, `'anthropic:claude-3-opus'`)
- `$options`: Configuration options:
  - `api_key`: API key (optional; falls back to the provider's environment variable)
  - `base_url`: Custom base URL (optional)
  - `timeout`: Request timeout in seconds (default: 30)
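Putting the constructor options together, a sketch using only the options listed above. The key lookup and base URL value are placeholders, not defaults the extension ships with:

```php
<?php

// Sketch: all three documented constructor options.
// The api_key source and base_url value are illustrative placeholders.
$llm = new LLM('openai:gpt-4o', [
    'api_key'  => getenv('OPENAI_API_KEY'),   // or omit to use the env var directly
    'base_url' => 'https://api.openai.com/v1',
    'timeout'  => 60,
]);
```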
#### Methods

```php
complete(array|MessageCollection $messages): Response
structured(?string $schema = null): StructuredBuilder
withTools(array $tools = []): ToolBuilder
withOptions(array $options): self
setTemperature(float $temperature): self
setMaxTokens(int $maxTokens): self
setTopP(float $topP): self
setFrequencyPenalty(float $penalty): self
setPresencePenalty(float $penalty): self
```

### Response

```php
$content = $response->getContent();
$usage = $response->getUsage();
$model = $response->getModel();
$finishReason = $response->getFinishReason();
$array = $response->toArray();
$json = $response->toJson();
```

Structured output responses:

```php
$content = $response->getContent();
$structured = $response->getStructured(); // Parsed JSON
$usage = $response->getUsage();
```

Tool-calling responses:

```php
$content = $response->getContent();
$toolCalls = $response->getToolCalls();
$hasTools = $response->hasToolCalls();
```

### Usage

```php
$promptTokens = $usage->getPromptTokens();
$completionTokens = $usage->getCompletionTokens();
$totalTokens = $usage->getTotalTokens();
```

### Message

```php
$userMsg = Message::user('Hello');
$assistantMsg = Message::assistant('Hi there!');
$systemMsg = Message::system('You are helpful.');
$toolMsg = Message::tool('call_123', 'Result');
```

### MessageCollection

```php
$messages = new MessageCollection();
$messages->addUser('Hello')
    ->addAssistant('Hi!')
    ->addSystem('Be helpful');

// Or from array
$messages = MessageCollection::fromArray([
    Message::user('Hello'),
    Message::assistant('Hi!')
]);
```

### Tool

```php
$tool = new Tool(
    'function_name',
    'Function description',
    [
        'type' => 'object',
        'properties' => [
            'param' => ['type' => 'string']
        ]
    ]
);
```

### ToolCall

```php
$id = $call->getId();
$name = $call->getName();
$args = $call->getArguments();
```

## Supported Providers

- OpenAI: `openai:gpt-4o`, `openai:gpt-4-turbo`, etc.
- Anthropic: `anthropic:claude-3-opus`, `anthropic:claude-3-sonnet`, etc.
- OpenRouter: `openrouter:model-name`
- Google Vertex AI: `vertex:model-name`
- Amazon Bedrock: `bedrock:model-name`
- Cloudflare Workers AI: `workers-ai:model-name`
- DeepSeek: `deepseek:deepseek-chat`
- Z.ai: `zai:model-name`
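Because the prefix before the `:` selects the backend, switching providers is just a different identifier string. A small illustrative sketch (model names taken from the list above; check each provider for current ones):

```php
<?php

// Illustrative: the provider prefix selects the backend,
// the remainder names the model.
$model = getenv('LLM_MODEL') ?: 'openai:gpt-4o';
$llm = new LLM($model);

// Swapping providers is just a different identifier:
// new LLM('anthropic:claude-3-opus');
// new LLM('deepseek:deepseek-chat');
```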
## API Keys

Set via environment variables or constructor options:

```bash
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
```

Or in code:

```php
$llm = new LLM('openai:gpt-4o', [
    'api_key' => 'sk-...'
]);
```

## Generation Parameters

```php
$llm->setTemperature(0.7)       // 0.0-2.0, default 0.7
    ->setMaxTokens(1000)        // Maximum tokens, default 1000
    ->setTopP(0.9)              // 0.0-1.0, default 1.0
    ->setFrequencyPenalty(0.0)  // -2.0-2.0, default 0.0
    ->setPresencePenalty(0.0);  // -2.0-2.0, default 0.0
```

## Error Handling

```php
try {
    $response = $llm->complete($messages);
} catch (LLMConnectionException $e) {
    // Network or API errors
    echo "Connection error: " . $e->getMessage();
} catch (LLMValidationException $e) {
    // Invalid parameters
    echo "Validation error: " . $e->getMessage();
} catch (LLMStructuredOutputException $e) {
    // Structured output errors
    echo "Structured output error: " . $e->getMessage();
} catch (LLMToolCallException $e) {
    // Tool calling errors
    echo "Tool call error: " . $e->getMessage();
} catch (LLMException $e) {
    // Generic errors
    echo "LLM error: " . $e->getMessage();
}
```

## Testing

```bash
# Run all tests
make test

# Or directly
php tests/run_tests.php
```

## Continuous Integration

This project uses GitHub Actions for continuous integration. The CI pipeline:
- ✅ Runs on every push and pull request to the `main` and `develop` branches
- ✅ Checks code formatting with `cargo fmt`
- ✅ Lints code with `cargo clippy`
- ✅ Runs Rust unit tests
- ✅ Builds the extension on Ubuntu and macOS
- ✅ Runs PHP integration tests
- ✅ Generates and verifies PHP stubs

You can run the entire CI pipeline locally before pushing:

```bash
make ci
```

This will execute all CI checks in order:
- Code formatting check
- Clippy linting
- Rust unit tests
- Extension build
- PHP integration tests
- Stub generation
For detailed CI documentation, see `CI.md`.
## Development

```bash
# Clone repository
git clone https://github.com/manticoresearch/llm-php-ext.git
cd llm-php-ext

# Install dependencies
cargo build

# Generate stubs
cargo php stubs --stdout > php/llm.php
```

### Static Linking

```bash
# Build for static linking
make static

# This creates a static library that can be embedded in PHP
```

### Cross-compilation Targets

```bash
# Linux
cargo build --release --target x86_64-unknown-linux-gnu

# macOS
cargo build --release --target aarch64-apple-darwin

# Windows
cargo build --release --target x86_64-pc-windows-msvc
```

## Troubleshooting

**Error:** `Cannot turn unknown calling convention to tokens: 20`
**Solution:** This is a bindgen issue with ext-php-rs. Try:

```bash
export LIBCLANG_PATH=$(xcrun --show-sdk-path)/usr/lib
cargo clean
cargo build
```

**Error:** `Library not loaded: @rpath/libclang.dylib`

**Solution:** Install the Xcode command line tools:

```bash
xcode-select --install
```

**Error:** `Authentication failed`

**Solution:** Set the API key via environment variable or constructor:

```bash
export OPENAI_API_KEY="sk-..."
```

**Error:** `Model not supported`

**Solution:** Check the model identifier format: `provider:model`
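You can sanity-check the identifier shape yourself before constructing the client. A plain-PHP sketch; the pattern below is a guess at the documented `provider:model` convention, not the extension's own validation:

```php
<?php

// Rough client-side check for the documented `provider:model` shape.
// This mirrors the convention above; it is NOT the extension's validator.
function looksLikeModelId(string $id): bool
{
    return (bool) preg_match('/^[a-z0-9-]+:.+$/', $id);
}

var_dump(looksLikeModelId('openai:gpt-4o'));  // bool(true)
var_dump(looksLikeModelId('gpt-4o'));         // bool(false) — missing provider prefix
```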
## Examples

See the `examples/` directory for more examples:

- `basic_completion.php` - Simple completion
- `structured_output.php` - JSON schema validation
- `tool_calling.php` - Function calling
- `fluent_interface.php` - Fluent API usage
- `multi_turn.php` - Multi-turn conversations
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Submit a pull request
## License

Apache-2.0
## Links

- GitHub Issues: https://github.com/manticoresearch/llm-php-ext/issues
- Documentation: https://github.com/manticoresearch/llm-php-ext
- octolib: https://crates.io/crates/octolib