The first production-ready integration between CopilotKit and Cloudflare Workers AI.
- ~68% faster responses than OpenAI GPT-4o (110ms vs 350ms typical)
- 93% cheaper than OpenAI GPT-4o ($11 vs $150 per million tokens)
- 200+ edge locations worldwide
- Zero cold starts with Cloudflare Workers AI
```bash
npm install copilotkit-cloudflare
# or
yarn add copilotkit-cloudflare
# or
pnpm add copilotkit-cloudflare
```

Create `app/api/copilotkit/route.ts`:

```typescript
import { createCopilotCloudflareHandler } from 'copilotkit-cloudflare';
export const POST = createCopilotCloudflareHandler({
apiToken: process.env.CLOUDFLARE_API_TOKEN!,
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
model: '@cf/meta/llama-3.1-8b-instruct', // Optional
temperature: 0.7, // Optional
debug: process.env.NODE_ENV === 'development', // Optional
});
```

Then add the chat UI to your frontend:

```tsx
import { CopilotKit } from '@copilotkit/react-core';
import { CopilotChat } from '@copilotkit/react-ui';
import '@copilotkit/react-ui/styles.css';
export default function MyApp() {
return (
<CopilotKit runtimeUrl="/api/copilotkit">
<CopilotChat
instructions="You are running on Cloudflare Workers AI at the edge!"
labels={{
title: "AI Assistant",
initial: "Hello! I'm powered by Cloudflare Workers AI. How can I help?",
}}
/>
</CopilotKit>
);
}
```

Configuration options:

```typescript
interface CopilotCloudflareConfig {
/** Cloudflare API token */
apiToken: string;
/** Cloudflare account ID */
accountId: string;
/** AI model to use (defaults to Llama 3.1 8B) */
model?: string;
/** Use Cloudflare AI Gateway for caching and observability */
useGateway?: boolean;
/** AI Gateway ID (required if useGateway is true) */
gatewayId?: string;
/** Temperature for AI responses (0-1) */
temperature?: number;
/** Maximum tokens in response */
maxTokens?: number;
/** Enable debug logging */
debug?: boolean;
}
```

For production apps, use Cloudflare AI Gateway for caching and observability:

```typescript
export const POST = createCopilotCloudflareHandler({
apiToken: process.env.CLOUDFLARE_API_TOKEN!,
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
useGateway: true,
gatewayId: process.env.CLOUDFLARE_GATEWAY_ID!,
model: '@cf/meta/llama-3.3-70b-instruct', // Use more capable model with gateway
});
```

Available Workers AI models:

| Model | Cloudflare Name | Best For |
|---|---|---|
| Llama 3.1 8B | `@cf/meta/llama-3.1-8b-instruct` | Fast general tasks |
| Llama 3.1 70B | `@cf/meta/llama-3.1-70b-instruct` | Complex reasoning |
| Llama 3.3 70B | `@cf/meta/llama-3.3-70b-instruct` | Latest features |
The package automatically maps OpenAI model names:
- `gpt-4o-mini` → `@cf/meta/llama-3.1-8b-instruct`
- `gpt-4` → `@cf/meta/llama-3.1-70b-instruct`
- `gpt-4-turbo` → `@cf/meta/llama-3.3-70b-instruct`
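As a sketch, you can keep an OpenAI-style model name in your config and rely on the mapping above (this assumes the mapping is applied to the `model` option; otherwise pass the Cloudflare name directly):

```typescript
import { createCopilotCloudflareHandler } from 'copilotkit-cloudflare';

export const POST = createCopilotCloudflareHandler({
  apiToken: process.env.CLOUDFLARE_API_TOKEN!,
  accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
  // Assumption: OpenAI names are mapped automatically, so 'gpt-4o-mini'
  // should resolve to '@cf/meta/llama-3.1-8b-instruct'.
  model: 'gpt-4o-mini',
});
```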
Create a `.env.local` file:

```bash
CLOUDFLARE_API_TOKEN=your_cloudflare_api_token
CLOUDFLARE_ACCOUNT_ID=your_cloudflare_account_id
CLOUDFLARE_GATEWAY_ID=your_gateway_id # Optional, for AI Gateway
```

Cost and latency comparison:

| Provider | Cost per 1M tokens | Typical Response Time |
|---|---|---|
| OpenAI GPT-4o | $150 | 350ms |
| Cloudflare Llama 3.1 8B | $11 | 110ms |
| Savings | 93% cheaper | 68% faster |
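For reference, the savings row follows directly from the two rows above: 1 - 11/150 ≈ 92.7% cheaper and 1 - 110/350 ≈ 68.6% faster.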
This package supports all CopilotKit request formats:
- ✅ Modern CopilotKit (1.9.x) GraphQL mutations
- ✅ Direct chat completions (see the sketch below)
- ✅ Legacy GraphQL queries
- ✅ REST API fallbacks
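As a rough illustration of the direct chat-completions path, assuming it accepts an OpenAI-style `messages` array (adjust the payload to whatever shape the handler actually expects):

```typescript
// Hypothetical direct POST to the route; the body shape is an assumption.
async function askAssistant(prompt: string) {
  const res = await fetch('/api/copilotkit', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      messages: [{ role: 'user', content: prompt }], // assumed OpenAI-style body
    }),
  });
  return res.json();
}
```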
Replace your existing CopilotKit API route:

```typescript
// Before (OpenAI): standard CopilotKit runtime with the OpenAI adapter
import { CopilotRuntime, OpenAIAdapter, copilotRuntimeNextJSAppRouterEndpoint } from '@copilotkit/runtime';

const runtime = new CopilotRuntime();
const serviceAdapter = new OpenAIAdapter();

export const POST = async (req: Request) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({ runtime, serviceAdapter, endpoint: '/api/copilotkit' });
  return handleRequest(req);
};

// After (Cloudflare)
import { createCopilotCloudflareHandler } from 'copilotkit-cloudflare';
export const POST = createCopilotCloudflareHandler({
apiToken: process.env.CLOUDFLARE_API_TOKEN!,
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
});
```

No frontend changes required!

The package also exports a health-check handler (GET) and an OPTIONS handler that you can expose from the same route file:

```typescript
import {
createCopilotCloudflareHandler,
createCopilotCloudflareHealthCheck,
createCopilotCloudflareOptions
} from 'copilotkit-cloudflare';
const config = {
apiToken: process.env.CLOUDFLARE_API_TOKEN!,
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
};
export const POST = createCopilotCloudflareHandler(config);
export const GET = createCopilotCloudflareHealthCheck(config);
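// Note: the GET export is a health check, so you can probe the deployed route
// directly, e.g. `curl https://your-app.example/api/copilotkit` (hypothetical
// URL; the response shape depends on the package).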
export const OPTIONS = createCopilotCloudflareOptions();
```

Enable debug logging:

```typescript
export const POST = createCopilotCloudflareHandler({
apiToken: process.env.CLOUDFLARE_API_TOKEN!,
accountId: process.env.CLOUDFLARE_ACCOUNT_ID!,
debug: true, // Enables request/response logging
});- "Account ID required" - Set
CLOUDFLARE_ACCOUNT_IDenvironment variable - "API token required" - Set
CLOUDFLARE_API_TOKENenvironment variable - 403 Forbidden - Check your Cloudflare API token permissions
- Model not found - Verify the model name is correct for Cloudflare
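For the first two errors, a small guard in the route file can fail fast with a clearer message (illustrative sketch only):

```typescript
// Illustrative sanity check: surface missing credentials up front instead of
// as "Account ID required" / "API token required" errors at request time.
if (!process.env.CLOUDFLARE_API_TOKEN || !process.env.CLOUDFLARE_ACCOUNT_ID) {
  throw new Error(
    'Missing CLOUDFLARE_API_TOKEN or CLOUDFLARE_ACCOUNT_ID - add them to .env.local'
  );
}
```

For more help: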
- Check the CopilotKit docs
- Review Cloudflare Workers AI docs
- Open an issue on GitHub
MIT License - see LICENSE file for details.
Contributions welcome! Please read our contributing guidelines.
Built with ❤️ by developers who believe AI should be fast and affordable.