A community provider for the Vercel AI SDK that enables using Google's Gemini models through @google/gemini-cli-core and Google Cloud Code endpoints.
| Provider Version | AI SDK Version | NPM Tag | Branch |
|---|---|---|---|
| 2.x | v6 | latest | main |
| 1.x | v5 | ai-sdk-v5 | ai-sdk-v5 |
| 0.x | v4 | ai-sdk-v4 | ai-sdk-v4 |
```bash
# AI SDK v6 (default)
npm install ai-sdk-provider-gemini-cli ai

# AI SDK v5
npm install ai-sdk-provider-gemini-cli@ai-sdk-v5 ai@^5.0.0

# AI SDK v4
npm install ai-sdk-provider-gemini-cli@ai-sdk-v4 ai@^4.3.16
```

- Install and authenticate the Gemini CLI:
  ```bash
  npm install -g @google/gemini-cli
  gemini # Follow the interactive authentication setup
  ```

- Add the provider to your project:
  ```bash
  npm install ai-sdk-provider-gemini-cli ai
  ```

```ts
import { generateText } from 'ai';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({
  authType: 'oauth-personal',
});

const result = await generateText({
  model: gemini('gemini-3-pro-preview'),
  prompt: 'Write a haiku about coding',
});

console.log(result.text);
```

Uses credentials from `~/.gemini/oauth_creds.json` created by the Gemini CLI:
```ts
const gemini = createGeminiProvider({
  authType: 'oauth-personal',
});
```

Alternatively, authenticate with an API key:

```ts
const gemini = createGeminiProvider({
  authType: 'api-key',
  apiKey: process.env.GEMINI_API_KEY,
});
```

Get your API key from Google AI Studio.
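The two auth modes can be combined in a small startup check that prefers an API key and falls back to the CLI's OAuth credentials. This is a sketch, not part of the provider's API: `pickAuth` is a hypothetical helper name; the `authType` values and the `~/.gemini/oauth_creds.json` path are the ones documented above.

```typescript
import { existsSync } from 'node:fs';
import { homedir } from 'node:os';
import { join } from 'node:path';

// Hypothetical helper: choose provider options from available credentials.
// Prefers GEMINI_API_KEY; otherwise falls back to the Gemini CLI's OAuth file.
type AuthOptions =
  | { authType: 'api-key'; apiKey: string }
  | { authType: 'oauth-personal' };

function pickAuth(): AuthOptions {
  const apiKey = process.env.GEMINI_API_KEY;
  if (apiKey) return { authType: 'api-key', apiKey };
  if (existsSync(join(homedir(), '.gemini', 'oauth_creds.json'))) {
    return { authType: 'oauth-personal' };
  }
  throw new Error(
    'No Gemini credentials: set GEMINI_API_KEY or run `gemini` to authenticate.'
  );
}

// Usage: const gemini = createGeminiProvider(pickAuth());
```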
- `gemini-3-pro-preview` - Latest model with enhanced reasoning (Preview)
- `gemini-3-flash-preview` - Fast, efficient model (Preview)
- `gemini-2.5-pro` - Previous generation model (64K output tokens)
- `gemini-2.5-flash` - Previous generation fast model (64K output tokens)
- Streaming responses
- Tool/function calling
- Structured output with Zod schemas
- Multimodal support (text and base64 images)
- TypeScript support
- Configurable logging
```ts
const model = gemini('gemini-3-pro-preview', {
  temperature: 0.7,
  maxOutputTokens: 1000,
  topP: 0.95,
});
```

```ts
// Disable logging
const model = gemini('gemini-3-flash-preview', { logger: false });

// Enable verbose debug logging
const model = gemini('gemini-3-flash-preview', { verbose: true });

// Custom logger
const model = gemini('gemini-3-flash-preview', {
  logger: {
    debug: (msg) => myLogger.debug(msg),
    info: (msg) => myLogger.info(msg),
    warn: (msg) => myLogger.warn(msg),
    error: (msg) => myLogger.error(msg),
  },
});
```

See the `examples/` directory for comprehensive examples:
- `check-auth.mjs` - Verify authentication
- `basic-usage.mjs` - Text generation
- `streaming.mjs` - Streaming responses
- `generate-object-basic.mjs` - Structured output with Zod
- `tool-calling.mjs` - Function calling
```bash
npm run build
npm run example:check
npm run example:basic
```

- Provider interface: `ProviderV2` → `ProviderV3`
- Token usage: flat → hierarchical structure
- Warning format: `unsupported-setting` → `unsupported`
- Method rename: `textEmbeddingModel()` → `embeddingModel()`
- Finish reason: string → `{ unified, raw }` object
See CHANGELOG.md for details.
- Requires Node.js >= 20
- OAuth requires global Gemini CLI installation
- Image URLs not supported (use base64)
- Some parameters not supported: `frequencyPenalty`, `presencePenalty`, `seed`
- Abort signals work, but underlying requests continue in the background
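The abort-signal caveat above is worth wiring up explicitly. A sketch of a timeout wrapper, assuming you pass the signal via the AI SDK's `abortSignal` option; `withAbort` is a hypothetical helper, and note that aborting stops the call client-side while the request to Google still runs to completion.

```typescript
// Hypothetical helper: run an async task with a timeout-based AbortSignal.
// Aborting cancels the AI SDK call, but (per the limitation above) the
// underlying request to Google continues in the background.
async function withAbort<T>(
  task: (signal: AbortSignal) => Promise<T>,
  ms: number
): Promise<T> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await task(controller.signal);
  } finally {
    clearTimeout(timer);
  }
}

// Usage (sketch):
// const result = await withAbort(
//   (signal) => generateText({
//     model: gemini('gemini-3-flash-preview'),
//     prompt: 'Summarize the changelog',
//     abortSignal: signal,
//   }),
//   10_000
// );
```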
This is an unofficial community provider, not affiliated with Google or Vercel. Your data is sent to Google's servers. See Google's Terms of Service.
MIT