diff --git a/.changeset/lovely-weeks-sip.md b/.changeset/lovely-weeks-sip.md
new file mode 100644
index 000000000000..b3516522e426
--- /dev/null
+++ b/.changeset/lovely-weeks-sip.md
@@ -0,0 +1,5 @@
+---
+'@ai-sdk/perplexity': patch
+---
+
+feat (provider/perplexity): add Perplexity provider
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 9858db97304d..72f67577a343 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -20,6 +20,7 @@ You can find the changelogs for the individual packages in their respective `CHA
- [@ai-sdk/mistral](./packages/mistral/CHANGELOG.md)
- [@ai-sdk/openai](./packages/openai/CHANGELOG.md)
- [@ai-sdk/openai-compatible](./packages/openai-compatible/CHANGELOG.md)
+- [@ai-sdk/perplexity](./packages/perplexity/CHANGELOG.md)
- [@ai-sdk/togetherai](./packages/togetherai/CHANGELOG.md)
- [@ai-sdk/xai](./packages/xai/CHANGELOG.md)
diff --git a/content/docs/02-foundations/02-providers-and-models.mdx b/content/docs/02-foundations/02-providers-and-models.mdx
index 1be1e2b84c7e..001eee6af994 100644
--- a/content/docs/02-foundations/02-providers-and-models.mdx
+++ b/content/docs/02-foundations/02-providers-and-models.mdx
@@ -22,7 +22,7 @@ Here is an overview of the AI SDK Provider Architecture:
## AI SDK Providers
-The AI SDK comes with several providers that you can use to interact with different language models:
+The AI SDK comes with a wide range of providers that you can use to interact with different language models:
- [OpenAI Provider](/providers/ai-sdk-providers/openai) (`@ai-sdk/openai`)
- [Azure OpenAI Provider](/providers/ai-sdk-providers/azure) (`@ai-sdk/azure`)
@@ -39,10 +39,10 @@ The AI SDK comes with several providers that you can use to interact with differ
- [DeepSeek Provider](/providers/ai-sdk-providers/deepseek) (`@ai-sdk/deepseek`)
- [Cerebras Provider](/providers/ai-sdk-providers/cerebras) (`@ai-sdk/cerebras`)
- [Groq Provider](/providers/ai-sdk-providers/groq) (`@ai-sdk/groq`)
+- [Perplexity Provider](/providers/ai-sdk-providers/perplexity) (`@ai-sdk/perplexity`)
-You can also use the OpenAI provider with OpenAI-compatible APIs:
+You can also use the [OpenAI Compatible provider](/providers/openai-compatible-providers) with OpenAI-compatible APIs:
-- [Perplexity](/providers/ai-sdk-providers/perplexity)
- [LM Studio](/providers/openai-compatible-providers/lmstudio)
- [Baseten](/providers/openai-compatible-providers/baseten)
diff --git a/content/docs/02-guides/03-llama-3_1.mdx b/content/docs/02-guides/03-llama-3_1.mdx
index 62dd10bed75c..d67edf318edb 100644
--- a/content/docs/02-guides/03-llama-3_1.mdx
+++ b/content/docs/02-guides/03-llama-3_1.mdx
@@ -53,7 +53,6 @@ const { text } = await generateText({
Llama 3.1 is available to use with many AI SDK providers including
[Groq](/providers/ai-sdk-providers/groq), [Amazon
Bedrock](/providers/ai-sdk-providers/amazon-bedrock),
- [Perplexity](/providers/ai-sdk-providers/perplexity),
[Baseten](/providers/openai-compatible-providers/baseten)
[Fireworks](/providers/ai-sdk-providers/fireworks), and more.
diff --git a/content/providers/01-ai-sdk-providers/70-perplexity.mdx b/content/providers/01-ai-sdk-providers/70-perplexity.mdx
new file mode 100644
index 000000000000..f05cdbdd67c0
--- /dev/null
+++ b/content/providers/01-ai-sdk-providers/70-perplexity.mdx
@@ -0,0 +1,134 @@
+---
+title: Perplexity
+description: Learn how to use Perplexity's Sonar API with the AI SDK.
+---
+
+# Perplexity Provider
+
+The [Perplexity](https://sonar.perplexity.ai) provider offers access to the Sonar API - a language model that combines real-time web search with natural language processing. Each response is grounded in current web data and includes detailed citations, making it well suited for research, fact-checking, and obtaining up-to-date information.
+
+API keys can be obtained from the [Perplexity Platform](https://docs.perplexity.ai).
+
+## Setup
+
+The Perplexity provider is available via the `@ai-sdk/perplexity` module. You can install it with `npm i @ai-sdk/perplexity`.
+
+## Provider Instance
+
+You can import the default provider instance `perplexity` from `@ai-sdk/perplexity`:
+
+```ts
+import { perplexity } from '@ai-sdk/perplexity';
+```
+
+For custom configuration, you can import `createPerplexity` and create a provider instance with your settings:
+
+```ts
+import { createPerplexity } from '@ai-sdk/perplexity';
+
+const perplexity = createPerplexity({
+ apiKey: process.env.PERPLEXITY_API_KEY ?? '',
+});
+```
+
+You can use the following optional settings to customize the Perplexity provider instance:
+
+- **baseURL** _string_
+
+ Use a different URL prefix for API calls.
+ The default prefix is `https://api.perplexity.ai`.
+
+- **apiKey** _string_
+
+ API key that is being sent using the `Authorization` header. It defaults to
+ the `PERPLEXITY_API_KEY` environment variable.
+
+- **headers** _Record<string,string>_
+
+ Custom headers to include in the requests.
+
+- **fetch** _(input: RequestInfo, init?: RequestInit) => Promise<Response>_
+
+ Custom [fetch](https://developer.mozilla.org/en-US/docs/Web/API/fetch) implementation.
+
+## Language Models
+
+You can create Perplexity models using a provider instance:
+
+```ts
+import { perplexity } from '@ai-sdk/perplexity';
+import { generateText } from 'ai';
+
+const { text } = await generateText({
+ model: perplexity('sonar-pro'),
+ prompt: 'What are the latest developments in quantum computing?',
+});
+```
+
+Perplexity models can be used in the `streamText` and `streamUI` functions (see
+[AI SDK Core](/docs/ai-sdk-core) and [AI SDK RSC](/docs/ai-sdk-rsc)).
+
+### Provider Metadata
+
+The Perplexity provider includes additional experimental metadata in the response through `experimental_providerMetadata`:
+
+```ts
+const result = await generateText({
+ model: perplexity('sonar-pro'),
+ prompt: 'What are the latest developments in quantum computing?',
+});
+
+console.log(result.experimental_providerMetadata);
+// Example output:
+// {
+// perplexity: {
+// citations: [
+// 'https://www.sfchronicle.com',
+// 'https://www.cbsnews.com/sanfrancisco/',
+// ],
+// usage: { citationTokens: 5286, numSearchQueries: 1 },
+// },
+// }
+```
+
+The metadata includes:
+
+- `citations`: Array of URLs used as sources for the response
+- `usage`: Object containing `citationTokens` and `numSearchQueries` metrics
+
+<Note>
+  For more details about Perplexity's citations see the [Perplexity chat
+  completion docs](https://docs.perplexity.ai/api-reference/chat-completions).
+</Note>
+
+## Model Capabilities
+
+| Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
+| ----------- | ------------------- | ------------------- | ------------------- | ------------------- |
+| `sonar-pro` | | | | |
+| `sonar` | | | | |
+
+### Key Features
+
+- **Real-time Web Search**: Both models provide grounded responses using real-time web search
+- **Citations**: Sonar Pro provides 2x more citations than standard Sonar
+- **Data Privacy**: No training on customer data
+- **Self-serve API**: Immediate access with scalable pricing
+- **Advanced Queries**: Support for complex queries and follow-up questions
+
+<Note>
+  Please see the [Perplexity docs](https://docs.perplexity.ai) for detailed API
+  documentation and the latest updates.
+</Note>
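The settings documented above resolve in a small, predictable chain: an explicit `apiKey` option wins, otherwise the `PERPLEXITY_API_KEY` environment variable is read, and the resulting key is sent as a bearer `Authorization` header merged with any custom `headers`. The sketch below illustrates that resolution order; it is dependency-free and the helper names (`resolveApiKey`, `resolveHeaders`) are illustrative, not part of the package's API.

```typescript
// Illustrative sketch of credential/header resolution; not the package's API.
interface SketchSettings {
  apiKey?: string;
  headers?: Record<string, string>;
}

function resolveApiKey(
  explicit: string | undefined,
  env: Record<string, string | undefined>,
): string {
  // An explicit `apiKey` option wins; otherwise fall back to the env var.
  const key = explicit ?? env['PERPLEXITY_API_KEY'];
  if (key == null) {
    throw new Error('Perplexity API key is missing.');
  }
  return key;
}

function resolveHeaders(
  settings: SketchSettings,
  env: Record<string, string | undefined>,
): Record<string, string> {
  return {
    Authorization: `Bearer ${resolveApiKey(settings.apiKey, env)}`,
    ...settings.headers, // custom headers are merged on top
  };
}

const headers = resolveHeaders(
  { headers: { 'X-Trace': 'docs-example' } },
  { PERPLEXITY_API_KEY: 'pplx-env-key' },
);
console.log(headers.Authorization); // "Bearer pplx-env-key"
```

Passing `apiKey: ''` still counts as explicit here; the real provider's `loadApiKey` helper may handle edge cases differently.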
diff --git a/content/providers/02-openai-compatible-providers/10-perplexity.mdx b/content/providers/02-openai-compatible-providers/10-perplexity.mdx
deleted file mode 100644
index 5c5d7120a6c2..000000000000
--- a/content/providers/02-openai-compatible-providers/10-perplexity.mdx
+++ /dev/null
@@ -1,71 +0,0 @@
----
-title: Perplexity
-description: Use the Perplexity OpenAI compatible API with the AI SDK.
----
-
-# Perplexity Provider
-
-[Perplexity](https://docs.perplexity.ai/) is a search engine that uses LLMs to answer questions.
-It offers an OpenAI compatible API that you can use with the AI SDK.
-
-## Setup
-
-The Perplexity provider is available via the `@ai-sdk/openai-compatible` module as it is compatible with the OpenAI API.
-You can install it with
-
-
-
-
-
-
-
-
-
-
-
-
-
-## Provider Instance
-
-To use Perplexity, you can create a custom provider instance with the `createOpenAI` function from `@ai-sdk/openai`:
-
-```ts
-import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
-
-const perplexity = createOpenAICompatible({
- name: 'perplexity',
- apiKey: process.env.PERPLEXITY_API_KEY,
- baseURL: 'https://api.perplexity.ai/',
-});
-```
-
-## Language Models
-
-You can create [Perplexity models](https://docs.perplexity.ai/docs/model-cards) using a provider instance.
-The first argument is the model id, e.g. `llama-3.1-sonar-large-32k-online`.
-
-```ts
-const model = perplexity('llama-3.1-sonar-large-32k-online');
-```
-
-### Example
-
-You can use Perplexity language models to generate text with the `generateText` function:
-
-```ts
-import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
-import { generateText } from 'ai';
-
-const perplexity = createOpenAICompatible({
- name: 'perplexity',
- apiKey: process.env.PERPLEXITY_API_KEY,
- baseURL: 'https://api.perplexity.ai/',
-});
-
-const { text } = await generateText({
- model: perplexity('llama-3.1-sonar-small-128k-online'),
- prompt: 'Write a vegetarian lasagna recipe for 4 people.',
-});
-```
-
-Perplexity language models can also be used in the `streamText` function.
diff --git a/content/providers/02-openai-compatible-providers/index.mdx b/content/providers/02-openai-compatible-providers/index.mdx
index 15a5e4990c37..03de2071883a 100644
--- a/content/providers/02-openai-compatible-providers/index.mdx
+++ b/content/providers/02-openai-compatible-providers/index.mdx
@@ -11,7 +11,6 @@ Below we focus on the general setup and provider instance creation. You can also
We provide detailed documentation for the following OpenAI compatible providers:
-- [Perplexity](/providers/openai-compatible-providers/perplexity)
- [LM Studio](/providers/openai-compatible-providers/lmstudio)
- [Baseten](/providers/openai-compatible-providers/baseten)
diff --git a/examples/ai-core/package.json b/examples/ai-core/package.json
index e1f84efb1243..86b6882981f5 100644
--- a/examples/ai-core/package.json
+++ b/examples/ai-core/package.json
@@ -17,6 +17,7 @@
"@ai-sdk/mistral": "1.1.2",
"@ai-sdk/openai": "1.1.2",
"@ai-sdk/openai-compatible": "0.1.2",
+ "@ai-sdk/perplexity": "0.0.0",
"@ai-sdk/provider": "1.0.6",
"@ai-sdk/replicate": "0.1.2",
"@ai-sdk/togetherai": "0.1.2",
diff --git a/examples/ai-core/src/e2e/perplexity.test.ts b/examples/ai-core/src/e2e/perplexity.test.ts
new file mode 100644
index 000000000000..f9cee0ba6a6a
--- /dev/null
+++ b/examples/ai-core/src/e2e/perplexity.test.ts
@@ -0,0 +1,30 @@
+import 'dotenv/config';
+import { expect } from 'vitest';
+import {
+ perplexity as provider,
+ PerplexityErrorData,
+} from '@ai-sdk/perplexity';
+import {
+ createFeatureTestSuite,
+ createLanguageModelWithCapabilities,
+} from './feature-test-suite';
+import { APICallError } from '@ai-sdk/provider';
+
+const createChatModel = (modelId: string) =>
+ createLanguageModelWithCapabilities(provider.chat(modelId));
+
+createFeatureTestSuite({
+ name: 'perplexity',
+ models: {
+ invalidModel: provider.chat('no-such-model'),
+ languageModels: [createChatModel('sonar-pro'), createChatModel('sonar')],
+ },
+ timeout: 30000,
+ customAssertions: {
+ errorValidator: (error: APICallError) => {
+ expect((error.data as PerplexityErrorData).code).toBe(
+ 'Some requested entity was not found',
+ );
+ },
+ },
+})();
diff --git a/examples/ai-core/src/generate-text/perplexity.ts b/examples/ai-core/src/generate-text/perplexity.ts
new file mode 100644
index 000000000000..4ad67ec338c9
--- /dev/null
+++ b/examples/ai-core/src/generate-text/perplexity.ts
@@ -0,0 +1,18 @@
+import 'dotenv/config';
+import { perplexity } from '@ai-sdk/perplexity';
+import { generateText } from 'ai';
+
+async function main() {
+ const result = await generateText({
+ model: perplexity('sonar-pro'),
+ prompt: 'Invent a new holiday and describe its traditions.',
+ });
+
+ console.log(result.text);
+ console.log();
+ console.log('Token usage:', result.usage);
+ console.log('Finish reason:', result.finishReason);
+ console.log('Metadata:', result.experimental_providerMetadata);
+}
+
+main().catch(console.error);
diff --git a/examples/ai-core/src/stream-text/perplexity.ts b/examples/ai-core/src/stream-text/perplexity.ts
index e008497bba41..3f1f489219e2 100644
--- a/examples/ai-core/src/stream-text/perplexity.ts
+++ b/examples/ai-core/src/stream-text/perplexity.ts
@@ -1,18 +1,10 @@
-import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
+import { perplexity } from '@ai-sdk/perplexity';
import { streamText } from 'ai';
import 'dotenv/config';
-const perplexity = createOpenAICompatible({
- name: 'perplexity',
- headers: {
- Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY ?? ''}`,
- },
- baseURL: 'https://api.perplexity.ai/',
-});
-
async function main() {
const result = streamText({
- model: perplexity('llama-3.1-sonar-small-128k-online'),
+ model: perplexity('sonar-pro'),
prompt:
'List the top 5 San Francisco news from the past week.' +
'You must include the date of each article.',
@@ -25,6 +17,7 @@ async function main() {
console.log();
console.log('Token usage:', await result.usage);
console.log('Finish reason:', await result.finishReason);
+ console.log('Metadata:', await result.experimental_providerMetadata);
}
main().catch(console.error);
diff --git a/packages/perplexity/CHANGELOG.md b/packages/perplexity/CHANGELOG.md
new file mode 100644
index 000000000000..c83ca48a6a45
--- /dev/null
+++ b/packages/perplexity/CHANGELOG.md
@@ -0,0 +1 @@
+# @ai-sdk/perplexity
diff --git a/packages/perplexity/README.md b/packages/perplexity/README.md
new file mode 100644
index 000000000000..6bc00cd37666
--- /dev/null
+++ b/packages/perplexity/README.md
@@ -0,0 +1,47 @@
+# AI SDK - Perplexity Provider
+
+The **[Perplexity provider](https://sdk.vercel.ai/providers/ai-sdk-providers/perplexity)** for the [AI SDK](https://sdk.vercel.ai/docs)
+contains language model support for Perplexity's Sonar API - a powerful answer engine with real-time web search capabilities.
+
+## Features
+
+- Real-time web search grounding for accurate, up-to-date responses
+- Support for advanced queries and follow-up questions
+- Multiple tiers available:
+ - **Sonar Pro**: Enhanced capabilities for complex tasks with 2x more citations
+ - **Sonar**: Lightweight offering optimized for speed and cost
+- Industry-leading answer quality
+- Data privacy - no training on customer data
+- Self-serve API access with scalable pricing
+
+## Setup
+
+The Perplexity provider is available in the `@ai-sdk/perplexity` module. You can install it with:
+
+```bash
+npm i @ai-sdk/perplexity
+```
+
+## Provider Instance
+
+You can import the default provider instance `perplexity` from `@ai-sdk/perplexity`:
+
+```ts
+import { perplexity } from '@ai-sdk/perplexity';
+```
+
+## Example
+
+```ts
+import { perplexity } from '@ai-sdk/perplexity';
+import { generateText } from 'ai';
+
+const { text } = await generateText({
+ model: perplexity('sonar-pro'),
+ prompt: 'What are the latest developments in quantum computing?',
+});
+```
+
+## Documentation
+
+Please check out the **[Perplexity provider documentation](https://sdk.vercel.ai/providers/ai-sdk-providers/perplexity)** for more information.
diff --git a/packages/perplexity/package.json b/packages/perplexity/package.json
new file mode 100644
index 000000000000..c9b74c0df8fa
--- /dev/null
+++ b/packages/perplexity/package.json
@@ -0,0 +1,64 @@
+{
+ "name": "@ai-sdk/perplexity",
+ "version": "0.0.0",
+ "license": "Apache-2.0",
+ "sideEffects": false,
+ "main": "./dist/index.js",
+ "module": "./dist/index.mjs",
+ "types": "./dist/index.d.ts",
+ "files": [
+ "dist/**/*",
+ "CHANGELOG.md"
+ ],
+ "scripts": {
+ "build": "tsup",
+ "build:watch": "tsup --watch",
+ "clean": "rm -rf dist",
+ "lint": "eslint \"./**/*.ts*\"",
+ "type-check": "tsc --noEmit",
+ "prettier-check": "prettier --check \"./**/*.ts*\"",
+ "test": "pnpm test:node && pnpm test:edge",
+ "test:edge": "vitest --config vitest.edge.config.js --run",
+ "test:node": "vitest --config vitest.node.config.js --run"
+ },
+ "exports": {
+ "./package.json": "./package.json",
+ ".": {
+ "types": "./dist/index.d.ts",
+ "import": "./dist/index.mjs",
+ "require": "./dist/index.js"
+ }
+ },
+ "dependencies": {
+ "@ai-sdk/openai-compatible": "0.1.2",
+ "@ai-sdk/provider": "1.0.6",
+ "@ai-sdk/provider-utils": "2.1.2"
+ },
+ "devDependencies": {
+ "@types/node": "^18",
+ "@vercel/ai-tsconfig": "workspace:*",
+ "tsup": "^8",
+ "typescript": "5.6.3",
+ "zod": "3.23.8"
+ },
+ "peerDependencies": {
+ "zod": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=18"
+ },
+ "publishConfig": {
+ "access": "public"
+ },
+ "homepage": "https://sdk.vercel.ai/docs",
+ "repository": {
+ "type": "git",
+ "url": "git+https://github.com/vercel/ai.git"
+ },
+ "bugs": {
+ "url": "https://github.com/vercel/ai/issues"
+ },
+ "keywords": [
+ "ai"
+ ]
+}
diff --git a/packages/perplexity/src/index.ts b/packages/perplexity/src/index.ts
new file mode 100644
index 000000000000..3a3af86fc98e
--- /dev/null
+++ b/packages/perplexity/src/index.ts
@@ -0,0 +1,6 @@
+export { createPerplexity, perplexity } from './perplexity-provider';
+export type {
+ PerplexityErrorData,
+ PerplexityProvider,
+ PerplexityProviderSettings,
+} from './perplexity-provider';
diff --git a/packages/perplexity/src/perplexity-chat-settings.ts b/packages/perplexity/src/perplexity-chat-settings.ts
new file mode 100644
index 000000000000..a6dc46254f35
--- /dev/null
+++ b/packages/perplexity/src/perplexity-chat-settings.ts
@@ -0,0 +1,6 @@
+import { OpenAICompatibleChatSettings } from '@ai-sdk/openai-compatible';
+
+// https://docs.perplexity.ai/guides/model-cards
+export type PerplexityChatModelId = 'sonar-pro' | 'sonar' | (string & {});
+
+export interface PerplexityChatSettings extends OpenAICompatibleChatSettings {}
diff --git a/packages/perplexity/src/perplexity-metadata-extractor.test.ts b/packages/perplexity/src/perplexity-metadata-extractor.test.ts
new file mode 100644
index 000000000000..89e699080d49
--- /dev/null
+++ b/packages/perplexity/src/perplexity-metadata-extractor.test.ts
@@ -0,0 +1,199 @@
+import { perplexityMetadataExtractor } from './perplexity-metadata-extractor';
+
+describe('buildMetadataFromResponse', () => {
+ it('should extract metadata from complete response with citations and usage', () => {
+ const response = {
+ citations: ['source1', 'source2'],
+ usage: {
+ citation_tokens: 100,
+ num_search_queries: 5,
+ },
+ };
+
+ const metadata = perplexityMetadataExtractor.extractMetadata({
+ parsedBody: response,
+ });
+
+ expect(metadata).toEqual({
+ perplexity: {
+ citations: ['source1', 'source2'],
+ usage: {
+ citationTokens: 100,
+ numSearchQueries: 5,
+ },
+ },
+ });
+ });
+
+ it('should extract metadata with only citations', () => {
+ const response = {
+ citations: ['source1', 'source2'],
+ };
+
+ const metadata = perplexityMetadataExtractor.extractMetadata({
+ parsedBody: response,
+ });
+
+ expect(metadata).toEqual({
+ perplexity: {
+ citations: ['source1', 'source2'],
+ },
+ });
+ });
+
+ it('should extract metadata with only usage', () => {
+ const response = {
+ usage: {
+ citation_tokens: 100,
+ num_search_queries: 5,
+ },
+ };
+
+ const metadata = perplexityMetadataExtractor.extractMetadata({
+ parsedBody: response,
+ });
+
+ expect(metadata).toEqual({
+ perplexity: {
+ usage: {
+ citationTokens: 100,
+ numSearchQueries: 5,
+ },
+ },
+ });
+ });
+
+ it('should handle missing metadata', () => {
+ const response = {
+ id: 'test-id',
+ choices: [],
+ };
+
+ const metadata = perplexityMetadataExtractor.extractMetadata({
+ parsedBody: response,
+ });
+
+ expect(metadata).toBeUndefined();
+ });
+
+ it('should handle invalid response data', () => {
+ const response = 'invalid data';
+
+ const metadata = perplexityMetadataExtractor.extractMetadata({
+ parsedBody: response,
+ });
+
+ expect(metadata).toBeUndefined();
+ });
+});
+
+describe('streaming metadata processor', () => {
+ it('should process streaming chunks and build final metadata', () => {
+ const processor = perplexityMetadataExtractor.createStreamExtractor();
+
+ // Process chunk with citations
+ processor.processChunk({
+ choices: [{ delta: { role: 'assistant', content: 'content' } }],
+ citations: ['source1', 'source2'],
+ });
+
+ // Process chunk with usage
+ processor.processChunk({
+ choices: [{ delta: { role: 'assistant', content: 'content' } }],
+ usage: {
+ citation_tokens: 100,
+ num_search_queries: 5,
+ },
+ });
+
+ const finalMetadata = processor.buildMetadata();
+
+ expect(finalMetadata).toEqual({
+ perplexity: {
+ citations: ['source1', 'source2'],
+ usage: {
+ citationTokens: 100,
+ numSearchQueries: 5,
+ },
+ },
+ });
+ });
+
+ it('should update metadata with latest chunk data', () => {
+ const processor = perplexityMetadataExtractor.createStreamExtractor();
+
+ // Process initial chunk
+ processor.processChunk({
+ citations: ['source1'],
+ usage: {
+ citation_tokens: 50,
+ num_search_queries: 2,
+ },
+ });
+
+ // Process chunk with updated data
+ processor.processChunk({
+ citations: ['source1', 'source2'],
+ usage: {
+ citation_tokens: 100,
+ num_search_queries: 5,
+ },
+ });
+
+ const finalMetadata = processor.buildMetadata();
+
+ expect(finalMetadata).toEqual({
+ perplexity: {
+ citations: ['source1', 'source2'],
+ usage: {
+ citationTokens: 100,
+ numSearchQueries: 5,
+ },
+ },
+ });
+ });
+
+ it('should handle streaming chunks without metadata', () => {
+ const processor = perplexityMetadataExtractor.createStreamExtractor();
+
+ processor.processChunk({
+ choices: [{ delta: { role: 'assistant', content: 'content' } }],
+ });
+
+ const finalMetadata = processor.buildMetadata();
+
+ expect(finalMetadata).toBeUndefined();
+ });
+
+ it('should handle invalid streaming chunks', () => {
+ const processor = perplexityMetadataExtractor.createStreamExtractor();
+
+ processor.processChunk('invalid chunk');
+
+ const finalMetadata = processor.buildMetadata();
+
+ expect(finalMetadata).toBeUndefined();
+ });
+
+ it('should handle null values in usage data', () => {
+ const processor = perplexityMetadataExtractor.createStreamExtractor();
+
+ processor.processChunk({
+ usage: {
+ citation_tokens: null,
+ num_search_queries: null,
+ },
+ });
+
+ const finalMetadata = processor.buildMetadata();
+
+ expect(finalMetadata).toEqual({
+ perplexity: {
+ usage: {
+ citationTokens: NaN,
+ numSearchQueries: NaN,
+ },
+ },
+ });
+ });
+});
diff --git a/packages/perplexity/src/perplexity-metadata-extractor.ts b/packages/perplexity/src/perplexity-metadata-extractor.ts
new file mode 100644
index 000000000000..634490654764
--- /dev/null
+++ b/packages/perplexity/src/perplexity-metadata-extractor.ts
@@ -0,0 +1,96 @@
+import { MetadataExtractor } from '@ai-sdk/openai-compatible';
+import { safeValidateTypes } from '@ai-sdk/provider-utils';
+import { z } from 'zod';
+
+const buildPerplexityMetadata = (
+ citations: string[] | undefined,
+  usage: z.infer<typeof perplexityUsageSchema> | undefined,
+) => {
+ return citations || usage
+ ? {
+ perplexity: {
+ ...(citations && { citations }),
+ ...(usage && {
+ usage: {
+ citationTokens: usage.citation_tokens ?? NaN,
+ numSearchQueries: usage.num_search_queries ?? NaN,
+ },
+ }),
+ },
+ }
+ : undefined;
+};
+
+export const perplexityMetadataExtractor: MetadataExtractor = {
+ extractMetadata: ({ parsedBody }: { parsedBody: unknown }) => {
+ const parsed = safeValidateTypes({
+ value: parsedBody,
+ schema: perplexityResponseSchema,
+ });
+
+ return !parsed.success
+ ? undefined
+ : buildPerplexityMetadata(
+ parsed.value.citations ?? undefined,
+ parsed.value.usage ?? undefined,
+ );
+ },
+
+ createStreamExtractor: () => {
+ let citations: string[] | undefined;
+    let usage: z.infer<typeof perplexityUsageSchema> | undefined;
+
+ return {
+ processChunk: (chunk: unknown) => {
+ const parsed = safeValidateTypes({
+ value: chunk,
+ schema: perplexityStreamChunkSchema,
+ });
+
+ if (parsed.success) {
+ // Update citations and usage with latest data from each chunk
+ if (parsed.value.citations) {
+ citations = parsed.value.citations;
+ }
+ if (parsed.value.usage) {
+ usage = parsed.value.usage;
+ }
+ }
+ },
+ buildMetadata: () => buildPerplexityMetadata(citations, usage),
+ };
+ },
+};
+
+// Schema for citations
+const perplexityCitationSchema = z.array(z.string());
+
+const perplexityUsageSchema = z.object({
+ citation_tokens: z.number().nullish(),
+ num_search_queries: z.number().nullish(),
+});
+
+// Update response schema to include usage
+const perplexityResponseSchema = z.object({
+ citations: perplexityCitationSchema.nullish(),
+ usage: perplexityUsageSchema.nullish(),
+});
+
+// Update stream chunk schema to match example format
+const perplexityStreamChunkSchema = z.object({
+ choices: z
+ .array(
+ z.object({
+ finish_reason: z.string().nullish(),
+ delta: z
+ .object({
+ role: z.string(),
+ content: z.string(),
+ })
+ .nullish(),
+ }),
+ )
+ .nullish(),
+ citations: perplexityCitationSchema.nullish(),
+ usage: perplexityUsageSchema.nullish(),
+});
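The stream extractor above keeps only the latest `citations` and `usage` values seen across chunks ("latest wins") and surfaces null counters as `NaN` when building the final metadata. A dependency-free sketch of those merge semantics, without the zod validation the real extractor performs:

```typescript
// Sketch of the stream extractor's merge semantics (no schema validation).
type SketchUsage = {
  citation_tokens?: number | null;
  num_search_queries?: number | null;
};
type SketchChunk = { citations?: string[] | null; usage?: SketchUsage | null };

function createSketchExtractor() {
  let citations: string[] | undefined;
  let usage: SketchUsage | undefined;
  return {
    processChunk(chunk: SketchChunk) {
      // Each chunk overwrites previously seen values.
      if (chunk.citations) citations = chunk.citations;
      if (chunk.usage) usage = chunk.usage;
    },
    buildMetadata() {
      if (!citations && !usage) return undefined;
      return {
        perplexity: {
          ...(citations && { citations }),
          ...(usage && {
            usage: {
              // snake_case wire format mapped to camelCase, nulls become NaN
              citationTokens: usage.citation_tokens ?? NaN,
              numSearchQueries: usage.num_search_queries ?? NaN,
            },
          }),
        },
      };
    },
  };
}

const extractor = createSketchExtractor();
extractor.processChunk({ citations: ['source1'] });
extractor.processChunk({
  citations: ['source1', 'source2'],
  usage: { citation_tokens: 100, num_search_queries: 5 },
});
console.log(JSON.stringify(extractor.buildMetadata()));
```

This mirrors why the test suite expects `NaN` for null usage counters: the `?? NaN` fallback is applied at metadata-build time, not while accumulating chunks.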
diff --git a/packages/perplexity/src/perplexity-provider.test.ts b/packages/perplexity/src/perplexity-provider.test.ts
new file mode 100644
index 000000000000..e96e9d060f24
--- /dev/null
+++ b/packages/perplexity/src/perplexity-provider.test.ts
@@ -0,0 +1,84 @@
+import { describe, it, expect, vi, beforeEach, Mock } from 'vitest';
+import { createPerplexity } from './perplexity-provider';
+import { loadApiKey } from '@ai-sdk/provider-utils';
+import { OpenAICompatibleChatLanguageModel } from '@ai-sdk/openai-compatible';
+
+const OpenAICompatibleChatLanguageModelMock =
+ OpenAICompatibleChatLanguageModel as unknown as Mock;
+
+vi.mock('@ai-sdk/openai-compatible', () => ({
+ OpenAICompatibleChatLanguageModel: vi.fn(),
+ OpenAICompatibleCompletionLanguageModel: vi.fn(),
+ OpenAICompatibleEmbeddingModel: vi.fn(),
+}));
+
+vi.mock('@ai-sdk/provider-utils', () => ({
+ loadApiKey: vi.fn().mockReturnValue('mock-api-key'),
+ withoutTrailingSlash: vi.fn(url => url),
+}));
+
+describe('perplexityProvider', () => {
+ beforeEach(() => {
+ vi.clearAllMocks();
+ });
+
+ describe('createPerplexity', () => {
+ it('should create a PerplexityProvider instance with default options', () => {
+ const provider = createPerplexity();
+ const model = provider('model-id');
+
+ const constructorCall =
+ OpenAICompatibleChatLanguageModelMock.mock.calls[0];
+ const config = constructorCall[2];
+ config.headers();
+
+ expect(loadApiKey).toHaveBeenCalledWith({
+ apiKey: undefined,
+ environmentVariableName: 'PERPLEXITY_API_KEY',
+ description: 'Perplexity',
+ });
+ });
+
+ it('should create a PerplexityProvider instance with custom options', () => {
+ const options = {
+ apiKey: 'custom-key',
+ baseURL: 'https://custom.url',
+ headers: { 'Custom-Header': 'value' },
+ };
+ const provider = createPerplexity(options);
+ provider('model-id');
+
+ const constructorCall =
+ OpenAICompatibleChatLanguageModelMock.mock.calls[0];
+ const config = constructorCall[2];
+ config.headers();
+
+ expect(loadApiKey).toHaveBeenCalledWith({
+ apiKey: 'custom-key',
+ environmentVariableName: 'PERPLEXITY_API_KEY',
+ description: 'Perplexity',
+ });
+ });
+
+ it('should return a chat model when called as a function', () => {
+ const provider = createPerplexity();
+ const modelId = 'foo-model-id';
+ const settings = { user: 'foo-user' };
+
+ const model = provider(modelId, settings);
+ expect(model).toBeInstanceOf(OpenAICompatibleChatLanguageModel);
+ });
+ });
+
+ describe('chatModel', () => {
+ it('should construct a chat model with correct configuration', () => {
+ const provider = createPerplexity();
+ const modelId = 'perplexity-chat-model';
+ const settings = { user: 'foo-user' };
+
+ const model = provider.chat(modelId, settings);
+
+ expect(model).toBeInstanceOf(OpenAICompatibleChatLanguageModel);
+ });
+ });
+});
diff --git a/packages/perplexity/src/perplexity-provider.ts b/packages/perplexity/src/perplexity-provider.ts
new file mode 100644
index 000000000000..d4397248776f
--- /dev/null
+++ b/packages/perplexity/src/perplexity-provider.ts
@@ -0,0 +1,127 @@
+import {
+ LanguageModelV1,
+ NoSuchModelError,
+ ProviderV1,
+} from '@ai-sdk/provider';
+import { OpenAICompatibleChatLanguageModel } from '@ai-sdk/openai-compatible';
+import {
+ FetchFunction,
+ loadApiKey,
+ withoutTrailingSlash,
+} from '@ai-sdk/provider-utils';
+import {
+ PerplexityChatModelId,
+ PerplexityChatSettings,
+} from './perplexity-chat-settings';
+import { z } from 'zod';
+import { ProviderErrorStructure } from '@ai-sdk/openai-compatible';
+import { perplexityMetadataExtractor } from './perplexity-metadata-extractor';
+
+// Add error schema and structure
+const perplexityErrorSchema = z.object({
+ code: z.string(),
+ error: z.string(),
+});
+
+export type PerplexityErrorData = z.infer<typeof perplexityErrorSchema>;
+
+const perplexityErrorStructure: ProviderErrorStructure<PerplexityErrorData> = {
+ errorSchema: perplexityErrorSchema,
+ errorToMessage: data => data.error,
+};
+
+export interface PerplexityProvider extends ProviderV1 {
+ /**
+Creates a Perplexity chat model for text generation.
+ */
+ (
+ modelId: PerplexityChatModelId,
+ settings?: PerplexityChatSettings,
+ ): LanguageModelV1;
+
+ /**
+Creates a Perplexity language model for text generation.
+ */
+ languageModel(
+ modelId: PerplexityChatModelId,
+ settings?: PerplexityChatSettings,
+ ): LanguageModelV1;
+
+ /**
+Creates a Perplexity chat model for text generation.
+ */
+ chat: (
+ modelId: PerplexityChatModelId,
+ settings?: PerplexityChatSettings,
+ ) => LanguageModelV1;
+}
+
+export interface PerplexityProviderSettings {
+ /**
+Base URL for the Perplexity API calls.
+ */
+ baseURL?: string;
+
+ /**
+API key for authenticating requests.
+ */
+ apiKey?: string;
+
+ /**
+Custom headers to include in the requests.
+ */
+  headers?: Record<string, string>;
+
+ /**
+Custom fetch implementation. You can use it as a middleware to intercept requests,
+or to provide a custom fetch implementation for e.g. testing.
+ */
+ fetch?: FetchFunction;
+}
+
+export function createPerplexity(
+ options: PerplexityProviderSettings = {},
+): PerplexityProvider {
+ const baseURL = withoutTrailingSlash(
+ options.baseURL ?? 'https://api.perplexity.ai',
+ );
+ const getHeaders = () => ({
+ Authorization: `Bearer ${loadApiKey({
+ apiKey: options.apiKey,
+ environmentVariableName: 'PERPLEXITY_API_KEY',
+ description: 'Perplexity',
+ })}`,
+ ...options.headers,
+ });
+
+ const createLanguageModel = (
+ modelId: PerplexityChatModelId,
+ settings: PerplexityChatSettings = {},
+ ) => {
+ return new OpenAICompatibleChatLanguageModel(modelId, settings, {
+ provider: 'perplexity.chat',
+ url: ({ path }) => `${baseURL}${path}`,
+ headers: getHeaders,
+ fetch: options.fetch,
+ defaultObjectGenerationMode: 'json',
+ errorStructure: perplexityErrorStructure,
+ metadataExtractor: perplexityMetadataExtractor,
+ supportsStructuredOutputs: true,
+ });
+ };
+
+ const provider = (
+ modelId: PerplexityChatModelId,
+ settings?: PerplexityChatSettings,
+ ) => createLanguageModel(modelId, settings);
+
+ provider.languageModel = createLanguageModel;
+ provider.chat = createLanguageModel;
+ provider.textEmbeddingModel = (modelId: string) => {
+ throw new NoSuchModelError({ modelId, modelType: 'textEmbeddingModel' });
+ };
+
+ return provider as PerplexityProvider;
+}
+
+export const perplexity = createPerplexity();
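The factory above returns a callable provider: `perplexity('sonar-pro')`, `perplexity.chat(...)`, and `perplexity.languageModel(...)` all route to the same model constructor. A minimal dependency-free sketch of that pattern (the model shape here is illustrative, not the real `LanguageModelV1`):

```typescript
// Sketch of the callable-provider pattern: a function with factory methods
// attached, so both `provider(id)` and `provider.chat(id)` work.
type SketchModel = { modelId: string; provider: string };

function createSketchProvider() {
  const createModel = (modelId: string): SketchModel => ({
    modelId,
    provider: 'perplexity.chat',
  });

  // Object.assign types the callable + attached methods in one step,
  // avoiding the cast the real implementation uses.
  return Object.assign((modelId: string) => createModel(modelId), {
    languageModel: createModel,
    chat: createModel,
  });
}

const sketch = createSketchProvider();
console.log(sketch('sonar').modelId === sketch.chat('sonar').modelId); // true
```

The real factory additionally attaches a `textEmbeddingModel` method that throws `NoSuchModelError`, since Perplexity offers no embedding models.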
diff --git a/packages/perplexity/tsconfig.json b/packages/perplexity/tsconfig.json
new file mode 100644
index 000000000000..8eee8f9f6a82
--- /dev/null
+++ b/packages/perplexity/tsconfig.json
@@ -0,0 +1,5 @@
+{
+ "extends": "./node_modules/@vercel/ai-tsconfig/ts-library.json",
+ "include": ["."],
+ "exclude": ["*/dist", "dist", "build", "node_modules"]
+}
diff --git a/packages/perplexity/tsup.config.ts b/packages/perplexity/tsup.config.ts
new file mode 100644
index 000000000000..3f92041b987c
--- /dev/null
+++ b/packages/perplexity/tsup.config.ts
@@ -0,0 +1,10 @@
+import { defineConfig } from 'tsup';
+
+export default defineConfig([
+ {
+ entry: ['src/index.ts'],
+ format: ['cjs', 'esm'],
+ dts: true,
+ sourcemap: true,
+ },
+]);
diff --git a/packages/perplexity/turbo.json b/packages/perplexity/turbo.json
new file mode 100644
index 000000000000..620b8380e744
--- /dev/null
+++ b/packages/perplexity/turbo.json
@@ -0,0 +1,12 @@
+{
+ "extends": [
+ "//"
+ ],
+ "tasks": {
+ "build": {
+ "outputs": [
+ "**/dist/**"
+ ]
+ }
+ }
+}
diff --git a/packages/perplexity/vitest.edge.config.js b/packages/perplexity/vitest.edge.config.js
new file mode 100644
index 000000000000..700660e913f5
--- /dev/null
+++ b/packages/perplexity/vitest.edge.config.js
@@ -0,0 +1,10 @@
+import { defineConfig } from 'vite';
+
+// https://vitejs.dev/config/
+export default defineConfig({
+ test: {
+ environment: 'edge-runtime',
+ globals: true,
+ include: ['**/*.test.ts', '**/*.test.tsx'],
+ },
+});
diff --git a/packages/perplexity/vitest.node.config.js b/packages/perplexity/vitest.node.config.js
new file mode 100644
index 000000000000..b1d14b21fc11
--- /dev/null
+++ b/packages/perplexity/vitest.node.config.js
@@ -0,0 +1,10 @@
+import { defineConfig } from 'vite';
+
+// https://vitejs.dev/config/
+export default defineConfig({
+ test: {
+ environment: 'node',
+ globals: true,
+ include: ['**/*.test.ts', '**/*.test.tsx'],
+ },
+});
diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml
index fa4124983f55..a39c6da7dda6 100644
--- a/pnpm-lock.yaml
+++ b/pnpm-lock.yaml
@@ -98,6 +98,9 @@ importers:
'@ai-sdk/openai-compatible':
specifier: 0.1.2
version: link:../../packages/openai-compatible
+ '@ai-sdk/perplexity':
+ specifier: 0.0.0
+ version: link:../../packages/perplexity
'@ai-sdk/provider':
specifier: 1.0.6
version: link:../../packages/provider
@@ -1571,6 +1574,34 @@ importers:
specifier: 3.23.8
version: 3.23.8
+ packages/perplexity:
+ dependencies:
+ '@ai-sdk/openai-compatible':
+ specifier: 0.1.2
+ version: link:../openai-compatible
+ '@ai-sdk/provider':
+ specifier: 1.0.6
+ version: link:../provider
+ '@ai-sdk/provider-utils':
+ specifier: 2.1.2
+ version: link:../provider-utils
+ devDependencies:
+ '@types/node':
+ specifier: ^18
+ version: 18.19.54
+ '@vercel/ai-tsconfig':
+ specifier: workspace:*
+ version: link:../../tools/tsconfig
+ tsup:
+ specifier: ^8
+ version: 8.3.0(jiti@2.4.0)(postcss@8.4.49)(tsx@4.19.2)(typescript@5.6.3)(yaml@2.5.0)
+ typescript:
+ specifier: 5.6.3
+ version: 5.6.3
+ zod:
+ specifier: 3.23.8
+ version: 3.23.8
+
packages/provider:
dependencies:
json-schema: