feat (provider/perplexity): add Perplexity provider (#4501)
Co-authored-by: ybelakov <[email protected]>
shaper and ybelakov authored Jan 24, 2025
1 parent 482df11 commit 5a5b668
Showing 26 changed files with 903 additions and 86 deletions.
5 changes: 5 additions & 0 deletions .changeset/lovely-weeks-sip.md
@@ -0,0 +1,5 @@
---
'@ai-sdk/perplexity': patch
---

feat (provider/perplexity): add Perplexity provider
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -20,6 +20,7 @@ You can find the changelogs for the individual packages in their respective `CHA
- [@ai-sdk/mistral](./packages/mistral/CHANGELOG.md)
- [@ai-sdk/openai](./packages/openai/CHANGELOG.md)
- [@ai-sdk/openai-compatible](./packages/openai-compatible/CHANGELOG.md)
- [@ai-sdk/perplexity](./packages/perplexity/CHANGELOG.md)
- [@ai-sdk/togetherai](./packages/togetherai/CHANGELOG.md)
- [@ai-sdk/xai](./packages/xai/CHANGELOG.md)

6 changes: 3 additions & 3 deletions content/docs/02-foundations/02-providers-and-models.mdx
@@ -22,7 +22,7 @@ Here is an overview of the AI SDK Provider Architecture:

## AI SDK Providers

The AI SDK comes with several providers that you can use to interact with different language models:
The AI SDK comes with a wide range of providers that you can use to interact with different language models:

- [OpenAI Provider](/providers/ai-sdk-providers/openai) (`@ai-sdk/openai`)
- [Azure OpenAI Provider](/providers/ai-sdk-providers/azure) (`@ai-sdk/azure`)
@@ -39,10 +39,10 @@ The AI SDK comes with several providers that you can use to interact with differ
- [DeepSeek Provider](/providers/ai-sdk-providers/deepseek) (`@ai-sdk/deepseek`)
- [Cerebras Provider](/providers/ai-sdk-providers/cerebras) (`@ai-sdk/cerebras`)
- [Groq Provider](/providers/ai-sdk-providers/groq) (`@ai-sdk/groq`)
- [Perplexity Provider](/providers/ai-sdk-providers/perplexity) (`@ai-sdk/perplexity`)

You can also use the OpenAI provider with OpenAI-compatible APIs:
You can also use the [OpenAI Compatible provider](/providers/openai-compatible-providers) with OpenAI-compatible APIs:

- [Perplexity](/providers/ai-sdk-providers/perplexity)
- [LM Studio](/providers/openai-compatible-providers/lmstudio)
- [Baseten](/providers/openai-compatible-providers/baseten)

1 change: 0 additions & 1 deletion content/docs/02-guides/03-llama-3_1.mdx
@@ -53,7 +53,6 @@ const { text } = await generateText({
Llama 3.1 is available to use with many AI SDK providers including
[Groq](/providers/ai-sdk-providers/groq), [Amazon
Bedrock](/providers/ai-sdk-providers/amazon-bedrock),
[Perplexity](/providers/ai-sdk-providers/perplexity),
[Baseten](/providers/openai-compatible-providers/baseten),
[Fireworks](/providers/ai-sdk-providers/fireworks), and more.
</Note>
134 changes: 134 additions & 0 deletions content/providers/01-ai-sdk-providers/70-perplexity.mdx
@@ -0,0 +1,134 @@
---
title: Perplexity
description: Learn how to use Perplexity's Sonar API with the AI SDK.
---

# Perplexity Provider

The [Perplexity](https://sonar.perplexity.ai) provider offers access to the Sonar API, a language model that uniquely combines real-time web search with natural language processing. Each response is grounded in current web data and includes detailed citations, making it ideal for research, fact-checking, and obtaining up-to-date information.

API keys can be obtained from the [Perplexity Platform](https://docs.perplexity.ai).

## Setup

The Perplexity provider is available via the `@ai-sdk/perplexity` module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn']}>
  <Tab>
    <Snippet text="pnpm add @ai-sdk/perplexity" dark />
  </Tab>
  <Tab>
    <Snippet text="npm install @ai-sdk/perplexity" dark />
  </Tab>
  <Tab>
    <Snippet text="yarn add @ai-sdk/perplexity" dark />
  </Tab>
</Tabs>

## Provider Instance

You can import the default provider instance `perplexity` from `@ai-sdk/perplexity`:

```ts
import { perplexity } from '@ai-sdk/perplexity';
```

For custom configuration, you can import `createPerplexity` and create a provider instance with your settings:

```ts
import { createPerplexity } from '@ai-sdk/perplexity';

const perplexity = createPerplexity({
  apiKey: process.env.PERPLEXITY_API_KEY ?? '',
});
```

You can use the following optional settings to customize the Perplexity provider instance (a combined sketch follows the list):

- **baseURL** _string_

Use a different URL prefix for API calls.
The default prefix is `https://api.perplexity.ai`.

- **apiKey** _string_

API key that is being sent using the `Authorization` header. It defaults to
the `PERPLEXITY_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

Custom headers to include in the requests.

- **fetch** _(input: RequestInfo, init?: RequestInit) => Promise&lt;Response&gt;_

Custom [fetch](https://developer.mozilla.org/en-US/docs/Web/API/fetch) implementation.
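
Pulling these options together, here is a minimal sketch of a customized provider instance. The extra header and the logging `fetch` wrapper are purely illustrative assumptions, not required configuration:

```ts
import { createPerplexity } from '@ai-sdk/perplexity';

const perplexity = createPerplexity({
  // explicit defaults shown for clarity
  baseURL: 'https://api.perplexity.ai',
  apiKey: process.env.PERPLEXITY_API_KEY ?? '',
  // hypothetical custom header, e.g. for internal request tagging
  headers: { 'x-request-source': 'docs-example' },
  // custom fetch wrapper that logs the request URL before delegating to the global fetch
  fetch: async (input, init) => {
    console.log('Calling Perplexity:', input);
    return fetch(input, init);
  },
});
```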

## Language Models

You can create Perplexity models using a provider instance:

```ts
import { perplexity } from '@ai-sdk/perplexity';
import { generateText } from 'ai';

const { text } = await generateText({
  model: perplexity('sonar-pro'),
  prompt: 'What are the latest developments in quantum computing?',
});
```

Perplexity models can be used in the `streamText` and `streamUI` functions (see
[AI SDK Core](/docs/ai-sdk-core) and [AI SDK RSC](/docs/ai-sdk-rsc)).
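
As a rough sketch of the streaming case (modeled on the `examples/ai-core/src/stream-text/perplexity.ts` script added in this commit; the prompt is just an example):

```ts
import { perplexity } from '@ai-sdk/perplexity';
import { streamText } from 'ai';

const result = streamText({
  model: perplexity('sonar'),
  prompt: 'List the top 5 San Francisco news stories from the past week.',
});

// stream the generated text to stdout as it arrives
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```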

### Provider Metadata

The Perplexity provider includes additional experimental metadata in the response through `experimental_providerMetadata`:

```ts
const result = await generateText({
  model: perplexity('sonar-pro'),
  prompt: 'What are the latest developments in quantum computing?',
});

console.log(result.experimental_providerMetadata);
// Example output:
// {
//   perplexity: {
//     citations: [
//       'https://www.sfchronicle.com',
//       'https://www.cbsnews.com/sanfrancisco/',
//     ],
//     usage: { citationTokens: 5286, numSearchQueries: 1 },
//   },
// }
```

The metadata includes:

- `citations`: Array of URLs used as sources for the response
- `usage`: Object containing `citationTokens` and `numSearchQueries` metrics
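
As a small usage sketch, the citation URLs can be read straight off the metadata object from the example above (the optional chaining guards against responses without metadata):

```ts
const metadata = result.experimental_providerMetadata?.perplexity;
console.log('Sources:', metadata?.citations);
```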

<Note>
For more details about Perplexity's citations, see the [Perplexity chat
completion docs](https://docs.perplexity.ai/api-reference/chat-completions).
</Note>

## Model Capabilities

| Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| ----------- | ------------------- | ------------------- | ------------------- | ------------------- |
| `sonar-pro` | <Cross size={18} /> | <Cross size={18} /> | <Cross size={18} /> | <Cross size={18} /> |
| `sonar` | <Cross size={18} /> | <Cross size={18} /> | <Cross size={18} /> | <Cross size={18} /> |

### Key Features

- **Real-time Web Search**: Both models provide grounded responses using real-time web search
- **Citations**: Sonar Pro provides 2x more citations than standard Sonar
- **Data Privacy**: No training on customer data
- **Self-serve API**: Immediate access with scalable pricing
- **Advanced Queries**: Support for complex queries and follow-up questions

<Note>
Please see the [Perplexity docs](https://docs.perplexity.ai) for detailed API
documentation and the latest updates.
</Note>
71 changes: 0 additions & 71 deletions content/providers/02-openai-compatible-providers/10-perplexity.mdx

This file was deleted.

1 change: 0 additions & 1 deletion content/providers/02-openai-compatible-providers/index.mdx
@@ -11,7 +11,6 @@ Below we focus on the general setup and provider instance creation. You can also

We provide detailed documentation for the following OpenAI compatible providers:

- [Perplexity](/providers/openai-compatible-providers/perplexity)
- [LM Studio](/providers/openai-compatible-providers/lmstudio)
- [Baseten](/providers/openai-compatible-providers/baseten)

1 change: 1 addition & 0 deletions examples/ai-core/package.json
@@ -17,6 +17,7 @@
"@ai-sdk/mistral": "1.1.2",
"@ai-sdk/openai": "1.1.2",
"@ai-sdk/openai-compatible": "0.1.2",
"@ai-sdk/perplexity": "0.0.0",
"@ai-sdk/provider": "1.0.6",
"@ai-sdk/replicate": "0.1.2",
"@ai-sdk/togetherai": "0.1.2",
30 changes: 30 additions & 0 deletions examples/ai-core/src/e2e/perplexity.test.ts
@@ -0,0 +1,30 @@
import 'dotenv/config';
import { expect } from 'vitest';
import {
  perplexity as provider,
  PerplexityErrorData,
} from '@ai-sdk/perplexity';
import {
  createFeatureTestSuite,
  createLanguageModelWithCapabilities,
} from './feature-test-suite';
import { APICallError } from '@ai-sdk/provider';

const createChatModel = (modelId: string) =>
  createLanguageModelWithCapabilities(provider.chat(modelId));

createFeatureTestSuite({
  name: 'perplexity',
  models: {
    invalidModel: provider.chat('no-such-model'),
    languageModels: [createChatModel('sonar-pro'), createChatModel('sonar')],
  },
  timeout: 30000,
  customAssertions: {
    errorValidator: (error: APICallError) => {
      expect((error.data as PerplexityErrorData).code).toBe(
        'Some requested entity was not found',
      );
    },
  },
})();
18 changes: 18 additions & 0 deletions examples/ai-core/src/generate-text/perplexity.ts
@@ -0,0 +1,18 @@
import 'dotenv/config';
import { perplexity } from '@ai-sdk/perplexity';
import { generateText } from 'ai';

async function main() {
  const result = await generateText({
    model: perplexity('sonar-pro'),
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  console.log(result.text);
  console.log();
  console.log('Token usage:', result.usage);
  console.log('Finish reason:', result.finishReason);
  console.log('Metadata:', result.experimental_providerMetadata);
}

main().catch(console.error);
13 changes: 3 additions & 10 deletions examples/ai-core/src/stream-text/perplexity.ts
@@ -1,18 +1,10 @@
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { perplexity } from '@ai-sdk/perplexity';
import { streamText } from 'ai';
import 'dotenv/config';

const perplexity = createOpenAICompatible({
  name: 'perplexity',
  headers: {
    Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY ?? ''}`,
  },
  baseURL: 'https://api.perplexity.ai/',
});

async function main() {
  const result = streamText({
    model: perplexity('llama-3.1-sonar-small-128k-online'),
    model: perplexity('sonar-pro'),
    prompt:
      'List the top 5 San Francisco news from the past week.' +
      'You must include the date of each article.',
@@ -25,6 +17,7 @@ async function main() {
  console.log();
  console.log('Token usage:', await result.usage);
  console.log('Finish reason:', await result.finishReason);
  console.log('Metadata:', await result.experimental_providerMetadata);
}

main().catch(console.error);
1 change: 1 addition & 0 deletions packages/perplexity/CHANGELOG.md
@@ -0,0 +1 @@
# @ai-sdk/perplexity
47 changes: 47 additions & 0 deletions packages/perplexity/README.md
@@ -0,0 +1,47 @@
# AI SDK - Perplexity Provider

The **[Perplexity provider](https://sdk.vercel.ai/providers/ai-sdk-providers/perplexity)** for the [AI SDK](https://sdk.vercel.ai/docs)
contains language model support for Perplexity's Sonar API, a powerful answer engine with real-time web search capabilities.

## Features

- Real-time web search grounding for accurate, up-to-date responses
- Support for advanced queries and follow-up questions
- Multiple tiers available:
  - **Sonar Pro**: Enhanced capabilities for complex tasks with 2x more citations
  - **Sonar**: Lightweight offering optimized for speed and cost
- Industry-leading answer quality
- Data privacy - no training on customer data
- Self-serve API access with scalable pricing

## Setup

The Perplexity provider is available in the `@ai-sdk/perplexity` module. You can install it with:

```bash
npm i @ai-sdk/perplexity
```

## Provider Instance

You can import the default provider instance `perplexity` from `@ai-sdk/perplexity`:

```ts
import { perplexity } from '@ai-sdk/perplexity';
```

## Example

```ts
import { perplexity } from '@ai-sdk/perplexity';
import { generateText } from 'ai';

const { text } = await generateText({
  model: perplexity('sonar-pro'),
  prompt: 'What are the latest developments in quantum computing?',
});
```

## Documentation

Please check out the **[Perplexity provider documentation](https://sdk.vercel.ai/providers/ai-sdk-providers/perplexity)** for more information.