
🔥 llm-provider-errors


English | 中文

Stop guessing what `{"error_code": 17}` means. Unified error diagnostics for Chinese LLM APIs — get human-readable messages and actionable fix suggestions in seconds.

One function. Six providers. Two languages. Zero confusion.


🎯 The Problem

You're building with Chinese LLM APIs. Then this happens:

```json
{ "base_resp": { "status_code": 1004 } }
```

Is that auth? Rate limit? Billing? Each provider has its own error format, its own codes, its own conventions. You end up writing the same error-handling boilerplate six times.

llm-provider-errors fixes this. One call to diagnose() and you get a clear explanation + what to do about it.

⚡ Quick Start

```bash
npm install llm-provider-errors
```

```typescript
import { diagnose } from 'llm-provider-errors';

// Got a 429 from DeepSeek?
const result = diagnose('deepseek', 429);
console.log(result.message);  // "Too Many Requests — rate limit or quota exceeded."
console.log(result.hint);     // "Implement exponential backoff, reduce request frequency, or upgrade your plan."
console.log(result.severity); // "high"

// Got a provider-specific error in the response body?
const body = { error: { code: 'insufficient_quota', message: 'Quota exceeded' } };
const result2 = diagnose('deepseek', 402, body);
console.log(result2.providerCode); // "insufficient_quota"
console.log(result2.hint);         // "Top up your DeepSeek account at https://platform.deepseek.com/top_up."

// Need Chinese? 中文也行！
const result3 = diagnose('qwen', 429, undefined, { locale: 'zh' });
console.log(result3.message); // "请求过多 — 超出速率限制或配额。"
console.log(result3.hint);    // "请实现指数退避重试、降低请求频率，或升级套餐。"
```

๐Ÿข Supported Providers

| Provider | Key | API Docs |
|----------|-----|----------|
| MiniMax | `minimax` | platform.minimaxi.com |
| Kimi / Moonshot | `moonshot` | platform.moonshot.cn |
| DeepSeek | `deepseek` | platform.deepseek.com |
| Qwen / Tongyi | `qwen` | DashScope Docs |
| GLM / Zhipu AI | `glm` | open.bigmodel.cn |
| Baidu / ERNIE | `baidu` | Qianfan Docs |

📖 API Reference

diagnose(provider, statusCode, responseBody?, options?)

Diagnose an LLM API error and return structured information.

Parameters:

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `provider` | `Provider` | ✅ | Provider name (`'minimax'`, `'moonshot'`, `'deepseek'`, `'qwen'`, `'glm'`, `'baidu'`) |
| `statusCode` | `number` | ✅ | HTTP status code from the API response |
| `responseBody` | `unknown` | ❌ | Response body (string or object) for provider-specific error extraction |
| `options` | `DiagnoseOptions` | ❌ | `{ locale: 'en' \| 'zh' }`; defaults to `'en'` |

Returns: `DiagnosisResult`

```typescript
interface DiagnosisResult {
  provider: Provider;      // Which provider
  code: number;            // HTTP status code
  message: string;         // Human-readable error message
  hint: string;            // Actionable fix suggestion
  severity: Severity;      // 'low' | 'medium' | 'high' | 'critical'
  providerCode?: string;   // Provider-specific error code (if extracted)
  detail?: string;         // Raw detail from provider response
}
```
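The `severity` field is designed to drive alerting and logging decisions. A minimal sketch of one way to route diagnoses to log levels (the mapping below is illustrative, not part of the library):

```typescript
// Severity mirrors the library's documented union type.
type Severity = 'low' | 'medium' | 'high' | 'critical';
type LogLevel = 'debug' | 'warn' | 'error' | 'fatal';

// Illustrative mapping: how severe a diagnosis is determines how loudly to log it.
const severityToLevel: Record<Severity, LogLevel> = {
  low: 'debug',
  medium: 'warn',
  high: 'error',
  critical: 'fatal',
};

function logLevelFor(severity: Severity): LogLevel {
  return severityToLevel[severity];
}
```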

isSupported(provider)

Check if a provider is supported.

```typescript
import { isSupported } from 'llm-provider-errors';

isSupported('deepseek'); // true
isSupported('openai');   // false
```

providers

Array of all supported provider names.

```typescript
import { providers } from 'llm-provider-errors';
console.log(providers); // ['minimax', 'moonshot', 'deepseek', 'qwen', 'glm', 'baidu']
```
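Since `diagnose` only accepts known provider keys, the `providers` array is handy for validating configuration at startup. A sketch (the array is mirrored inline so the snippet is self-contained; in real code, import `providers` from the package):

```typescript
// Mirrors the package's exported `providers` list.
const providerKeys = ['minimax', 'moonshot', 'deepseek', 'qwen', 'glm', 'baidu'] as const;
type ProviderKey = (typeof providerKeys)[number];

// Narrow an arbitrary config string to a supported provider, failing fast otherwise.
function asProvider(name: string): ProviderKey {
  if (!(providerKeys as readonly string[]).includes(name)) {
    throw new Error(`Unsupported provider "${name}"; expected one of: ${providerKeys.join(', ')}`);
  }
  return name as ProviderKey;
}
```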

🔧 How It Works

  1. Provider-specific codes first — If you pass a response body, the library tries to extract the provider's own error code (e.g., Baidu's `error_code: 17`, MiniMax's `base_resp.status_code`) and map it to a detailed diagnosis.

  2. HTTP status fallback — If no provider-specific code is found, it falls back to standard HTTP status code mappings with provider-aware messaging.

  3. Generic fallback — For completely unknown errors, you still get a structured result with the status code and severity.
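The lookup chain above can be sketched as follows (function names and the extraction rules shown are illustrative simplifications, not the library's internals):

```typescript
interface SketchDiagnosis { code: number; providerCode?: string; message: string; }

// Step 1: try to pull a provider-specific code out of the response body.
function extractProviderCode(provider: string, body: unknown): string | undefined {
  if (typeof body !== 'object' || body === null) return undefined;
  const b = body as Record<string, any>;
  if (provider === 'baidu' && typeof b.error_code === 'number') return String(b.error_code);
  if (provider === 'minimax' && typeof b.base_resp?.status_code === 'number') {
    return String(b.base_resp.status_code);
  }
  return undefined;
}

// Step 2: standard HTTP status mappings (a tiny excerpt for illustration).
const httpMessages: Record<number, string> = {
  401: 'Authentication failed',
  429: 'Too Many Requests',
};

// Step 3: anything unmapped still yields a structured result.
function diagnoseSketch(provider: string, status: number, body?: unknown): SketchDiagnosis {
  return {
    code: status,
    providerCode: extractProviderCode(provider, body),
    message: httpMessages[status] ?? `HTTP ${status} from ${provider}`,
  };
}
```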

💡 Real-World Usage

With try/catch

```typescript
import { diagnose } from 'llm-provider-errors';

try {
  const response = await fetch('https://api.deepseek.com/chat/completions', { ... });
  if (!response.ok) {
    const body = await response.json().catch(() => undefined);
    const diagnosis = diagnose('deepseek', response.status, body);

    if (diagnosis.severity === 'critical') {
      // Alert the team, this needs attention
      alertOps(diagnosis);
    }

    throw new Error(`${diagnosis.message}\n💡 ${diagnosis.hint}`);
  }
} catch (err) {
  // ...
}
```

With Axios interceptors

```typescript
import axios from 'axios';
import { diagnose } from 'llm-provider-errors';

const client = axios.create({ baseURL: 'https://open.bigmodel.cn/api/paas/v4' });

client.interceptors.response.use(undefined, (error) => {
  if (error.response) {
    const diagnosis = diagnose('glm', error.response.status, error.response.data, { locale: 'zh' });
    error.diagnosis = diagnosis;
    console.error(`[GLM Error] ${diagnosis.message}\n💡 ${diagnosis.hint}`);
  }
  return Promise.reject(error);
});
```
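For transient, high-severity failures such as rate limits, a diagnosis can also decide whether a request is worth retrying. A generic backoff helper sketch (in real code, the `shouldRetry` predicate would inspect `diagnose(...).severity` or `providerCode`):

```typescript
// Retry an async operation with exponential backoff while `shouldRetry` approves.
async function withBackoff<T>(
  fn: () => Promise<T>,
  shouldRetry: (err: unknown) => boolean,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= attempts - 1 || !shouldRetry(err)) throw err;
      // Wait 500 ms, 1 s, 2 s, ... between attempts.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
}
```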

🤝 Contributing

Contributions are welcome! Here's how:

  1. Fork the repo
  2. Create a feature branch: `git checkout -b feat/add-provider`
  3. Commit your changes: `git commit -m 'feat: add new provider'`
  4. Push to your branch: `git push origin feat/add-provider`
  5. Open a Pull Request

Adding a new provider

  1. Create `src/providers/your-provider.ts` implementing the `ProviderHandler` interface
  2. Add the provider name to the `Provider` type in `src/types.ts`
  3. Register it in `src/index.ts`
  4. Add tests in `test/index.test.ts`
  5. Update the README tables

Improving error mappings

Found an error code we missed? PRs that add real-world error codes with accurate messages and hints are especially valued. Please reference the provider's official documentation.

📄 License

MIT © Shuyu Zhang


Built with ❤️ for everyone fighting LLM API errors at 3 AM
