A Model Context Protocol (MCP) server that demonstrates how to incorporate comprehensive observability into MCP servers using Sentry, OpenTelemetry, and Braintrust.
- File Operations: Read files, write files, list directories
- Data Processing: JSON manipulation, data transformation
- HTTP Client: URL fetching, webhooks, health checks
- Server Status: Real-time server metrics and performance data
- Configuration: Server and observability configuration details
- Health Check: Service health monitoring endpoint
Sentry integration provides:
- Error tracking and performance monitoring
- Automatic breadcrumb collection
- User context and custom tags
- Transaction performance tracking
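A minimal sketch of the Sentry setup (assuming the `@sentry/node` SDK; since Sentry is optional, initialization is skipped when no DSN is present):

```typescript
import * as Sentry from "@sentry/node";

// Sentry is optional: only initialize when a DSN is configured.
if (process.env.SENTRY_DSN) {
  Sentry.init({
    dsn: process.env.SENTRY_DSN,
    environment: process.env.SENTRY_ENVIRONMENT ?? "development",
    tracesSampleRate: 1.0, // capture all transactions in development
  });

  // Custom tags attach searchable context to every event.
  Sentry.setTag("service", "mcp-observability-server");
}
```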
OpenTelemetry integration provides:
- Distributed tracing with Jaeger export
- Custom metrics with Prometheus export
- Automatic instrumentation of Node.js libraries
- Custom spans for tool executions
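A minimal sketch of the tracing and metrics setup, assuming the standard `@opentelemetry/sdk-node` packages (the Prometheus port 9464 is the exporter's default and an assumption here):

```typescript
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { JaegerExporter } from "@opentelemetry/exporter-jaeger";
import { PrometheusExporter } from "@opentelemetry/exporter-prometheus";

// Traces are exported to Jaeger; metrics are exposed on a Prometheus
// scrape endpoint. Auto-instrumentation covers common Node.js libraries.
const sdk = new NodeSDK({
  serviceName: process.env.OTEL_SERVICE_NAME ?? "mcp-observability-server",
  traceExporter: new JaegerExporter({
    endpoint: process.env.OTEL_EXPORTER_JAEGER_ENDPOINT,
  }),
  metricReader: new PrometheusExporter({ port: 9464 }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();
```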
Braintrust integration provides:
- AI/ML model evaluation and logging
- Tool execution tracking
- Performance analytics
- Experiment management
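A minimal sketch of Braintrust logging, assuming the `braintrust` TypeScript SDK; the logged fields below are illustrative, not the server's exact schema:

```typescript
import { initLogger } from "braintrust";

// Assumes the Braintrust TypeScript SDK; the project name matches
// BRAINTRUST_PROJECT_NAME from .env.
const logger = initLogger({
  projectName: process.env.BRAINTRUST_PROJECT_NAME ?? "mcp-observability",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

// Log a single tool execution for later analysis in the Braintrust UI.
logger.log({
  input: { tool: "read_file", path: "/tmp/example.txt" },
  output: { success: true },
  metadata: { durationMs: 12 },
});
```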
Install dependencies:

```bash
npm install
```
Copy the example environment file:
```bash
cp .env.example .env
```
Edit `.env` with your configuration:
```bash
# Sentry (optional)
SENTRY_DSN=your_sentry_dsn_here
SENTRY_ENVIRONMENT=development

# Braintrust (optional)
BRAINTRUST_API_KEY=your_braintrust_api_key_here
BRAINTRUST_PROJECT_NAME=mcp-observability

# OpenTelemetry (configured by default)
OTEL_SERVICE_NAME=mcp-observability-server
OTEL_EXPORTER_JAEGER_ENDPOINT=http://localhost:14268/api/traces
```
Run the server:

```bash
# Development mode
npm run dev

# Production mode
npm run build
npm start
```
Add to your Claude Desktop MCP configuration:
```json
{
  "mcpServers": {
    "observability-server": {
      "command": "node",
      "args": ["/path/to/mcp-observability-server/dist/index.js"]
    }
  }
}
```
- Create a Sentry project at sentry.io
- Copy your DSN to the `SENTRY_DSN` environment variable
- The server will automatically track errors and performance
- Run Jaeger locally:
```bash
docker run -d --name jaeger \
  -p 16686:16686 \
  -p 14268:14268 \
  jaegertracing/all-in-one:latest
```
- Access Jaeger UI at http://localhost:16686
- Traces will be automatically exported
- Sign up at braintrust.dev
- Get your API key and project name
- Set the environment variables
- Tool executions will be logged for analysis
Example tool calls:

Read a file:

```json
{
  "tool": "read_file",
  "arguments": {
    "path": "/path/to/file.txt",
    "encoding": "utf8"
  }
}
```

Process JSON data:

```json
{
  "tool": "process_json",
  "arguments": {
    "data": "{\"users\": [{\"name\": \"Alice\"}, {\"name\": \"Bob\"}]}",
    "operation": "extract_keys",
    "path": "users"
  }
}
```

Fetch a URL:

```json
{
  "tool": "fetch_url",
  "arguments": {
    "url": "https://api.github.com/users/octocat",
    "method": "GET"
  }
}
```

Example resource reads:

Get server status:

```json
{ "resource": "status://server" }
```

Health check:

```json
{ "resource": "status://health" }
```

Configuration:

```json
{ "resource": "config://server" }
```
```
src/
├── observability/        # Observability integrations
│   ├── sentry.ts         # Sentry error tracking
│   ├── opentelemetry.ts  # OpenTelemetry tracing
│   ├── braintrust.ts     # Braintrust logging
│   └── index.ts          # Unified observability wrapper
├── tools/                # MCP tools
│   ├── fileOperations.ts # File I/O tools
│   ├── dataProcessing.ts # Data manipulation tools
│   ├── httpClient.ts     # HTTP client tools
│   └── index.ts          # Tool registry
├── resources/            # MCP resources
│   ├── status.ts         # Server status resource
│   ├── config.ts         # Configuration resource
│   └── index.ts          # Resource registry
└── index.ts              # Main server entry point
```
All tools are automatically wrapped with the `withObservability` decorator, which:
- Creates OpenTelemetry spans
- Starts Sentry transactions
- Logs to Braintrust
- Records performance metrics
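A simplified sketch of what such a wrapper might look like (the shape here is hypothetical; the actual implementation lives in `src/observability/index.ts`):

```typescript
import { trace, SpanStatusCode } from "@opentelemetry/api";
import * as Sentry from "@sentry/node";

const tracer = trace.getTracer("mcp-observability-server");

// Hypothetical shape: run a tool handler inside an OpenTelemetry span,
// report failures to Sentry, and record basic timing.
// (Braintrust logging omitted for brevity.)
export function withObservability<T>(
  name: string,
  handler: (args: unknown) => Promise<T>
): (args: unknown) => Promise<T> {
  return (args) =>
    tracer.startActiveSpan(`tool.${name}`, async (span) => {
      const start = Date.now();
      try {
        const result = await handler(args);
        span.setStatus({ code: SpanStatusCode.OK });
        return result;
      } catch (err) {
        // Error spans in traces plus Sentry capture with tool context.
        span.recordException(err as Error);
        span.setStatus({ code: SpanStatusCode.ERROR });
        Sentry.captureException(err, { tags: { tool: name } });
        throw err;
      } finally {
        span.setAttribute("tool.duration_ms", Date.now() - start);
        span.end();
      }
    });
}
```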
- Automatic error capture in Sentry with context
- Error spans in OpenTelemetry traces
- Failed execution logging in Braintrust
- Response time tracking
- Memory usage monitoring
- Request success/failure rates
- Custom metrics via OpenTelemetry
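For instance, custom metrics might be recorded through the OpenTelemetry metrics API (instrument names below are hypothetical):

```typescript
import { metrics } from "@opentelemetry/api";

const meter = metrics.getMeter("mcp-observability-server");

// Hypothetical instruments for the metrics listed above.
const toolDuration = meter.createHistogram("tool.duration_ms", {
  description: "Tool execution time in milliseconds",
});
const toolCalls = meter.createCounter("tool.calls", {
  description: "Tool invocations, labeled by outcome",
});

// Record one execution: response time plus a success/failure label.
toolDuration.record(42, { tool: "read_file" });
toolCalls.add(1, { tool: "read_file", success: true });
```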
- End-to-end request tracing
- Tool execution spans
- External API call tracing (see the sketch below)
- Performance bottleneck identification
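External API calls can be traced with a child span nested under the active tool span, so each outbound request shows up as a step in the end-to-end trace. A sketch, assuming Node 18+'s global `fetch`:

```typescript
import { trace } from "@opentelemetry/api";

const tracer = trace.getTracer("mcp-observability-server");

// Wraps an outbound HTTP call in its own span. Because it runs inside
// the active tool span, it appears as a nested step in the trace.
async function tracedFetch(url: string): Promise<Response> {
  return tracer.startActiveSpan("http.fetch", async (span) => {
    span.setAttribute("http.url", url);
    try {
      const response = await fetch(url);
      span.setAttribute("http.status_code", response.status);
      return response;
    } finally {
      span.end();
    }
  });
}
```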
```bash
# Type checking
npm run typecheck

# Linting
npm run lint

# Testing
npm test
```
MIT