An MCP (Model Context Protocol) server that provides integration with SAP Business Data Cloud (BDC) Connect SDK. This server enables AI assistants like Claude to interact with SAP BDC for data sharing, Delta Sharing protocol operations, and data product management.
Status: ✅ Released on PyPI - v0.2.0 (2026-01-10)
New in v0.2.0: ✨ Added `provision_share` (end-to-end orchestration) and `validate_share_readiness` (pre-flight validation) tools!
This MCP server provides 7 powerful tools for SAP BDC operations:
- Create/Update Shares: Manage data shares with ORD metadata
- CSN Schema Management: Configure shares using Core Schema Notation (CSN)
- Data Product Publishing: Publish and unpublish data products
- Share Deletion: Remove and withdraw shared resources
- CSN Template Generation: Auto-generate CSN templates from Databricks shares
- End-to-End Provisioning ✨: One-step share creation, granting, and registration
- Pre-flight Validation ✨: Validate shares before registration to prevent errors
- Python 3.9+ (Python 3.11+ recommended for local development)
- Access to a Databricks environment
- SAP Business Data Cloud account
- Databricks recipient configured for Delta Sharing
- For local development: Databricks personal access token
Choose your preferred language/platform:
```bash
pip install sap-bdc-mcp-server
```

```bash
npm install @mariodefelize/sap-bdc-mcp-server
```

Note: The npm package requires Python 3.9+ to be installed, as it wraps the Python MCP server.
See NPM_README.md for full Node.js/TypeScript documentation.
```bash
# Clone the repository
git clone https://github.com/MarioDeFelipe/sap-bdc-mcp-server.git
cd sap-bdc-mcp-server

# Install Python package in development mode
pip install -e .

# Install npm dependencies (optional, for Node.js development)
npm install
```

Create a `.env` file in the project root:
```bash
# Databricks Configuration
DATABRICKS_RECIPIENT_NAME=your_recipient_name
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=your_databricks_token

# Optional
LOG_LEVEL=INFO
```

The server will automatically use `LocalDatabricksClient`, which works without `dbutils`.
If running inside Databricks notebooks, only set:
```bash
DATABRICKS_RECIPIENT_NAME=your_recipient_name
LOG_LEVEL=INFO
```
The server will automatically detect the notebook environment and use dbutils.
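The auto-detection can be sketched as follows. This is an assumed version of the logic, not the server's actual code: Databricks runtimes set `DATABRICKS_RUNTIME_VERSION`, so its presence is used here as the notebook signal, while local mode expects the `.env` credentials shown above.

```python
import os

def detect_mode() -> str:
    """Sketch of the environment auto-detection described above (assumed logic)."""
    if os.environ.get("DATABRICKS_RUNTIME_VERSION"):
        return "notebook"  # dbutils is available: use the SDK's DatabricksClient
    if os.environ.get("DATABRICKS_HOST") and os.environ.get("DATABRICKS_TOKEN"):
        return "local"     # fall back to LocalDatabricksClient
    raise RuntimeError("No Databricks credentials found; see the .env setup above")
```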
The MCP server runs as a stdio-based service:
```bash
python -m sap_bdc_mcp.server
```

Or using the installed script:

```bash
sap-bdc-mcp
```

Add this server to your Claude Desktop configuration file:
On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
On Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "sap-bdc": {
      "command": "python",
      "args": ["-m", "sap_bdc_mcp.server"],
      "env": {
        "DATABRICKS_RECIPIENT_NAME": "your_recipient_name"
      }
    }
  }
}
```

For Node.js/TypeScript applications, use the npm package:
```typescript
import { createSapBdcMcpClient } from '@mariodefelize/sap-bdc-mcp-server';

const client = await createSapBdcMcpClient({
  env: {
    DATABRICKS_HOST: process.env.DATABRICKS_HOST,
    DATABRICKS_TOKEN: process.env.DATABRICKS_TOKEN,
    DATABRICKS_RECIPIENT_NAME: process.env.DATABRICKS_RECIPIENT_NAME,
  },
});

// Validate a share
const validation = await client.validateShareReadiness({
  share_name: 'my_share',
});
console.log(validation);

await client.close();
```

See NPM_README.md for complete Node.js/TypeScript documentation and examples.
Alternatively, if installed in a virtual environment:
```json
{
  "mcpServers": {
    "sap-bdc": {
      "command": "C:\\path\\to\\venv\\Scripts\\python.exe",
      "args": ["-m", "sap_bdc_mcp.server"],
      "env": {
        "DATABRICKS_RECIPIENT_NAME": "your_recipient_name"
      }
    }
  }
}
```

Create or update a data share with ORD metadata.
Parameters:
- `share_name` (required): Name of the share
- `ord_metadata` (optional): ORD metadata object
- `tables` (optional): Array of table names to include
Example:
```json
{
  "share_name": "customer_data_share",
  "ord_metadata": {
    "title": "Customer Data",
    "description": "Shared customer information"
  },
  "tables": ["customers", "orders"]
}
```

Create or update a share using CSN format.
Parameters:
- `share_name` (required): Name of the share
- `csn_schema` (required): CSN schema definition object
Example:
```json
{
  "share_name": "product_share",
  "csn_schema": {
    "definitions": {
      "Products": {
        "kind": "entity",
        "elements": {
          "ID": {"type": "String"},
          "name": {"type": "String"}
        }
      }
    }
  }
}
```

Publish a data product to make it available for consumption.
Parameters:
- `share_name` (required): Name of the share
- `data_product_name` (required): Name of the data product
Example:
```json
{
  "share_name": "customer_data_share",
  "data_product_name": "CustomerAnalytics"
}
```

Delete a share and withdraw shared resources.
Parameters:
- `share_name` (required): Name of the share to delete
Example:
```json
{
  "share_name": "old_share"
}
```

Generate a CSN template from an existing Databricks share.
Parameters:
- `share_name` (required): Name of the Databricks share
Example:
```json
{
  "share_name": "existing_databricks_share"
}
```

One-step provisioning: Creates Databricks share, grants to recipient, and registers with SAP BDC in a single operation.
This tool orchestrates the complete workflow:
- Creates the Databricks Delta share
- Adds specified tables to the share
- Grants the share to your configured recipient
- Registers the share with SAP BDC
Parameters:
- `share_name` (required): Name of the share to create
- `tables` (required): Array of table names (format: `catalog.schema.table` or `schema.table`)
- `ord_metadata` (required): ORD metadata object
  - `title` (required): Display title for the share
  - `shortDescription`: Brief description
  - `description`: Detailed description
  - `version`: Version number (e.g., "1.0.0")
  - `releaseStatus`: Status (e.g., "active", "beta")
  - `tags`: Array of tags
- `comment` (optional): Comment for the Databricks share
- `auto_grant` (optional): Auto-grant to recipient (default: `true`)
- `skip_if_exists` (optional): Skip if share already exists (default: `true`)
Example:
```json
{
  "share_name": "customer_analytics",
  "tables": ["main.analytics.customers", "main.analytics.orders"],
  "ord_metadata": {
    "title": "Customer Analytics Data",
    "shortDescription": "Customer and order data for analytics",
    "description": "Comprehensive customer analytics dataset including customer profiles and order history",
    "version": "1.0.0",
    "releaseStatus": "active",
    "tags": ["analytics", "customer", "orders"]
  },
  "comment": "Customer analytics share for data consumers",
  "auto_grant": true
}
```

What it does:
- ✅ Creates Databricks share (or skips if exists)
- ✅ Adds all specified tables to the share
- ✅ Grants SELECT permission to your recipient
- ✅ Registers with SAP BDC with ORD metadata
- ✅ Provides step-by-step progress feedback
- ✅ If any step fails, shows what completed and what to do manually
Why use this instead of manual steps:
- Single command instead of 4 separate operations
- Automatic error handling and recovery guidance
- Idempotent - safe to retry if interrupted
- Clear visibility into each step's success/failure
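The step-by-step orchestration described above can be sketched as a small step runner. The step functions below are stand-in no-ops, not the server's real Databricks/BDC calls; only the run-until-failure, report-what-completed pattern is illustrated.

```python
def run_provisioning(share_name, tables, steps):
    """Run ordered provisioning steps, stopping at the first failure so the
    result shows what completed and what still needs to be done manually."""
    completed, failed = [], None
    for name, step in steps:
        try:
            step(share_name, tables)
            completed.append(name)
        except Exception as exc:
            failed = (name, str(exc))
            break  # later steps depend on earlier ones
    return {"completed": completed, "failed": failed}

# Hypothetical stand-ins for the four real operations:
steps = [
    ("create_share", lambda s, t: None),
    ("add_tables", lambda s, t: None),
    ("grant_to_recipient", lambda s, t: None),
    ("register_with_bdc", lambda s, t: None),
]
result = run_provisioning("customer_analytics", ["main.analytics.customers"], steps)
```

Because each step either succeeds or halts the run, retrying after an interruption simply re-runs the remaining steps, which is what makes the real tool safe to retry.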
Validate before you register: Check if a Databricks share is ready for BDC Connect operations.
This tool performs comprehensive pre-flight checks:
- ✅ Verifies the share exists in Databricks
- ✅ Checks if the share has tables/objects
- ✅ Validates the share is granted to your recipient
- ✅ Provides actionable next steps if validation fails
Parameters:
- `share_name` (required): Name of the share to validate
- `check_bdc_registration` (optional): Also check BDC registration status (default: `false`)
Example:
```json
{
  "share_name": "customer_data_share"
}
```

Success Response:
```
✅ Share 'customer_data_share' is READY for BDC Connect registration!

All checks passed:
✅ PASS Share 'customer_data_share' exists in Databricks
✅ PASS Share has 3 object(s)
✅ PASS Share is granted to recipient 'bdc-connect-12345'

Next step: Register with BDC using create_or_update_share('customer_data_share', ...)
```
Failure Response:
```
❌ Share 'test_share' is NOT ready for BDC Connect

Errors found:
❌ Share is empty - no tables added
❌ Share not granted to BDC Connect recipient 'bdc-connect-12345'

Required actions:
1. Add tables: w.shares.update(name='test_share', ...)
2. Grant share: GRANT SELECT ON SHARE test_share TO RECIPIENT `bdc-connect-12345`
```
Use Cases:
- Before registration: Validate a share before calling `create_or_update_share`
- Troubleshooting: Understand why registration failed
- Documentation: Generate a checklist of what's needed
- CI/CD pipelines: Automated validation before deployment
- Onboarding: Help new users understand the prerequisites
Why this matters:
- Prevents "trial and error" workflow - know upfront if share is ready
- Clear, actionable guidance instead of cryptic error messages
- Saves time by catching issues before attempting registration
- Validates all prerequisites in one call
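The pre-flight checks can be sketched as a list of named predicates over the share's metadata. The plain-dict share shape below is an assumption for illustration, not the server's real data model:

```python
def readiness_checks(share, recipient):
    """Sketch of the pre-flight checks described above, over a dict that
    stands in for the Databricks share metadata (an assumed shape)."""
    checks = [
        ("share exists", share is not None),
        ("share has objects", bool(share and share.get("objects"))),
        ("granted to recipient", bool(share and recipient in share.get("grants", []))),
    ]
    return {"ready": all(ok for _, ok in checks), "checks": checks}

share = {"objects": ["customers"], "grants": ["bdc-connect-12345"]}
report = readiness_checks(share, "bdc-connect-12345")
```

Running every check and returning the full list, rather than failing on the first problem, is what lets the real tool emit the complete actionable checklist shown in the failure response.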
The server uses:
- MCP SDK: For protocol implementation
- SAP BDC Connect SDK: For SAP Business Data Cloud operations
- Delta Sharing: Open protocol for secure data sharing
- ORD Protocol: For resource discovery and metadata
```bash
pytest
```

```
sap-bdc-mcp-server/
├── src/
│   └── sap_bdc_mcp/
│       ├── __init__.py
│       ├── server.py          # Main MCP server implementation
│       └── config.py          # Configuration management
├── pyproject.toml             # Project dependencies
├── .env.example               # Environment variable template
└── README.md                  # This file
```
The SAP BDC Connect SDK was originally designed to run inside Databricks notebooks, requiring access to `dbutils` (Databricks utilities). This made local development challenging.
We've created `LocalDatabricksClient`: a custom wrapper that extends the SAP BDC SDK to work without `dbutils`. This enables:
- ✅ Local development: Run on your machine without Databricks notebooks
- ✅ IDE integration: Use your favorite development tools
- ✅ Easier debugging: Standard Python debugging workflows
- ✅ CI/CD friendly: Works in automated pipelines
- ✅ Claude Desktop integration: Direct MCP server usage
The `LocalDatabricksClient` class:
- Bypasses the `dbutils` requirement: Accepts workspace URL and API token directly
- Reads from the `.env` file: No notebook context needed
- Auto-detects mode: Automatically uses brownfield (BDC Connect) or Databricks Connect mode
- Maintains compatibility: Fully compatible with the SAP BDC SDK API
- Clear error messages: Helpful guidance if configuration is missing
```python
from sap_bdc_mcp.local_client import LocalDatabricksClient

# Initialize from environment variables
client = LocalDatabricksClient.from_env()

# Or with explicit credentials
client = LocalDatabricksClient(
    workspace_url="https://your-workspace.cloud.databricks.com",
    api_token="your_token",
    recipient_name="your_recipient"
)
```

BDC Connect Mode (Brownfield) ✨
- Uses OIDC federation for authentication
- No Databricks secrets required
- Simpler setup
- Automatically detected if recipient is configured
Databricks Connect Mode
- Requires additional secrets (`api_url`, `tenant`, `token_audience`)
- Can be provided via environment variables
- For greenfield deployments
See our comprehensive guides:
- QUICKSTART.md - Get started in 5 minutes
- IMPLEMENTATION_SUCCESS.md - Technical deep dive
- HOW_TO_CREATE_SHARE.md - Complete workflow guide
Key points to highlight:
- The Problem: The SAP BDC SDK requires `dbutils`, limiting usage to Databricks notebooks
- The Investigation: We analyzed the SDK to understand what `dbutils` actually provided
- The Discovery: Only 2 uses: getting workspace credentials and accessing secrets
- The Solution: Created `LocalDatabricksClient`, which injects credentials directly
- The Result: Full local development support with < 200 lines of code
Technical highlights:
- Custom inheritance from `DatabricksClient`
- Overrides `__init__` to bypass the `dbutils` requirement
- Overrides `_get_secret()` to read from env vars
- Maintains all SDK functionality
- Zero changes to the SAP BDC SDK itself
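The override pattern can be sketched with a stand-in base class. The `DatabricksClient` below is a mock illustrating the pattern, not the SDK's real class, and the env-var naming in `_get_secret` is an assumption:

```python
import os

class DatabricksClient:
    """Stand-in for the SDK base class, shown only to illustrate the pattern."""
    def __init__(self):
        # The real base class resolves credentials via dbutils here.
        raise RuntimeError("dbutils is only available inside Databricks notebooks")

    def _get_secret(self, scope: str, key: str) -> str:
        raise NotImplementedError

class LocalDatabricksClient(DatabricksClient):
    def __init__(self, workspace_url: str, api_token: str):
        # Deliberately skip the parent __init__ and its dbutils requirement;
        # credentials are injected directly instead.
        self.workspace_url = workspace_url
        self.api_token = api_token

    def _get_secret(self, scope: str, key: str) -> str:
        # Read from environment variables instead of Databricks secret scopes
        # (hypothetical SCOPE_KEY naming convention).
        return os.environ[f"{scope}_{key}".upper().replace("-", "_")]
```

Because only `__init__` and `_get_secret()` are overridden, every other inherited method keeps working unchanged, which is why the real wrapper needed no modifications to the SDK itself.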
```
┌─────────────────────────┐
│     Claude Desktop      │
│      (MCP Client)       │
└───────────┬─────────────┘
            │ MCP Protocol (stdio)
┌───────────▼─────────────┐
│   sap_bdc_mcp.server    │
│  ┌───────────────────┐  │
│  │ BDCClientManager  │  │
│  │   (Auto-detect)   │  │
│  └────────┬──────────┘  │
│           ├─ Notebook? → DatabricksClient (dbutils)
│           └─ Local?    → LocalDatabricksClient (.env)
└───────────┬─────────────┘
            │
┌───────────▼─────────────┐
│  SAP BDC Connect SDK    │
│  ┌──────────────────┐   │
│  │ BdcConnectClient │   │
│  └────────┬─────────┘   │
└───────────┼─────────────┘
            │ HTTPS/OIDC
┌───────────▼─────────────┐
│  Databricks + SAP BDC   │
└─────────────────────────┘
```
The server supports two integration modes:
1. Notebook Mode (Original)
   - Runs inside Databricks notebooks
   - Uses `dbutils` for credentials
   - Requires an active notebook session
2. Local Mode (New!) ✨
   - Runs on your local machine
   - Uses environment variables for credentials
   - No notebook required
Authentication is handled through:
- Databricks workspace credentials (URL + token)
- Recipient configuration in Databricks
- SAP BDC service credentials (auto-configured in BDC Connect mode)
For Local Development:
- Ensure the `.env` file exists with `DATABRICKS_HOST`, `DATABRICKS_TOKEN`, and `DATABRICKS_RECIPIENT_NAME`
- Check that your Databricks token is valid
- Verify the workspace URL is correct

For Notebook Environment:
- Ensure you're running in a Databricks notebook with `dbutils` available
- Set the `DATABRICKS_RECIPIENT_NAME` environment variable
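The local-development checks above can be automated with a small helper. This is a sketch for troubleshooting, not part of the package itself:

```python
import os

REQUIRED = ("DATABRICKS_HOST", "DATABRICKS_TOKEN", "DATABRICKS_RECIPIENT_NAME")

def check_local_env(env=os.environ):
    """Return a list of configuration problems; an empty list means ready."""
    problems = [f"{name} is not set" for name in REQUIRED if not env.get(name)]
    host = env.get("DATABRICKS_HOST", "")
    if host and not host.startswith("https://"):
        problems.append("DATABRICKS_HOST should be the full https:// workspace URL")
    return problems
```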
For local development, ensure these are set in your `.env` file:

```bash
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=dapi...
DATABRICKS_RECIPIENT_NAME=your_recipient_name
```

The share must exist in Databricks before registering with SAP BDC:
- Create a Delta share in Databricks first
- Grant the share to your recipient
- Then register it with SAP BDC using this server
See HOW_TO_CREATE_SHARE.md for detailed steps.
Grant the share to your recipient in Databricks:
```sql
GRANT SELECT ON SHARE your_share_name TO RECIPIENT `your_recipient_name`;
```

- SAP BDC Connect SDK on PyPI
- Model Context Protocol Documentation
- Delta Sharing Protocol
- SAP Business Data Cloud
This MCP server is provided as-is. Please review the SAP BDC Connect SDK license terms when using this integration.
Contributions are welcome! Please see CONTRIBUTING.md for details on:
- Setting up your development environment
- Running tests
- Submitting pull requests
- Code style guidelines
- Initial validation with Databricks environment
- Local development support (LocalDatabricksClient)
- PyPI package publication
- Comprehensive documentation
- npm package for Node.js environments
- Additional SAP BDC SDK features
- Enhanced error handling and logging
- More integration examples and tutorials
- Video tutorials and demos
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: Wiki
- SAP for the BDC Connect SDK
- Anthropic for the Model Context Protocol
- The MCP community for inspiration and support