This guide will help you set up a Model Context Protocol (MCP) server that allows Claude Desktop to interact with Microsoft Fabric Data Warehouse and SQL databases using natural language.
✨ New in this version:
- Fabric Data Warehouse Support: Optimized schema discovery for Fabric SQL endpoints
- Enhanced Compatibility: Works with various Fabric configurations and constraint limitations
- Robust Fallbacks: Graceful handling when advanced metadata isn't available
- Improved Error Handling: Better diagnostics for connection and schema issues
This section summarizes the recent changes made to improve compatibility with Microsoft Fabric Data Warehouse endpoints.
The original schema discovery logic was designed for traditional Azure SQL Database and failed when working with Fabric Data Warehouse due to:
- Different system view capabilities
- Limited foreign key constraint metadata
- Varying levels of INFORMATION_SCHEMA support across Fabric configurations
Schema discovery (db.py):
- Before: Single complex query with LEFT JOINs for constraints
- After: Separated queries with fallback mechanisms
Improvements (see the code sketch after this list):
- Fabric-compatible basic schema query
- Separate primary key discovery with error handling
- Fallback to basic schema when constraints aren't available
- Enhanced column metadata collection (precision, scale, defaults)
- Graceful degradation with informative messages
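As an illustration of this separation, here is a rough sketch of the fallback control flow, assuming a pyodbc connection; the query text, dictionary layout, and function name are illustrative, not the exact contents of db.py:

```python
import pyodbc

def discover_schema(conn: pyodbc.Connection) -> dict:
    """Collect table/column metadata, degrading gracefully on Fabric endpoints."""
    cursor = conn.cursor()

    # Stage 1: basic tables and columns, expected to work on any Fabric SQL endpoint.
    cursor.execute("""
        SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE,
               IS_NULLABLE, NUMERIC_PRECISION, NUMERIC_SCALE, COLUMN_DEFAULT
        FROM INFORMATION_SCHEMA.COLUMNS
        ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION
    """)
    schema = {"columns": cursor.fetchall(), "primary_keys": [], "notes": []}

    # Stage 2: primary keys, which some Fabric configurations do not expose.
    try:
        cursor.execute("""
            SELECT kcu.TABLE_SCHEMA, kcu.TABLE_NAME, kcu.COLUMN_NAME
            FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc
            JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu
              ON tc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME
            WHERE tc.CONSTRAINT_TYPE = 'PRIMARY KEY'
        """)
        schema["primary_keys"] = cursor.fetchall()
    except pyodbc.Error:
        # Fall back to the basic schema only and tell the user why.
        schema["notes"].append(
            "Primary key metadata not available on this Fabric endpoint; "
            "continuing with basic schema only."
        )
    return schema
```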
MCP server schema handling (mcp_server.py):
- Before: Single complex query trying to get all metadata at once
- After: Multi-stage discovery with robust error handling (a minimal tool sketch follows the lists below)
Improvements:
- Separated basic table/column discovery from constraint discovery
- Independent primary key and foreign key queries
- Fabric-specific error messages and user guidance
- Enhanced response formatting with Fabric branding
- Better handling of missing relationship metadata
Error handling:
- Multiple try/catch blocks for different query types
- Specific error messages for Fabric limitations
- Fallback strategies when advanced features aren't available
- User-friendly explanations of what's normal vs. problematic
User experience:
- Fabric Data Warehouse branding in responses
- Clear messaging about what features are/aren't available
- Better progress indicators during discovery
- Compatibility notes in schema responses
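For orientation, here is a minimal sketch of how such a schema tool could be exposed over MCP using the FastMCP helper from the `mcp` package; the tool name, the `db` helpers it imports, and the response format are assumptions rather than the exact structure of mcp_server.py:

```python
from mcp.server.fastmcp import FastMCP
# discover_schema / get_connection are hypothetical helpers in the db layer,
# matching the fallback sketch shown earlier.
from db import discover_schema, get_connection

mcp = FastMCP("fabric-sql-assistant")

@mcp.tool()
def get_database_schema() -> str:
    """Return table, column, and (where available) key metadata."""
    conn = get_connection()
    schema = discover_schema(conn)
    lines = ["Fabric Data Warehouse schema discovery results:"]
    for note in schema["notes"]:
        lines.append(f"Note: {note}")  # explain which metadata is missing and why
    lines.append(f"Columns discovered: {len(schema['columns'])}")
    return "\n".join(lines)

if __name__ == "__main__":
    mcp.run()
```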
Query changes (see the SQL sketch after this list):
- Basic Schema Query: Now uses explicit schema joins and filtering
- Primary Key Query: Separated with proper error handling
- Foreign Key Query: Uses REFERENTIAL_CONSTRAINTS with fallbacks
- Fallback Query: Minimal query when advanced features fail
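The basic schema and primary key queries are sketched above; the foreign key and fallback stages look roughly like this (illustrative SQL, not the exact statements used):

```python
# Foreign key query: relies on REFERENTIAL_CONSTRAINTS, which some Fabric
# configurations do not populate -- hence the fallback below.
FOREIGN_KEY_QUERY = """
SELECT rc.CONSTRAINT_NAME, kcu.TABLE_NAME, kcu.COLUMN_NAME,
       rc.UNIQUE_CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS rc
JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu
  ON rc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME
"""

# Fallback query: minimal column listing used when the constraint views fail.
FALLBACK_QUERY = """
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS
ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION
"""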
Discovery strategy:
- Try advanced queries first
- Fall back to basic queries on failure
- Provide informative error messages
- Continue operation even when some metadata isn't available
| Feature | Traditional Azure SQL | Fabric Data Warehouse |
|---|---|---|
| Basic Tables/Columns | ✅ | ✅ |
| Primary Keys | ✅ | ✅ (with fallback) |
| Foreign Keys | ✅ | Limited |
| Column Metadata | ✅ | ✅ |
| Constraint Details | ✅ | Limited |
Testing and validation:
- All SQL queries validated for syntax
- Error handling paths verified
- Fallback mechanisms tested
- User messaging reviewed for clarity
Users can expect:
- Schema discovery to work with any Fabric SQL endpoint
- Graceful handling of missing constraint information
- Clear feedback about what metadata is/isn't available
- Continued functionality even with limited metadata support
Files modified:
- `db.py` - Core schema discovery logic
- `mcp_server.py` - MCP server schema handling
- `README.md` - Updated documentation
- `.gitignore` - Added temporary files
- `requirements.txt` - Added for dependency management
Prerequisites:
- Python 3.8+ installed on your system
- Claude Desktop application
- Microsoft Fabric workspace with SQL endpoint
- Azure Active Directory app registration
- OpenAI API account
You will also need the following credentials and connection details:
- Azure Client ID and Tenant ID
- OpenAI API key
- Fabric SQL server address and database name
Installation steps:
- Clone or download the repository
- Ensure you have these Python files in your project directory:
  - `mcp_server.py`
  - `db.py`
  - `auth.py`
  - `llm.py`
  - `llm_dynamic.py`
  - `prompt.py`
Set up a virtual environment and install the dependencies:

```bash
# Navigate to your project directory
cd /path/to/your/fabric-sql-mcp

# Create virtual environment
python -m venv .venv

# Activate virtual environment
# On Windows:
.venv\Scripts\activate
# On macOS/Linux:
source .venv/bin/activate

# Install required packages
pip install mcp pyodbc msal python-dotenv openai
```

Create a `.env` file in your project directory:
```env
# Azure AD - Get these from Azure Portal > App Registrations
AZURE_CLIENT_ID=your_azure_client_id_here
AZURE_TENANT_ID=your_azure_tenant_id_here

# OpenAI - Get from https://platform.openai.com/api-keys
OPENAI_API_KEY=your_openai_api_key_here

# Fabric SQL - Get from your Fabric workspace
FABRIC_SQL_SERVER=your-server.datawarehouse.fabric.microsoft.com
FABRIC_DATABASE=your_database_name_here
```

Next, register an Azure AD application:
- Go to Azure Portal > Azure Active Directory > App Registrations
- Click New Registration
- Name: "Fabric SQL MCP"
- Account types: "Accounts in this organizational directory only"
- Redirect URI: Leave blank for now
- Click Register
- In your new app, go to Authentication
- Click Add a platform > Mobile and desktop applications
- Check https://login.microsoftonline.com/common/oauth2/nativeclient
- Click Configure
- Go to API Permissions
- Click Add a permission
- Select Azure Service Management
- Check user_impersonation
- Click Grant admin consent
From the app's Overview page, copy these values:
- Application (client) ID - copy this as your `AZURE_CLIENT_ID`
- Directory (tenant) ID - copy this as your `AZURE_TENANT_ID`
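To sanity-check these values outside the MCP server, an interactive sign-in with MSAL looks roughly like this; the SQL resource scope shown is the conventional one for SQL endpoints and is an assumption about what auth.py requests:

```python
import os
import msal

app = msal.PublicClientApplication(
    client_id=os.environ["AZURE_CLIENT_ID"],
    authority=f"https://login.microsoftonline.com/{os.environ['AZURE_TENANT_ID']}",
)

# Opens a browser window for sign-in; the resulting token is used for SQL access.
result = app.acquire_token_interactive(scopes=["https://database.windows.net/.default"])
if "access_token" in result:
    print("Token acquired successfully")
else:
    raise RuntimeError(result.get("error_description", "Token acquisition failed"))
```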
- Go to your Microsoft Fabric workspace
- Navigate to your SQL Database or Data Warehouse
- Click on Settings or Connection strings
- Copy the SQL connection endpoint (it looks like `your-server.datawarehouse.fabric.microsoft.com`)
- Note your database name
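These connection details are combined with the Azure AD token when the pyodbc connection is opened. Below is a minimal sketch of the standard token-passing pattern, assuming ODBC Driver 18 for SQL Server is installed; the helper name and connection-string details are illustrative, not taken from db.py:

```python
import os
import struct
import pyodbc

def connect_with_token(access_token: str) -> pyodbc.Connection:
    # Pack the token the way the ODBC driver expects (UTF-16-LE, length-prefixed).
    token_bytes = access_token.encode("utf-16-le")
    token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)
    SQL_COPT_SS_ACCESS_TOKEN = 1256  # pre-connect attribute for passing an AAD token

    conn_str = (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={os.environ['FABRIC_SQL_SERVER']};"
        f"DATABASE={os.environ['FABRIC_DATABASE']};"
        "Encrypt=yes;TrustServerCertificate=no;"
    )
    return pyodbc.connect(conn_str, attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct})
```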
Edit your Claude Desktop configuration file:
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

Add the following entry:
```json
{
"mcpServers": {
"fabric-sql-assistant": {
"command": "C:/path/to/your/project/.venv/Scripts/python.exe",
"args": [
"C:/path/to/your/project/mcp_server.py"
],
"env": {
"FABRIC_SQL_SERVER": "your-server.datawarehouse.fabric.microsoft.com",
"FABRIC_DATABASE": "your_database_name",
"AZURE_CLIENT_ID": "your_azure_client_id",
"AZURE_TENANT_ID": "your_azure_tenant_id",
"OPENAI_API_KEY": "your_openai_api_key"
}
}
}
}
```

Important Notes:
- Use full absolute paths for both `command` and `args`
- On Windows, use forward slashes `/` or escape backslashes `\\`
- On macOS/Linux, the command would be `/path/to/your/project/.venv/bin/python`
Test the MCP server manually:

```bash
# Activate your virtual environment
.venv\Scripts\activate        # Windows
# or
source .venv/bin/activate     # macOS/Linux

# Run the server
python mcp_server.py
```

You should see the server start without errors.
- Restart Claude Desktop completely
- Open a new conversation
- Look for the tools icon in the interface
- Try asking: "What tables are available in my database?"
Once set up, you can use these commands:
- "Configure my database connection: Server: your-server.datawarehouse.fabric.microsoft.com, Database: your_database_name"
- "Discover my database schema"
- "What are the total sales by product category?"
- "Which customers have made the most purchases?"
- "Show me monthly revenue trends for this year"
- "Execute this SQL query: SELECT TOP 10 * FROM Sales"
- "Show me details about the Sales table"
If something doesn't work, check these common issues:

Tools not appearing in Claude Desktop:
- Solution: Restart Claude Desktop completely after config changes
- Check that the file paths in the config are absolute and correct
- Verify the Python virtual environment path

Authentication errors:
- Solution: Ensure the Azure app has the correct permissions
- Check that you've granted admin consent for the API permissions
- Verify the Client ID and Tenant ID are correct

Database connection errors:
- Solution: Test the connection to Fabric SQL from another tool first
- Ensure your account has access to the Fabric workspace
- Check that the SQL endpoint is active

Import or dependency errors:
- Solution: Ensure all required packages are installed in the virtual environment
- Check that all Python files are in the same directory

OpenAI API errors:
- Solution: Verify that your OpenAI API key is valid and has credits
- Check that the key has access to the GPT-4 models
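To confirm the key itself works, here is a quick check with the `openai` Python SDK (v1 or later), assuming `OPENAI_API_KEY` is set in your environment:

```python
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment
models = client.models.list()
print("Key is valid; first models:", [m.id for m in models.data[:3]])
```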
To verify the environment and database connection, run:

```bash
.venv\Scripts\activate
python -c "import mcp, pyodbc, msal, openai; print('All imports successful')"
python -c "from db import test_connection; print(test_connection())"
```

Claude Desktop logs can be found at:
- Windows: `%APPDATA%\Claude\logs\`
- macOS: `~/Library/Logs/Claude/`
Security best practices:
- Never commit `.env` or `claude_desktop_config.json` to version control
- Use the principle of least privilege for Azure app permissions
- Rotate credentials regularly
- Monitor usage of your OpenAI API key
- Use environment variables rather than hardcoding secrets
If you encounter issues:
- Check Claude Desktop logs for error messages
- Test each component separately (auth, database, MCP server)
- Verify all credentials are correct and active
- Ensure all dependencies are installed in the virtual environment
Additional resources:
- Microsoft Fabric Documentation
- Azure Active Directory App Registration
- OpenAI API Documentation
- Model Context Protocol Specification
Happy querying!