Are you ready to supercharge your AI workflows with structured data? Let’s dive into the dbt MCP server, a game-changer for connecting your dbt projects to AI systems. In this tutorial, I’ll walk you through what the dbt MCP server is, why it’s awesome, and how to set it up using the updated installation steps. Buckle up for a fun, conversational ride through the world of data and AI!
What’s dbt All About?
If you’re new to dbt (data build tool), it’s like the Swiss Army knife for data teams. It’s an open-source framework that lets you transform raw data in your data warehouse into clean, reliable datasets for analytics. With dbt, you can:
- Write modular SQL models to shape your data.
- Document your data assets and their relationships.
- Test data quality to keep things trustworthy.
- Track data lineage to see how everything flows.
Think of dbt as the backbone of modern data engineering, making your datasets governed and ready for action.
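If you’ve never touched dbt before, here’s the flavor of that workflow from the command line (these are standard dbt CLI commands; the models and tests they act on come from your own project):
dbt run             # build your SQL models in the warehouse
dbt test            # run data quality tests against them
dbt docs generate   # produce documentation and lineage metadata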

Meet the dbt MCP Server
Now, let’s talk about the star of the show: the dbt MCP server. This experimental, open-source server is like a bridge that connects your dbt project to AI systems. MCP stands for Model Context Protocol, a fancy way of saying it’s a standard for AI tools (like Claude Desktop or Cursor) to tap into your dbt project’s metadata, documentation, and semantic layer.
With the dbt MCP server, AI agents and business users can explore your data, run queries, and even execute dbt commands—all through natural language or code. It’s like giving your AI a VIP pass to your data warehouse!

Why You’ll Love the dbt MCP Server
Here’s what makes the dbt MCP server so cool:
- Discover Your Data: AI and users can browse your dbt models, check their structure, and understand how they’re connected.
- Query with Confidence: Use the dbt Semantic Layer for consistent metrics or run custom SQL queries for flexibility.
- Automate Like a Pro: Run dbt commands (like run, test, or build) directly from AI workflows to keep your pipelines humming.
How the dbt MCP Server Powers AI Workflows
The dbt MCP server is all about bringing structured, governed data to AI. Here’s how it works its magic:
- Universal Data Access: It uses the Model Context Protocol to share your dbt project’s context—models, metrics, and lineage—with any MCP-enabled AI tool. No custom integrations needed!
- Smart Data Discovery: AI agents can list models, check dependencies, and grab metadata, making it easy to answer questions like “What’s our customer data like?”
- Governed Querying: By tapping into the dbt Semantic Layer, the server ensures AI-generated reports stick to your company’s official metrics, keeping things consistent and trustworthy.
- Automation Galore: AI can trigger dbt commands to run models, test data, or build projects, streamlining your data pipelines.
- Safe and Scalable: Run it locally or in a sandbox, with permissions to keep sensitive data locked down. It’s flexible for both testing and production.

Installing the dbt MCP Server: Step-by-Step
Ready to get the dbt MCP server up and running? Let’s follow the updated installation steps to get you set up smoothly. Don’t worry, I’ll keep it simple and fun!
Prerequisites
Before we start, make sure you have:
- Python 3.12+: The server needs a modern Python environment.
- uv: A fast Python package installer and resolver (installation guide).
- Task: A task runner/build tool (installation guide).
- A dbt project with a configured profiles.yml file pointing to your data warehouse.
- A dbt Cloud account for cloud-based functionality (optional for dbt CLI usage).
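Before diving in, it’s worth a quick sanity check that everything is actually on your PATH (exact versions and output will vary by machine):
python --version   # or python3 --version; should report 3.12 or newer
uv --version
task --version
dbt --version      # only needed if you plan to use the local dbt CLI tools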
Step 1: Clone the Repository
First, grab the dbt MCP server code from GitHub. Open your terminal and run:
git clone https://github.com/dbt-labs/dbt-mcp.git
cd dbt-mcp
This downloads the source code to your local machine and moves you into the project directory.
Step 2: Install Dependencies
With uv and Task installed, set up the required Python packages by running:
task install
This creates a virtual environment and installs all necessary dependencies for the dbt MCP server.
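By the way, if you’d rather not install Task, the install task is most likely just a thin wrapper around uv (that’s an assumption on my part; peek at the Taskfile in the repo to confirm). A rough manual equivalent would look like:
uv venv   # create a local virtual environment
uv sync   # install the project's locked dependencies into it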
Step 3: Configure Environment Variables
Set up your environment by copying the example configuration file:
cp .env.example .env
Open the .env file in your favorite text editor and fill in these key variables:
- DBT_HOST: Your dbt Cloud instance hostname (e.g., cloud.getdbt.com).
- DBT_TOKEN: Your dbt Cloud personal access token or service token.
- DBT_PROD_ENV_ID: Your dbt Cloud production environment ID.
- DBT_DEV_ENV_ID: (Optional) Your dbt Cloud development environment ID.
- DBT_USER_ID: (Optional) Your dbt Cloud user ID.
- DBT_PROJECT_DIR: Path to your local dbt project (for dbt CLI usage).
- DBT_PATH: Path to your dbt CLI executable (find it with which dbt).
You can also enable or disable specific tool groups (e.g., Semantic Layer, Discovery) via these variables. Adjust them based on your needs.
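For reference, a filled-in .env might look something like this (every value below is a placeholder, not a real hostname, token, or ID):
# dbt Cloud connection (replace with your own values)
DBT_HOST=cloud.getdbt.com
DBT_TOKEN=your-dbt-cloud-token
DBT_PROD_ENV_ID=123456
# Optional
DBT_DEV_ENV_ID=123457
DBT_USER_ID=98765
# Local dbt CLI usage
DBT_PROJECT_DIR=/path/to/your/dbt/project
# DBT_PATH is the output of `which dbt` on your machine
DBT_PATH=/usr/local/bin/dbt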
Step 4: Start the dbt MCP Server
Now, let’s fire it up! From the dbt-mcp directory, run:
task start
This launches the dbt MCP server, making it available for connections from MCP-compatible clients like Claude Desktop or Cursor.
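Since the client configuration in the next step launches the server through uvx anyway, you can also start it that way yourself as a quick smoke test (assuming your .env sits in the current directory):
uvx --env-file .env dbt-mcp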
Step 5: Connect an MCP-Enabled Client
To connect an MCP client, add this configuration to the client’s config file (replace <path-to-.env-file> with the path to your .env file):
{
  "mcpServers": {
    "dbt-mcp": {
      "command": "uvx",
      "args": ["--env-file", "<path-to-.env-file>", "dbt-mcp"]
    }
  }
}
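One small gotcha: MCP clients don’t necessarily launch servers from your project directory, so the path you substitute for <path-to-.env-file> is safest as an absolute path. From the dbt-mcp directory, you can print it with:
echo "$PWD/.env"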
- Claude Desktop: Create a claude_desktop_config.json file with the above config. Check logs at ~/Library/Logs/Claude (Mac) or %APPDATA%\Claude\logs (Windows) for debugging.
- Cursor: Follow Cursor’s MCP docs to add the config.
- VS Code:
1. Open Settings (Command + ,) and select the appropriate tab (Workspace or User).
2. For WSL users, use the Remote tab via the Command Palette (F1) or the Settings editor.
3. Enable “Mcp” under Features → Chat.
4. Click “Edit in settings.json” under “Mcp > Discovery” and add:
{
  "mcp": {
    "inputs": [],
    "servers": {
      "dbt": {
        "command": "uvx",
        "args": ["--env-file", "<path-to-.env-file>", "dbt-mcp"]
      }
    }
  }
}
You can manage servers via the Command Palette (F1) with the “MCP: List Servers” command.
Troubleshooting Tips
- uvx Not Found? If clients can’t find uvx, use the full path (find it with which uvx on Unix systems) in the JSON config.
- Connection Issues? Verify your .env variables, especially DBT_HOST and DBT_TOKEN.
- WSL Users: Configure WSL-specific settings in VS Code’s Remote tab, as local User settings may not work.
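A couple of quick terminal checks can rule out the most common culprits (paths will differ on your machine, and note that the second command prints your token to the screen):
which uvx                           # absolute path to use in the JSON config if needed
grep -E '^DBT_(HOST|TOKEN)=' .env   # confirm the required variables are actually set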
Available Tools
The dbt MCP server supports powerful tools, including:
- dbt CLI: Commands like build, compile, docs, run, test, and show for managing your dbt project.
- Semantic Layer: Commands like list_metrics, get_dimensions, and query_metrics for working with governed metrics.
- Discovery: Commands like get_all_models and get_model_details for exploring your dbt project.
- Remote: Commands like text_to_sql and execute_sql for generating and running SQL queries (requires a personal access token for DBT_TOKEN).
Note: Be careful here, as some commands (e.g., run, build) can modify your data models or warehouse objects. Double-check what an agent is about to execute before letting it run.
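If you want to play it safe while experimenting, the commands that don’t build or modify warehouse objects are a good starting point. For a feel of what they correspond to when run by hand (the model name here is a placeholder):
dbt compile --select my_model          # compile SQL without building anything in the warehouse
dbt show --select my_model --limit 5   # preview a few rows from a model
dbt docs generate                      # generate documentation artifacts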
Wrapping Up
And there you have it! The dbt MCP server is your ticket to bringing structured, governed data into AI workflows. By connecting your dbt project to AI agents, you’re unlocking a world of data discovery, querying, and automation—all while keeping things secure and scalable. Whether you’re a data engineer or an AI enthusiast, this server is a powerful tool to make your data shine.