Connecting language models with external data sources is critical for building robust, intelligent applications. Model Context Protocol (MCP) is a standardized framework that streamlines the exchange of context and data between AI models and external systems. Whether you’re building chatbots, search engines, or data analysis tools, MCP helps bridge the gap between different models and APIs, ensuring a seamless flow of information.
Imagine a system where you can easily switch between using Ollama for lightweight, local model inference, OpenAI for cutting-edge natural language understanding, and DeepSeek for powerful reasoning and search capabilities. Now, add Dolphin MCP—an open-source Python library and CLI tool that simplifies this integration. Dolphin MCP not only connects to multiple MCP servers simultaneously but also makes their tools available to language models through natural language queries.
In this tutorial, we’ll guide you through everything from installing Dolphin MCP to integrating it with models like Ollama and OpenAI.

What is MCP? (Starting from the basics)
Model Context Protocol (MCP) is a framework designed to standardize the interaction between AI models and external applications. It allows different models to share context, exchange data, and call tools in a unified, conversational manner. With MCP, you can:
- Maintain a seamless conversation history across different queries.
- Dynamically discover and invoke external tools or APIs.
- Integrate multiple AI providers under a single standardized protocol.
By using MCP, developers can focus on building innovative solutions without worrying about the underlying complexities of cross-model communication.
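Under the hood, MCP exchanges are ordinary JSON-RPC 2.0 messages: a client asks a server which tools it offers, then invokes one by name. The sketch below is illustrative only—the `tools/list` and `tools/call` method names follow the MCP specification, but the tool name `query_species` and its arguments are hypothetical:

```python
import json

# A client asks an MCP server which tools it exposes (JSON-RPC 2.0).
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# It can then invoke one of the advertised tools by name.
# "query_species" and its arguments are made-up examples.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "query_species",
        "arguments": {"status": "endangered"},
    },
}

print(json.dumps(call_request, indent=2))
```

Because every provider speaks this same wire format, a client like Dolphin MCP can route the same request to any connected server.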
Why Use Dolphin MCP?
Dolphin MCP is an open-source Python library and CLI tool that makes it incredibly simple to interact with multiple MCP servers (you can have as many as you like). Its design emphasizes modularity and ease of use, providing a clean API for integrating with various language model providers like OpenAI, Anthropic, Ollama, and DeepSeek. You can simply switch between models according to the needs of the task you are working on!
Key Features:
- Multiple Provider Support: Seamlessly works with Ollama, OpenAI, DeepSeek, and many more.
- Dual Interface: Use it as a Python library or through its command-line tool.
- Tool Discovery: Automatically detect and use tools provided by MCP servers.
- Modular Architecture: Enjoy a clean separation of concerns with provider-specific modules.
- Flexible Configuration: Easily configure models and MCP servers using JSON and environment variables.
- Reusability: Build scalable and reusable integrations that can be quickly adapted to new requirements.
Dolphin MCP simplifies the process of building a conversational interface for data manipulation and interaction with AI models, making it a powerful asset for any developer.
Prerequisites and Environment Setup
Before we dive into the installation and integration steps, let’s ensure that your environment is properly set up to work with Dolphin MCP.
System Requirements:
- Python 3.8 or higher: Make sure you have Python installed. You can download it from python.org.
- SQLite: Used by the demo database to store sample data (Optional).
- uv/uvx: A fast Python package installer and resolver.
- Node.js 18+ (if using CLI integrations): Required for some additional tools.
Platform-Specific Setup:
Windows:
- Python: Download from python.org and remember to check “Add Python to PATH.”
- SQLite: Download precompiled binaries from the SQLite website, extract them, and add the folder to your PATH.
- uv/uvx: Open Windows PowerShell and run the official installer:
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
- Verify Installations:
python --version
sqlite3 --version
uv --version
macOS:
- Python: Install using Homebrew:
brew install python
- SQLite: Pre-installed on macOS, or update using:
brew install sqlite
- uv/uvx: Install with Homebrew or the official installer:
brew install uv
or
curl -LsSf https://astral.sh/uv/install.sh | sh
- Verify Installations:
python3 --version
sqlite3 --version
uv --version
Linux (Ubuntu/Debian):
- Python:
sudo apt update
sudo apt install python3 python3-pip
- SQLite:
sudo apt install sqlite3
- uv/uvx:
curl -LsSf https://astral.sh/uv/install.sh | sh
- Verify Installations:
python3 --version
sqlite3 --version
uv --version
Once everything has been downloaded and your system is ready, you’re all set to install Dolphin MCP.
Installation of Dolphin MCP
There are two ways to install Dolphin MCP on your system: as a package from PyPI or directly from source.
Option 1: Install from PyPI (Recommended)
The simplest method is to install Dolphin MCP through pip:
pip install dolphin-mcp
This command installs both the library and the command-line tool dolphin-mcp-cli, which allows you to use the tool directly from your terminal.
Option 2: Install from Source
If you prefer to work with the source code directly or intend to contribute to the project, follow the steps below:
Clone the Repository:
git clone https://github.com/cognitivecomputations/dolphin-mcp.git
cd dolphin-mcp
Install in Development Mode:
pip install -e .
Set Up the Environment Variables:
Copy the example environment file (the .env.example file in the project) and update it with your API key. Optionally, you can specify the base URL for your model:
cp .env.example .env
Edit the .env file to include your OpenAI API key (and any other keys you need).
(Optional) Set Up the Demo Database:
If you want to test the system with sample data and verify that Dolphin MCP can connect your models to an MCP server, run:
python setup_db.py
This command creates a sample SQLite database with mock data about dolphin species for demo purposes. Note the output path where the newly created database file is saved—you can inspect it yourself if you like.
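If you are curious what this kind of database looks like, here is a minimal, self-contained sketch that builds and queries a tiny SQLite table of the same flavor. The table name and columns are assumptions for illustration—check the actual schema that setup_db.py creates:

```python
import sqlite3

# Build an in-memory database mimicking the demo data (schema is assumed,
# not the one setup_db.py actually generates).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dolphins (species TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO dolphins VALUES (?, ?)",
    [
        ("Maui Dolphin", "Critically Endangered"),
        ("Bottlenose Dolphin", "Least Concern"),
    ],
)

# The kind of query an MCP database tool might run on your behalf
# when you ask "Which dolphins are critically endangered?"
rows = conn.execute(
    "SELECT species FROM dolphins WHERE status = 'Critically Endangered'"
).fetchall()
print(rows)  # [('Maui Dolphin',)]
```

An MCP server wrapping such a database simply exposes queries like this as tools the model can call.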
Configuration and Environment Variables
Dolphin MCP uses two main configuration files to manage your settings: the .env file and the mcp_config.json file.
.env File
The .env file stores sensitive API credentials. For example:
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o
# OPENAI_ENDPOINT=https://api.openai.com/v1 # Uncomment and update if needed
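Tools that read a .env file typically parse simple KEY=value lines, skip comments, and let real environment variables take precedence. This is a rough sketch of that pattern, not Dolphin MCP's actual loader:

```python
import os

def parse_env(text):
    """Parse simple KEY=value lines, skipping blanks and # comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

sample = """
OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o
# OPENAI_ENDPOINT=https://api.openai.com/v1
"""

env = parse_env(sample)
# Real environment variables usually override the file's values.
model = os.environ.get("OPENAI_MODEL", env.get("OPENAI_MODEL"))
print(model)
```

Note that the commented-out OPENAI_ENDPOINT line is ignored until you uncomment it, which is why it has no effect by default.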
mcp_config.json
This JSON file defines the MCP servers that your client will connect to. An example configuration might look like this:
{
  "mcpServers": {
    "server1": {
      "command": "command-to-start-server",
      "args": ["arg1", "arg2"],
      "env": {
        "ENV_VAR1": "value1",
        "ENV_VAR2": "value2"
      }
    },
    "server2": {
      "command": "another-server-command",
      "args": ["--option", "value"]
    }
  }
}
By configuring these files, you allow Dolphin MCP to securely store and use your API keys and connect to multiple MCP servers simultaneously.
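A quick way to sanity-check your mcp_config.json before launching anything is to load it and confirm every server entry has a command. This sketch uses the placeholder names from the example above; the only field names it assumes are the ones shown there:

```python
import json

# The example configuration from above, inlined for a self-contained check.
config_text = """
{
  "mcpServers": {
    "server1": {"command": "command-to-start-server", "args": ["arg1", "arg2"]},
    "server2": {"command": "another-server-command", "args": ["--option", "value"]}
  }
}
"""

config = json.loads(config_text)
servers = config.get("mcpServers", {})
for name, spec in servers.items():
    # Every server needs at least a command to launch it.
    assert "command" in spec, f"server {name!r} is missing 'command'"
    print(name, "->", spec["command"], *spec.get("args", []))
```

Running a check like this catches malformed JSON or a missing command before Dolphin MCP tries to start the servers.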
Testing and Using Dolphin MCP
Dolphin MCP offers flexible ways to test and interact with your MCP server, whether you prefer CLI commands, Python integration, or a legacy script.
Using the CLI Command
The simplest way to interact with your MCP server is through the CLI command. Once your environment is set up and your MCP server is running, you can send a query directly from your terminal. For example:
dolphin-mcp-cli "What dolphin species are endangered?"
Key Options:
- --model <name>: Specify a model (e.g., gpt-4o).
- --quiet: Hide intermediate output.
- --config <file>: Use a custom config file.
Example:
dolphin-mcp-cli --model gpt-4o "List dolphins in the Atlantic Ocean"
This routes your query to connected MCP servers (Ollama, OpenAI, etc.) and returns structured results.
Via Python Library
If you prefer to integrate Dolphin MCP directly into your Python code, the library provides a convenient function called run_interaction. This allows you to embed MCP interactions as part of a larger application. Here’s an example script that demonstrates how to use the library programmatically:
import asyncio
from dolphin_mcp import run_interaction

async def main():
    result = await run_interaction(
        user_query="What dolphin species are endangered?",
        model_name="gpt-4o",
        quiet_mode=False
    )
    print(result)

asyncio.run(main())
This handles server connections, tool discovery, and model calls automatically.
Legacy Script
For quick tests (for those who prefer a more straightforward approach), run the original script directly from the command line. This method provides the same functionality as the CLI but in a simpler form:
python dolphin_mcp.py "Analyze dolphin migration patterns"
It connects to servers, lists tools, and returns conversational results without extra options.
Example Queries & Demo Database
Try these queries:
- General:
dolphin-mcp-cli "Explain dolphin evolution"
- Model-Specific:
dolphin-mcp-cli --model ollama "Define quantum physics"
- Quiet Mode:
dolphin-mcp-cli --quiet "List endangered species"
Demo Database:
Run setup_db.py to create a sample SQLite database with dolphin species data. Use it to test queries like:
dolphin-mcp-cli "Which dolphins are critically endangered?"
Output:
{
  "species": "Maui Dolphin",
  "status": "Critically Endangered"
}
With these tools, Dolphin MCP adapts to your workflow—whether you’re debugging, scripting, or building complex AI systems. Feel free to also visit their GitHub repo.
Conclusion
Dolphin MCP revolutionizes AI integration by seamlessly connecting tools like Ollama and OpenAI into a unified workflow. With its CLI for natural language queries, Python library for programmatic control, and demo database for testing, it empowers developers to build sophisticated AI agents without boilerplate code. Whether analyzing conservation data, generating reports, or experimenting with local LLMs, Dolphin MCP simplifies complex tasks while maintaining flexibility. Its multi-model support and intuitive configuration make it ideal for both quick prototypes and production systems.
Ready to streamline your AI projects? Download Apidog to test your MCP server’s APIs and start building smarter workflows today!