How to Use MCP with Ollama (without Claude, with Dolphin MCP)

Discover how to use MCP with Ollama, OpenAI, and Deepseek using Dolphin MCP. Follow this guide to set up, integrate, and test your AI-driven MCP server.

Ashley Goolam

14 March 2025

Connecting language models with external data sources is critical for building robust, intelligent applications. Model Context Protocol (MCP) is a standardized framework that streamlines the exchange of context and data between AI models and external systems. Whether you’re building chatbots, search engines, or data analysis tools, MCP helps bridge the gap between different models and APIs, ensuring a seamless flow of information.

Imagine a system where you can easily switch between using Ollama for lightweight, local model inference, OpenAI for cutting-edge natural language understanding, and Deepseek for powerful search capabilities. Now, add Dolphin MCP—an open-source Python library and CLI tool that simplifies this integration. Dolphin MCP not only connects to multiple MCP servers simultaneously but also makes their tools available to language models through natural language queries.

In this tutorial, we’ll guide you through everything from installing Dolphin MCP to integrating it with models like Ollama and OpenAI.

💡
Before we dive in, here’s a quick tip: Download Apidog for free today! It’s a great tool for developers who want to simplify testing AI models, especially those using LLMs (Large Language Models). Apidog helps you streamline the API testing process, making it easier to work with cutting-edge AI technologies. Give it a try!

What is MCP? (Starting from the basics)

Model Context Protocol (MCP) is a framework designed to standardize the interaction between AI models and external applications. It allows different models to share context, exchange data, and call tools in a unified, conversational manner. With MCP, a model can discover the tools a server exposes, invoke them with structured arguments, and receive the results back as part of the conversation.

By using MCP, developers can focus on building innovative solutions without worrying about the underlying complexities of cross-model communication. For a deeper dive into MCP and what it is all about, see our dedicated MCP tutorial.
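Under the hood, MCP messages are JSON-RPC 2.0. As a rough sketch (the tool name and arguments here are illustrative, not taken from Dolphin MCP), a client asking a server to invoke a tool sends something like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_database",
    "arguments": { "sql": "SELECT * FROM dolphins" }
  }
}
```

The server runs the tool and returns a result message, which the client then feeds back to the model as additional context.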

Why Use Dolphin MCP?

Dolphin MCP is an open-source Python library and CLI tool that makes it incredibly simple to interact with multiple MCP servers (you can have as many as you like). Its design emphasizes modularity and ease of use, providing a clean API for integrating with various language models like OpenAI, Anthropic, and Ollama, as well as external data sources like Deepseek. You can simply switch between models according to the needs of the task you are working on!

Key Features:

- Connects to multiple MCP servers at once and exposes their tools to language models through natural language queries.
- Works with several providers, including OpenAI, Anthropic, and Ollama, so you can switch models per task.
- Ships as both a Python library and a command-line tool (dolphin-mcp-cli).

Dolphin MCP simplifies the process of building a conversational interface for data manipulation and interaction with AI models, making it a powerful asset for any developer.

Prerequisites and Environment Setup

Before we dive into the installation and integration steps, let’s ensure that your environment is properly set up to work with Dolphin MCP.

System Requirements:

- A recent Python 3 release
- SQLite 3 (used by the demo database)
- The uv package manager (used to run some MCP servers)

Platform-Specific Setup:

Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
python --version
sqlite3 --version
uv --version

macOS:

brew install python
brew install sqlite
brew install uv

or

curl -LsSf https://astral.sh/uv/install.sh | sh
python3 --version
sqlite3 --version
uv --version

Linux (Ubuntu/Debian):

sudo apt update
sudo apt install python3 python3-pip
sudo apt install sqlite3
curl -LsSf https://astral.sh/uv/install.sh | sh
python3 --version
sqlite3 --version
uv --version

Once everything has been downloaded and your system is ready, you’re all set to install Dolphin MCP.

Installation of Dolphin MCP

There are two ways to install Dolphin MCP on your system: as a package from PyPI, or directly from source.

Option 1: Install from PyPI

The simplest method is to install Dolphin MCP through pip:

pip install dolphin-mcp

This command installs both the library and the command-line tool dolphin-mcp-cli, which allows you to use the tool directly from your terminal.

Option 2: Install from Source

If you prefer to work with the source code directly or you intend on contributing to the project, then you should follow the steps below:

Clone the Repository:

git clone https://github.com/cognitivecomputations/dolphin-mcp.git
cd dolphin-mcp

Install in Development Mode:

pip install -e .

Set Up the Environment Variables:

Copy the example environment file (the .env.example file in the project) and update it with your API key. Optionally, you can specify the base URL for your model:

cp .env.example .env

Feel free to edit the .env file as you like to include your OpenAI API key (and any other keys you need).

(Optional) Set Up the Demo Database:

If you want to test the system with some sample data to verify that Dolphin MCP has successfully connected your models to your MCP servers, run:

python setup_db.py

This command creates a sample SQLite database with mock data about dolphin species for demo purposes. Pay attention to the output path where the newly created database is saved. Be sure to check it out if you like!
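If you want to peek inside the demo database yourself, the sqlite3 module from Python's standard library is enough. The sketch below builds a tiny throwaway database that mimics the kind of data setup_db.py creates (the dolphins table and its columns are illustrative assumptions, not the actual schema) and then shows the two queries you would point at the real file:

```python
import sqlite3

# Build a tiny stand-in database. To inspect the real demo database,
# replace ":memory:" with the path printed by setup_db.py.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dolphins (species TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO dolphins VALUES (?, ?)",
    [("Maui Dolphin", "Critically Endangered"),
     ("Bottlenose Dolphin", "Least Concern")],
)

# Listing tables via sqlite_master works on any SQLite file.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)

# An ordinary SQL query against the mock data.
endangered = conn.execute(
    "SELECT species FROM dolphins WHERE status = 'Critically Endangered'"
).fetchall()
print(endangered)
```

Swapping ":memory:" for the real path lets you confirm what data your MCP server will be serving before you wire up any models.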

Configuration and Environment Variables

Dolphin MCP uses two main configuration files to manage your settings: the .env file and the mcp_config.json file.

.env File

The .env file stores sensitive API credentials. For example:

OPENAI_API_KEY=your_openai_api_key_here
OPENAI_MODEL=gpt-4o
# OPENAI_ENDPOINT=https://api.openai.com/v1  # Uncomment and update if needed

mcp_config.json

This JSON file defines the MCP servers that your client will connect to. An example configuration might look like this:

{
  "mcpServers": {
    "server1": {
      "command": "command-to-start-server",
      "args": ["arg1", "arg2"],
      "env": {
        "ENV_VAR1": "value1",
        "ENV_VAR2": "value2"
      }
    },
    "server2": {
      "command": "another-server-command",
      "args": ["--option", "value"]
    }
  }
}
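As a more concrete (and hedged) illustration: the official MCP reference servers include a SQLite server that can be launched with uvx. Assuming you want Dolphin MCP to serve the demo database, an entry might look like the one below. The package name mcp-server-sqlite, its --db-path flag, and the database path are assumptions drawn from the reference-server collection, so double-check them against that project's README:

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "path/to/dolphin.db"]
    }
  }
}
```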

By configuring these files, you allow Dolphin MCP to securely store and use your API keys and connect to multiple MCP servers simultaneously.
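To make the division of labor between the two files concrete, here is a minimal sketch of how a client could combine them: KEY=VALUE pairs from .env land in the process environment, while mcp_config.json supplies the server definitions. Dolphin MCP handles this internally; the hand-rolled parser below is purely illustrative, not its actual loader:

```python
import json
import os

# Stand-in for the contents of .env (secrets belong here, not in JSON).
env_text = "OPENAI_API_KEY=your_openai_api_key_here\nOPENAI_MODEL=gpt-4o\n"
for line in env_text.splitlines():
    if line and not line.startswith("#"):
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

# Stand-in for mcp_config.json (server commands and arguments live here).
config_text = json.dumps({
    "mcpServers": {
        "server1": {"command": "command-to-start-server",
                    "args": ["arg1", "arg2"]}
    }
})
config = json.loads(config_text)
servers = list(config["mcpServers"])
print(os.environ["OPENAI_MODEL"], servers)
```

Keeping credentials in .env and server wiring in JSON means you can commit mcp_config.json to version control while the .env file stays private.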

Testing and Using Dolphin MCP

Dolphin MCP offers flexible ways to test and interact with your MCP server, whether you prefer CLI commands, Python integration, or a legacy script.

Using the CLI Command

The simplest way to interact with your MCP server is through the CLI command. Once your environment is set up and your MCP server is running, you can send a query directly from your terminal. For example:

dolphin-mcp-cli "What dolphin species are endangered?"  

Key Options:

- --model: select which model handles the query, as in the example below.
- --quiet: suppress intermediate output (the CLI counterpart of the library's quiet_mode flag).

Example:

dolphin-mcp-cli --model gpt-4o "List dolphins in the Atlantic Ocean"  

This routes your query to connected MCP servers (Ollama, OpenAI, etc.) and returns structured results.

Via Python Library

If you prefer to integrate Dolphin MCP directly into your Python code, the library provides a convenient function called run_interaction. This allows you to embed MCP interactions as part of a larger application. Here’s an example script that demonstrates how to use the library programmatically:

import asyncio  
from dolphin_mcp import run_interaction  

async def main():  
    result = await run_interaction(  
        user_query="What dolphin species are endangered?",  
        model_name="gpt-4o",  
        quiet_mode=False  
    )  
    print(result)  

asyncio.run(main())  

This handles server connections, tool discovery, and model calls automatically.

Legacy Script

For quick tests (for those who prefer a more straightforward approach), run the original script directly from the command line. This method provides the same functionality as the CLI but in a simpler form:

python dolphin_mcp.py "Analyze dolphin migration patterns"  

It connects to servers, lists tools, and returns conversational results without extra options.

Example Queries & Demo Database

Demo Database:
Run setup_db.py to create a sample SQLite database with dolphin species data, then use it to test queries like:

dolphin-mcp-cli "Which dolphins are critically endangered?"  

Output:

{  
  "species": "Maui Dolphin",  
  "status": "Critically Endangered"  
}  

With these tools, Dolphin MCP adapts to your workflow—whether you’re debugging, scripting, or building complex AI systems. Feel free to also visit their GitHub repo.

Conclusion


Dolphin MCP revolutionizes AI integration by seamlessly connecting tools like Ollama and OpenAI into a unified workflow. With its CLI for natural language queries, Python library for programmatic control, and demo database for testing, it empowers developers to build sophisticated AI agents without boilerplate code. Whether analyzing conservation data, generating reports, or experimenting with local LLMs, Dolphin MCP simplifies complex tasks while maintaining flexibility. Its multi-model support and intuitive configuration make it ideal for both quick prototypes and production systems.

Ready to streamline your AI projects? Download Apidog to test your MCP server’s APIs and start building smarter workflows today!

