How to use MCP Servers with OpenRouter

Integrate MCP Servers with OpenRouter to enhance AI functionality via a single API. This guide details setup, tool conversion, API calls, and real-world use cases. Optimize your AI workflows with MCP Servers and OpenRouter. Download Apidog free for API testing.

Ashley Innocent

18 June 2025

AI development is evolving fast, and integrating external tools with language models is a critical step forward. OpenRouter provides a unified API to access numerous language models, while MCP Servers (Model Context Protocol Servers) enable these models to execute external tools and access live data. Combining them creates a powerful system for building advanced AI applications.

In this post, I’ll guide you through integrating MCP Servers with OpenRouter. You’ll learn their core functionalities, the integration process, and practical examples.

💡
Before starting, download Apidog for free; it's an excellent tool for testing APIs and debugging your setup.

Understanding MCP Servers and OpenRouter

To integrate MCP Servers with OpenRouter, you first need to grasp what each component does.

OpenRouter: Unified Access to Language Models

OpenRouter is a platform that simplifies interaction with large language models (LLMs) from providers like OpenAI, Anthropic, and xAI. It exposes a single API endpoint, https://openrouter.ai/api/v1/chat/completions, compatible with OpenAI's API structure, so you can switch between models by changing only the model name in your payload.

You’ll need an OpenRouter account and an API key to proceed. Get yours at openrouter.ai.

MCP Servers: Extending Model Capabilities

MCP Servers implement the Model Context Protocol, enabling LLMs to call external tools. Unlike standalone models limited to their training data, MCP Servers allow real-time interaction with systems like file directories, databases, or third-party APIs. A typical MCP tool definition includes a name, a description, and a JSON Schema describing its parameters.

For example, an MCP tool to list directory files might look like this:

{
  "name": "list_files",
  "description": "Lists files in a specified directory",
  "parameters": {
    "type": "object",
    "properties": {
      "path": {"type": "string", "description": "Directory path"}
    },
    "required": ["path"]
  }
}

Together, OpenRouter provides the model, and MCP Servers supply the tools, forming a robust AI ecosystem.

Why Integrate MCP Servers with OpenRouter?

Combining these technologies offers several technical advantages: one API key and one endpoint for many models, real-time access to external data through tools, and the freedom to swap models without rewriting your tool-calling code.

This integration is ideal for developers building AI systems that need real-world interactivity.

Step-by-Step Integration Process

Now, let’s get technical. Here’s how to integrate MCP Servers with OpenRouter.

Prerequisites

Ensure you have:

  1. An OpenRouter account and API key
  2. An MCP Server running locally (this guide assumes http://localhost:8000)
  3. Python 3 with the requests library installed (pip install requests)

Step 1: Define and Convert MCP Tools

OpenRouter uses OpenAI’s tool-calling format, so you must convert MCP tool definitions. Start with the MCP definition:

{
  "name": "list_files",
  "description": "Lists files in a specified directory",
  "parameters": {
    "type": "object",
    "properties": {
      "path": {"type": "string", "description": "Directory path"}
    },
    "required": ["path"]
  }
}

Convert it to OpenAI format by adding a type field and nesting the function details:

{
  "type": "function",
  "function": {
    "name": "list_files",
    "description": "Lists files in a specified directory",
    "parameters": {
      "type": "object",
      "properties": {
        "path": {"type": "string", "description": "Directory path"}
      },
      "required": ["path"]
    }
  }
}

This JSON structure is what OpenRouter expects in its API payload.
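The conversion is mechanical, so you can automate it with a small helper. This is a sketch; `mcp_to_openai` is a hypothetical name, and it assumes the tool dict follows the MCP shape shown above:

```python
def mcp_to_openai(mcp_tool: dict) -> dict:
    """Wrap an MCP tool definition in OpenAI's tool-calling format."""
    return {
        "type": "function",
        "function": {
            "name": mcp_tool["name"],
            "description": mcp_tool["description"],
            "parameters": mcp_tool["parameters"],
        },
    }
```

With a list of MCP definitions, `[mcp_to_openai(t) for t in tools]` produces the `tools` array OpenRouter expects.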

Step 2: Configure the API Request

Prepare an API request to OpenRouter. Define headers with your API key and a payload with the model, messages, and tools. Here’s a Python example:

import requests
import json

# Headers
headers = {
    "Authorization": "Bearer your_openrouter_api_key",
    "Content-Type": "application/json"
}

# Payload
payload = {
    "model": "openai/gpt-4",  # Replace with your preferred model
    "messages": [
        {"role": "user", "content": "List files in the current directory."}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "list_files",
                "description": "Lists files in a specified directory",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "path": {"type": "string", "description": "Directory path"}
                    },
                    "required": ["path"]
                }
            }
        }
    ]
}

Replace your_openrouter_api_key with your actual key.
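Rather than hardcoding the key, it's safer to read it from an environment variable. A minimal sketch, assuming the variable is named OPENROUTER_API_KEY:

```python
import os

# Read the key from the environment; the fallback placeholder is only
# so the snippet runs before you export the variable.
api_key = os.environ.get("OPENROUTER_API_KEY", "your_openrouter_api_key")

headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```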

Step 3: Send the Initial API Request

Make a POST request to OpenRouter’s endpoint:

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers=headers,
    json=payload
)

response_data = response.json()

Step 4: Process Tool Calls

Check if the response includes a tool call:

{
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_123",
            "type": "function",
            "function": {
              "name": "list_files",
              "arguments": "{\"path\": \".\"}"
            }
          }
        ]
      }
    }
  ]
}

Extract the tool call details:

message = response_data["choices"][0]["message"]
if message.get("tool_calls"):
    tool_call = message["tool_calls"][0]
    function_name = tool_call["function"]["name"]
    arguments = json.loads(tool_call["function"]["arguments"])

Step 5: Call the MCP Server

Send the tool request to your MCP Server:

mcp_response = requests.post(
    "http://localhost:8000/call",
    json={
        "name": function_name,
        "arguments": arguments
    }
)

tool_result = mcp_response.json()["result"]  # e.g., ["file1.txt", "file2.txt"]
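For local testing, it helps to sketch what the server behind http://localhost:8000/call might do. The endpoint contract here (a JSON body with `name` and `arguments`, a JSON reply with `result`) is an assumption based on the requests above, and `handle_call` is a hypothetical dispatcher:

```python
import os

def handle_call(name: str, arguments: dict) -> dict:
    """Dispatch a tool call to its implementation and wrap the result."""
    if name == "list_files":
        # Sorted for deterministic output
        return {"result": sorted(os.listdir(arguments["path"]))}
    return {"error": f"Unknown tool: {name}"}
```

An HTTP framework of your choice can expose this dispatcher at the /call route.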

Step 6: Return Tool Result to OpenRouter

Append the assistant’s tool call and the result to the message history:

messages = payload["messages"] + [
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [tool_call]
    },
    {
        "role": "tool",
        "tool_call_id": tool_call["id"],
        "content": json.dumps(tool_result)
    }
]

# Update payload
payload["messages"] = messages

# Send follow-up request
final_response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers=headers,
    json=payload
)

final_output = final_response.json()["choices"][0]["message"]["content"]
print(final_output)  # e.g., "Files: file1.txt, file2.txt"

Step 7: Handle Multiple Tool Calls

If the model requires multiple tool calls, loop through the process:

messages = payload["messages"]

while True:
    response = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers=headers,
        json={"model": "openai/gpt-4", "messages": messages, "tools": payload["tools"]}
    )
    message = response.json()["choices"][0]["message"]
    
    if not message.get("tool_calls"):
        print(message["content"])
        break
    
    # Record the assistant turn once, with all of its tool calls
    messages.append({"role": "assistant", "content": None, "tool_calls": message["tool_calls"]})
    
    for tool_call in message["tool_calls"]:
        function_name = tool_call["function"]["name"]
        arguments = json.loads(tool_call["function"]["arguments"])
        
        mcp_response = requests.post(
            "http://localhost:8000/call",
            json={"name": function_name, "arguments": arguments}
        )
        tool_result = mcp_response.json()["result"]
        
        messages.append(
            {"role": "tool", "tool_call_id": tool_call["id"], "content": json.dumps(tool_result)}
        )

This ensures all tool calls are processed.

Real-World Example: File System Interaction

Let’s apply this to a practical scenario: listing files with an MCP Server.

  1. Tool Definition: Use the list_files tool from earlier.
  2. MCP Server: Assume it’s running at http://localhost:8000.
  3. API Call: Send “List files in the current directory” to OpenRouter.
  4. Response Handling: The model calls list_files with {"path": "."}.
  5. MCP Execution: The server returns ["file1.txt", "file2.txt"].
  6. Final Output: The model responds, “Files found: file1.txt, file2.txt.”

Here’s the complete code:

import requests
import json

headers = {"Authorization": "Bearer your_openrouter_api_key", "Content-Type": "application/json"}
payload = {
    "model": "openai/gpt-4",
    "messages": [{"role": "user", "content": "List files in the current directory."}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "list_files",
            "description": "Lists files in a specified directory",
            "parameters": {
                "type": "object",
                "properties": {"path": {"type": "string", "description": "Directory path"}},
                "required": ["path"]
            }
        }
    }]
}

response = requests.post("https://openrouter.ai/api/v1/chat/completions", headers=headers, json=payload)
message = response.json()["choices"][0]["message"]

if message.get("tool_calls"):
    tool_call = message["tool_calls"][0]
    function_name = tool_call["function"]["name"]
    arguments = json.loads(tool_call["function"]["arguments"])
    
    mcp_response = requests.post("http://localhost:8000/call", json={"name": function_name, "arguments": arguments})
    tool_result = mcp_response.json()["result"]
    
    messages = payload["messages"] + [
        {"role": "assistant", "content": None, "tool_calls": [tool_call]},
        {"role": "tool", "tool_call_id": tool_call["id"], "content": json.dumps(tool_result)}
    ]
    
    final_response = requests.post("https://openrouter.ai/api/v1/chat/completions", headers=headers, json={"model": "openai/gpt-4", "messages": messages})
    print(final_response.json()["choices"][0]["message"]["content"])

Troubleshooting Common Issues

Here are solutions to frequent problems:

  1. 401 Unauthorized: verify your OpenRouter API key and the Bearer prefix in the Authorization header.
  2. The model never calls the tool: make the tool description more specific, and confirm the tools array is included in the request.
  3. JSONDecodeError on arguments: the arguments field is a JSON string, not an object; parse it with json.loads before use.
  4. Connection refused to the MCP Server: confirm the server is running and the URL and port match your configuration.

Use Apidog to debug API requests and responses efficiently.

Expanding the Integration

To add depth to your integration, consider these extensions:

Database Query Example

Define an MCP tool to query a database:

{
  "type": "function",
  "function": {
    "name": "query_db",
    "description": "Queries a database with SQL",
    "parameters": {
      "type": "object",
      "properties": {"sql": {"type": "string", "description": "SQL query"}},
      "required": ["sql"]
    }
  }
}

Send “Get all users from the database” to OpenRouter, process the query_db call, and return results like [{"id": 1, "name": "Alice"}].
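A server-side implementation of query_db might look like this sketch, using an in-memory SQLite database; the table name and columns are illustrative, matching the example result above:

```python
import sqlite3

def query_db(sql: str, connection: sqlite3.Connection) -> list:
    """Run a SQL query and return the rows as dictionaries."""
    connection.row_factory = sqlite3.Row
    rows = connection.execute(sql).fetchall()
    return [dict(row) for row in rows]

# Illustrative setup with one user
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('Alice')")
```

In production you would restrict this to read-only queries, since the model controls the SQL string.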

Error Handling

Add robust error handling:

try:
    mcp_response = requests.post("http://localhost:8000/call", json={"name": function_name, "arguments": arguments})
    mcp_response.raise_for_status()
except requests.RequestException as e:
    tool_result = f"Error: {str(e)}"

This ensures your application remains stable.
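Transient network failures can also be smoothed over with a simple retry wrapper. This is a sketch; the injectable `post` parameter is not part of requests and exists only to make the helper easy to test:

```python
import time
import requests

def post_with_retry(url, json_payload, headers=None, retries=3,
                    backoff=1.0, post=requests.post):
    """POST with exponential backoff, re-raising the last error on failure."""
    last_error = None
    for attempt in range(retries):
        try:
            response = post(url, headers=headers, json=json_payload, timeout=30)
            response.raise_for_status()
            return response
        except requests.RequestException as e:
            last_error = e
            time.sleep(backoff * (2 ** attempt))
    raise last_error
```

Wrapping both the OpenRouter and MCP Server calls this way keeps one flaky request from failing the whole exchange.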

Conclusion

Integrating MCP Servers with OpenRouter enables your AI to leverage external tools through a single API. This guide covered setup, tool conversion, API calls, and practical examples like file system interaction. With benefits like cost savings and enhanced functionality, this approach is a must-try for technical developers.

Start experimenting now: grab Apidog for free to test your APIs. Let me know how it goes!
