How to build a Custom MCP Server for Cursor (Open Source)

Learn how to create and deploy custom MCP servers for Cursor IDE. From setup to advanced tool development, master AI-powered workflow automation.

Ashley Goolam

26 March 2025

Imagine giving your Cursor IDE superpowers - like automatically searching the web or analyzing your documents without leaving your editor. In this tutorial, we'll walk through creating a custom MCP (Model Context Protocol) server that adds these exact capabilities to Cursor.

💡
To streamline your API integration process, tools like Apidog can simplify testing and debugging. Download Apidog for free today to manage your API workflows efficiently and ensure seamless interaction with the APIs behind your MCP tools.

Why Build a Custom MCP Server?

MCP servers let you extend Cursor's functionality beyond its built-in features. With your own MCP server, you can:

  1. Search the web without leaving the editor
  2. Analyze your own documents with RAG
  3. Add any custom tool you can express as a Python function

Recent updates have made MCP server development easier than ever - perfect for beginners!

Step 1: Setting Up Your Development Environment

Prerequisites

Before we begin, make sure you have:

  1. Cursor IDE (latest version)
  2. Python 3.8+ installed
  3. UV package manager (we'll install this below)
  4. Basic familiarity with terminal commands
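You can verify the Python requirement from any terminal with a quick script (a trivial sketch):

```python
import sys

# The template requires Python 3.8 or newer; fail fast if the interpreter is older.
assert sys.version_info >= (3, 8), f"Python 3.8+ required, found {sys.version.split()[0]}"
print("Python version OK:", sys.version.split()[0])
```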

Getting the Starter Template

We'll use a ready-made template to get started quickly:

  1. Clone the repository and change into the project folder:

git clone https://github.com/patchy631/ai-engineering-hub.git
cd ai-engineering-hub/cursor_linkup_mcp

(The MCP server lives in the cursor_linkup_mcp subfolder; git clone cannot clone a subdirectory directly, so clone the whole repo first.)

  2. Open the folder in Cursor IDE

Step 2: Setting Up the MCP Server in Cursor

In Cursor, go to:

Settings > Cursor Settings > MCP > Add New MCP Server
(screenshot: Cursor MCP server settings)

Configure your server:

(screenshot: Add MCP Server dialog in Cursor)

If you don't have UV installed:

pip install uv

Set the command to run your server:

uv --directory /path/to/cursor_linkup_mcp run server.py

(Replace /path/to/ with the actual location where you cloned the repository)

Click "Add" to save your configuration
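If you prefer configuration files over the UI, recent Cursor builds also read MCP servers from a .cursor/mcp.json file (project-level) or ~/.cursor/mcp.json (global). The settings above correspond roughly to this entry (the path is a placeholder):

```json
{
  "mcpServers": {
    "linkup-server": {
      "command": "uv",
      "args": ["--directory", "/path/to/cursor_linkup_mcp", "run", "server.py"]
    }
  }
}
```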

(screenshot: verifying the MCP server configuration)

Step 3: Testing Your New Tools

Now that your server is set up, let's test its capabilities:

1. Web Search Tool

This allows Cursor to search the web for answers to your questions.

How to use:

  1. Open a new chat in "Agent" mode
(screenshot: Cursor in Agent mode)

  2. Ask a question that requires web lookup, like:

>> Who won the latest cricket match between India and Australia?
(screenshot: web search tool query)

  3. Cursor will use your MCP server to find and display the answer

(screenshot: web search tool result)

2. RAG (Document Analysis) Tool

This lets Cursor analyze your personal documents.

How to set up:

  1. In the cloned repository, find the data folder
(screenshot: the data folder)

  2. Add any documents you want to analyze (PDFs, Word files, etc.)

  3. In chat, ask questions about your documents:

>> Summarize the key points from my file about how DeepSeek R1 is trained.
(screenshot: RAG tool query)

View the results:

(screenshot: RAG tool result)

How It Works Under the Hood

Your MCP server acts as a bridge between Cursor and external services:

  1. When you ask a question, Cursor sends it to your MCP server
  2. The server processes the request (searching web or analyzing documents)
  3. Results are sent back to Cursor for display
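Concretely, MCP uses JSON-RPC 2.0 messages over the chosen transport (stdio here). Here is a sketch of the kind of tools/call request Cursor sends when it decides to use your web search tool (field values are illustrative):

```python
import json

# Illustrative JSON-RPC 2.0 request Cursor sends over stdio to invoke a tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "web_search",
        "arguments": {"query": "Who won the latest cricket match?"},
    },
}
print(json.dumps(request, indent=2))
```

The server replies with a matching JSON-RPC response containing the tool's result, which Cursor then renders in the chat.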

Understanding the MCP Server Code

This Python script creates a custom MCP (Model Context Protocol) server that adds two powerful AI tools to Cursor: web search and document analysis (RAG). Let's break down what each part does:

1. Importing Dependencies

import asyncio
from dotenv import load_dotenv
from linkup import LinkupClient
from rag import RAGWorkflow
from mcp.server.fastmcp import FastMCP

2. Initial Setup

load_dotenv()

mcp = FastMCP('linkup-server')
client = LinkupClient()
rag_workflow = RAGWorkflow()
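load_dotenv() reads environment variables from a .env file in the project root; for this server that typically means the Linkup API key. The exact variable name below is an assumption based on the Linkup SDK - check the repo's README:

```
LINKUP_API_KEY=your-linkup-api-key
```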

3. Web Search Tool

@mcp.tool()
def web_search(query: str) -> str:
    """Search the web for the given query."""
    search_response = client.search(
        query=query,
        depth="standard",  # "standard" or "deep"
        output_type="sourcedAnswer",  # Options: "searchResults", "sourcedAnswer", or "structured"
        structured_output_schema=None,  # Required if output_type="structured"
    )
    return search_response

What it does:

  1. Sends your query to the Linkup search API
  2. Uses "standard" depth (switch to "deep" for more thorough searches)
  3. Returns a sourced answer, i.e. an answer with supporting citations

Example usage in Cursor:

/web_search query="Who won the 2023 Cricket World Cup?"

4. Document Analysis (RAG) Tool

@mcp.tool()
async def rag(query: str) -> str:
    """Use RAG to answer queries using documents from the data directory"""
    response = await rag_workflow.query(query)
    return str(response)

What it does:

  1. Passes your question to the RAG workflow
  2. Searches the documents ingested from the data directory
  3. Returns an answer grounded in your own files

Example usage in Cursor:

/rag query="What are the key safety recommendations in this AI paper?"
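The repo's RAGWorkflow handles the real retrieval with embeddings; as a mental model only, retrieval amounts to scoring each ingested document against the query and answering from the best match. A deliberately toy keyword version (not the actual implementation):

```python
def retrieve(query: str, documents: dict) -> str:
    """Return the document whose text shares the most words with the query."""
    query_words = set(query.lower().split())

    def score(text: str) -> int:
        # Count how many query words appear in the document.
        return len(query_words & set(text.lower().split()))

    best_name = max(documents, key=lambda name: score(documents[name]))
    return documents[best_name]

docs = {
    "training.md": "DeepSeek R1 is trained with reinforcement learning",
    "safety.md": "Safety recommendations for deploying AI systems",
}
print(retrieve("how is DeepSeek R1 trained", docs))
```

A real RAG pipeline replaces the keyword overlap with embedding similarity and feeds the retrieved passages to an LLM, but the retrieve-then-answer shape is the same.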

5. Server Startup

if __name__ == "__main__":
    asyncio.run(rag_workflow.ingest_documents("data"))
    mcp.run(transport="stdio")

What happens when you run this:

  1. First loads all documents from the data folder into memory
  2. Starts the MCP server using stdio (standard input/output) communication
  3. Makes both tools available to Cursor
💡
If you'd like to add more MCP servers to Claude, Cursor, or Windsurf, check out HiMCP and discover 1682+ awesome MCP servers and clients to turbocharge your AI coding workflow with ease!
(screenshot: HiMCP.ai home page)

Key Features of The MCP Server Implementation

  1. Security: Uses .env for sensitive data
  2. Flexibility: Offers different search modes (standard/deep)
  3. Local Processing: Analyzes your private documents without sending them to the cloud
  4. Performance: Uses async operations for smooth experience

How Cursor Uses This Server

  1. You type a command in Cursor (like /web_search)
  2. Cursor sends your query to this running server
  3. The server processes it (searching web or analyzing documents)
  4. Results are returned to Cursor and displayed to you

This turns your Cursor IDE into a powerful research assistant that can both search the web and analyze your personal documents - all through simple chat commands!

Troubleshooting Tips

If something isn't working:

  1. Check that the UV command points to the correct location
  2. Make sure all dependencies are installed (run pip install -r requirements.txt)
  3. Verify your Python version is 3.8 or higher
  4. Check Cursor's error logs if the server fails to start
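If the server still fails, a small diagnostic script run from the repo root can rule out the most common problems (file names are based on this tutorial's layout):

```python
from pathlib import Path

# Check for the pieces the server depends on; run this from the cloned repo root.
# (.env is only needed if your tools require API keys, e.g. Linkup.)
checks = {name: Path(name).exists() for name in ("server.py", "data", ".env")}
for name, found in checks.items():
    print(f"{name}: {'found' if found else 'MISSING'}")
```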

Next Steps:

Now that you have a basic MCP server running, you can:

  1. Add your own tools with the @mcp.tool() decorator
  2. Experiment with "deep" search mode for more thorough web results
  3. Drop more documents into the data folder to expand the RAG tool's knowledge

Final Thoughts

Building your first MCP server might seem daunting, but as you've seen, the template makes it straightforward. In less than 30 minutes, you've added powerful new capabilities to Cursor that will save you hours of manual work.

What will you build next? Maybe a tool to query your database, file GitHub issues, or summarize meeting notes?

The possibilities are endless! Remember, every expert was once a beginner - you've just taken your first step into the world of MCP server development.

And while you're at it, don't forget to check out Apidog to supercharge your MCP and API development workflow! 🚀

