Let's Build an Open Source Deep Research Agent Using MCP Servers

Discover how to build a customizable deep research agent using open-source MCP servers for enhanced privacy and control.

Ashley Goolam

Updated on March 21, 2025

Have you ever thought about creating your very own open-source deep research agent instead of relying on proprietary options like OpenAI's Deep Research and Google's Deep Researcher? With powerful Model Context Protocol (MCP) servers like Sequential-Thinking and Exa, you can build an impressive and robust alternative to proprietary tools.

💡
Boost your API workflow while working with AI IDEs like Windsurf or Cursor with Apidog! This free, all-in-one platform lets you design, test, mock, and document APIs seamlessly—all in a single interface. Give it a try and take your development to the next level!

In this guide, we’ll walk you through setting up and running your own research agent using just two MCP servers:
✅ Sequential-Thinking – for structured reasoning and analysis
✅ Exa – for powerful AI-driven web searches

💡
Don't forget to check out HiMCP, where you can discover 1682+ awesome MCP servers and clients and turbocharge your AI coding workflow with ease!

We’ll be working with Windsurf IDE and integrating the AI model of your choice. For this tutorial, I’ll be using DeepSeek V3, but you can also opt for models like:

  • Claude 3.5 Sonnet or 3.7 Sonnet (from Anthropic)
  • GPT-4o or GPT-3.5 (from OpenAI)
  • o3-mini and others

Let’s get started! 🚀

What is an Open-Source Deep Research Agent?

So, what exactly is this "open-source deep research agent" we'll be building? At its core, it is a tool designed to automate research tasks by leveraging the Model Context Protocol (MCP). It combines AI-driven reasoning with web search capabilities, allowing you to gather, analyze, and summarize information from multiple sources efficiently.

Here’s how it works:

  • The tool connects to an MCP server, such as Sequential-Thinking or Exa, to process research queries intelligently.
  • It integrates with AI models to interpret, summarize, and generate insights from the gathered information.
  • It utilizes web search capabilities to fetch relevant data, ensuring that your research is comprehensive and up to date.
  • Since it’s built with open-source components, you have full control over your data, ensuring transparency and customization.

Who Is This Open-Source Deep Research Agent Suitable For?

This deep research agent is ideal for:
1. Researchers & Academics – Quickly gather and analyze information from various sources to support academic writing, literature reviews, or scientific exploration.
2. Journalists & Writers – Automate background research, fact-checking, and content curation for articles, reports, or investigative journalism.
3. Developers & AI Enthusiasts – Experiment with AI-powered workflows, build custom research assistants, or integrate MCP servers into their projects.
4. Analysts & Policy Makers – Extract insights from vast datasets, reports, and news sources to inform decision-making.
5. Students & Lifelong Learners – Streamline study sessions by summarizing key concepts and generating well-structured explanations.

When Is an Open-Source Deep Research Agent Most Useful?

1. Handling Large-Scale Research – When dealing with vast amounts of information across multiple sources, an AI-assisted research agent can save time and effort.
2. Automating Repetitive Research Tasks – If you frequently conduct similar searches, this tool can automate the process, reducing manual work.
3. Ensuring Unbiased & Transparent Research – Unlike closed-source research tools, an open-source solution allows you to verify how data is processed and maintain full control over your workflow.
4. Working with Custom AI Models – If you prefer using a specific LLM (like DeepSeek v3) or need domain-specific AI models, this tool lets you integrate the model of your choice.
5. Enhancing Productivity – By combining AI-driven reasoning with web search, you get well-organized insights faster than traditional research methods.

How to Set Up Your Open-Source Deep Research Agent

Prerequisites:

  1. Windsurf IDE: the latest version of Windsurf can be installed from the official website.
  2. Node.js: v20 or higher is recommended.
  3. npm: the latest version is recommended, but v7 or later should work fine. You can check both versions as shown below.
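
Both versions can be confirmed quickly from a terminal before you continue:

node --version
npm --version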

Step 1: Create a New Project Folder

Create a New Folder: Start by creating a new folder for your project, e.g., deep_researcher.
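
If you prefer working from a terminal, creating and entering the folder takes two commands (deep_researcher is just the example name used here):

mkdir deep_researcher
cd deep_researcher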

Open with Windsurf IDE: Open this folder using Windsurf IDE, which supports MCP server integration.

Step 2: Install Sequential-Thinking MCP Server

Install Sequential-Thinking MCP: Run the command below to install and configure the Sequential-Thinking MCP server. This will automatically set up the server without requiring manual configuration changes.

npx -y @smithery/cli@latest install @smithery-ai/server-sequential-thinking --client windsurf --config "{}"

Verify Configuration: Check the mcp_config.json file in Windsurf's configuration directory, .codeium (if you can't remember where it lives, try C:/Users/You/.codeium/windsurf/mcp_config.json), to ensure the Sequential-Thinking server is correctly configured. It should look something like this:

# mcp_config.json file

{
  "mcpServers": {
    "server-sequential-thinking": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "@smithery/cli@latest",
        "run",
        "@smithery-ai/server-sequential-thinking",
        "--config",
        "\"{}\""
      ]
    }
  }
}

If the file is empty, visit the server's GitHub repository for the up-to-date configuration, or simply copy and paste the one above.
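
If you'd rather inspect the file from a terminal, the commands below print it. The Windows path matches the default location mentioned above; the macOS/Linux path is the usual ~/.codeium equivalent, which is an assumption you may need to adjust for your setup:

# Windows
type %USERPROFILE%\.codeium\windsurf\mcp_config.json

# macOS / Linux
cat ~/.codeium/windsurf/mcp_config.json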

Test the Server: Test the Sequential-Thinking MCP server by running sample commands, such as:

# Sample Input
>> Use sequential thinking to help me develop a simple Flappy Bird Python game.

Step 3: Set Up Exa Web Search MCP Server

Create an Exa Account and Get API Key:

Visit the Exa official website, create an account, and obtain a free API key from the "API Keys" section of your profile.


Clone Exa MCP Server Repository: Clone the Exa MCP server repository from GitHub:

git clone https://github.com/exa-labs/exa-mcp-server.git
cd exa-mcp-server

Install Dependencies and Build Project:

To install all dependencies using npm, run the command:

npm install

Build the project:

npm run build

Create a Global Link:

Run the following command to make the Exa MCP server executable from anywhere:

npm link
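
As a quick sanity check, you can confirm the link exists and start the built server once by hand. A stdio-based MCP server simply launches and waits for a client to connect, so starting without errors is enough; stop it with Ctrl+C:

# list globally linked packages and look for exa-mcp-server
npm ls -g --depth=0

# start the built server directly (it will sit and wait for stdio input)
node build/index.js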

Configure Exa MCP Server in Windsurf:

Update the mcp_config.json file with the latest configuration from Exa's GitHub repository. Replace the placeholder with your actual API key, and point the path in "args" at wherever you cloned exa-mcp-server. It should look something like this:

# mcp_config.json file

{
  "mcpServers": {
    "server-sequential-thinking": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "@smithery/cli@latest",
        "run",
        "@smithery-ai/server-sequential-thinking",
        "--config",
        "\"{}\""
      ]
    },
    "exa": {
      "command": "npx",
      "args": ["C:/Research_agent/exa-mcp-server/build/index.js"],
      "env": {
        "EXA_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Test the Exa MCP Server: verify that the Exa server is working by running sample prompts, such as:

# Sample Input
>> Find blog posts about AGI.

Using Your Open-Source Deep Research Agent

Now that both MCP servers have been set up and configured correctly, you can use them to create a deep research tool. Here’s a sample prompt to test your setup:

# Sample Input
>> When using sequential-thinking, use as many steps as possible. Each step of sequential-thinking must use branches, isRevision, and needsMoreThoughts, and each branch must have at least 3 steps. Before each step of sequential-thinking, use Exa to search for 3 related web pages and then think about the content of the web pages. The final reply should be long enough and well-structured. Question: What is metaphysics?

This prompt will generate a structured response with search links used by the model to answer your question.

Web-searched links:

[Screenshot: the links used during the web search]

Sample output:

[Screenshot: the final output, presented in structured order]

Features and Benefits

Flexibility and Control:

By using open-source MCP servers, you maintain full control over your research process and data privacy.

Customization:

You can choose from various AI models like DeepSeek V3, Claude 3.5 Sonnet, or GPT-4o, allowing you to tailor your research tool to specific needs.

Cost Efficiency:

Running your own MCP servers can be more cost-effective than relying on proprietary services, especially for frequent or large-scale research tasks.


What to Do When Your Open-Source Deep Research Agent Doesn't Work

When setting up and using your open-source deep research agent with MCP servers, you might encounter some issues. Here are some common problems and their solutions:

MCP Server Not Configured Correctly:

If your MCP server is not working as expected, check the configuration files (e.g., mcp_config.json) for errors. Ensure that API keys are correctly set, and that the server is properly linked.
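
One frequent culprit is a JSON syntax error in the config file. A quick check is to have Node parse it (adjust the path to your own user directory); if the JSON is malformed, the command throws an error instead of printing the confirmation:

node -e "JSON.parse(require('fs').readFileSync('C:/Users/You/.codeium/windsurf/mcp_config.json','utf8')); console.log('mcp_config.json is valid JSON')"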

API Key Issues:

If you encounter errors related to API keys, verify that they are correctly entered in your configuration files. Also, check if your API keys have expired or if you have exceeded usage limits.
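
To test the key itself outside of Windsurf, you can call Exa's search endpoint directly with curl. This sketch assumes the https://api.exa.ai/search endpoint and the x-api-key header that Exa documents at the time of writing (double-check their API docs if it fails for another reason); a 401 or 403 response points at the key, while a normal JSON result means the key is fine:

curl -s -X POST https://api.exa.ai/search -H "Content-Type: application/json" -H "x-api-key: YOUR_API_KEY" -d "{\"query\": \"test\"}"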

Model Not Responding:

If your AI model is not responding, ensure that it is properly installed and configured. Check for any updates to the model or its dependencies.

Web Search Results Not Found:

If web search results are not being returned, check your internet connection and ensure that the search API (e.g., Exa) is functioning correctly.
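
A simple reachability check helps rule out a local network issue; if curl prints any HTTP status line at all, the connection to Exa's API host is working and the problem lies elsewhere:

curl -I https://api.exa.ai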

Conclusion

Building an open-source deep research agent using MCP servers like Sequential-Thinking and Exa offers a powerful alternative to proprietary tools. By integrating these servers with Windsurf IDE and your preferred AI model, you can create a flexible and cost-effective research tool that maintains your data privacy and control.

