How to Use MCP Servers in LM Studio

Rebecca Kovács

26 June 2025

The world of local Large Language Models (LLMs) represents a frontier of privacy, control, and customization. For years, developers and enthusiasts have run powerful models on their own hardware, free from the constraints and costs of cloud-based services. However, this freedom often came with a significant limitation: isolation. Local models could reason, but they could not act. With the release of version 0.3.17, LM Studio shatters this barrier by introducing support for the Model Context Protocol (MCP), a transformative feature that allows your local LLMs to connect with external tools and resources.

This guide provides a comprehensive deep dive into configuring and using this powerful feature. We will move from foundational concepts to advanced, practical examples, giving you a complete picture of how to turn your local LLM into an interactive and effective agent.

💡
Want a great API Testing tool that generates beautiful API Documentation?

Want an integrated, All-in-One platform for your Developer Team to work together with maximum productivity?

Apidog delivers all your demands, and replaces Postman at a much more affordable price!

What is an MCP Server?

Before you can configure a server, it is crucial to understand the architecture you are working with. The Model Context Protocol (MCP) is an open-source specification, originally introduced by Anthropic, designed to create a universal language between LLMs and external tools. Think of it as a standardized "API for APIs," specifically for AI consumption.

The MCP system involves two main components:

  1. MCP Servers: programs that expose tools and resources, such as file access, web search, or database queries, through the standardized protocol.
  2. MCP Hosts: applications, like LM Studio, that connect to those servers, present the available tools to an LLM, and relay the model's tool calls back to the server.

The beauty of this protocol is its simplicity and standardization. Any developer can build a server that exposes tools, and any application that acts as a host can connect to it, creating a vendor-agnostic ecosystem.
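
To make this concrete, here is a rough sketch of the kind of exchange that happens under the hood: the host asks a server which tools it offers, and the server answers with JSON Schema descriptions the model can reason about. MCP uses JSON-RPC 2.0 framing; the search_models tool shown here is purely hypothetical.

The host's request:

{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

The server's response:

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "search_models",
        "description": "Search a model hub for models matching a query",
        "inputSchema": {
          "type": "object",
          "properties": { "query": { "type": "string" } },
          "required": ["query"]
        }
      }
    ]
  }
}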

Step-by-Step Guide: Adding a Remote MCP Server

The primary way to add and manage MCP servers in LM Studio is to edit a central configuration file named mcp.json.

Finding and Editing mcp.json

You can access this file directly from the LM Studio interface, which is the recommended approach.

  1. Launch LM Studio and look at the right-hand sidebar.
  2. Click on the Program tab, which is represented by a terminal prompt icon (>_).
  3. Under the "Install" section, click the Edit mcp.json button.

This action opens the configuration file directly within LM Studio's in-app text editor. The application automatically watches this file for changes, so any server you add or modify will be reloaded the moment you save.
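
If you have never added a server before, the file contains little more than an empty scaffold. Every server you define lives under a single top-level mcpServers key (LM Studio follows the same mcp.json notation popularized by Cursor):

{
  "mcpServers": {}
}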

Example Configuration: The Hugging Face Server

To illustrate the process, we will connect to the official Hugging Face MCP server. This tool provides your LLM with the ability to search the Hugging Face Hub for models and datasets—a perfect first step into tool use.

First, you need an access token from Hugging Face.

  1. Navigate to your Hugging Face account's Access Tokens settings.
  2. Create a new token. Give it a descriptive name (e.g., lm-studio-mcp) and assign it the read role, which is sufficient for searching.
  3. Copy the generated token (hf_...).
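
If you want to confirm the token works before wiring it into LM Studio, a quick sanity check is possible with the huggingface_hub Python library. This is an optional step, not part of the LM Studio setup itself:

# pip install huggingface_hub
from huggingface_hub import whoami

# Prints your account details if the token is valid; raises an error otherwise.
# The read role is sufficient for this call.
info = whoami(token="hf_...")  # paste your real token here
print(info["name"])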

Next, add the following structure to your mcp.json file.

{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "<https://huggingface.co/mcp>",
      "headers": {
        "Authorization": "Bearer <YOUR_HF_TOKEN>"
      }
    }
  }
}

Now, replace the placeholder <YOUR_HF_TOKEN> with the actual token you copied from Hugging Face. Save the file (Ctrl+S or Cmd+S).

That's it. Your LM Studio instance is now connected.

Verification and Testing

To confirm the connection is active, you must use a model that is proficient at function calling. Many modern models, such as variants of Llama 3, Mixtral, and Qwen, have this capability. Load a suitable model and start a new chat.

Issue a prompt that would necessitate the tool, for example:

"Can you find some popular LLM models on Hugging Face that are under 7 billion parameters?"

If everything is configured correctly, the model will recognize the need for a tool. Instead of answering directly, it will trigger a tool call, which LM Studio will intercept and present to you for confirmation.
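
Behind the scenes, LM Studio translates the model's decision into a JSON-RPC tools/call request naming the tool and its arguments. The exact tool name and parameters depend on what the server advertises; the ones below are illustrative only:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "model_search",
    "arguments": { "query": "LLM", "limit": 10 }
  }
}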

Tool Call Confirmations in LM Studio

The power to connect your LLM to external tools comes with significant responsibility. An MCP server can, by design, access your local files, make network requests, and execute code. LM Studio mitigates this risk with a critical safety feature: tool call confirmations.

When a model wants to use a tool, a dialog box appears in the chat interface. This box gives you a complete, human-readable overview of the pending action: which server is being called, the name of the tool, and the exact arguments it will receive.

You have full control. You can inspect the arguments for anything suspicious and then choose to Allow the call once, Deny it, or, for tools you trust implicitly, Always allow that specific tool.

Warning: Never install or grant permissions to an MCP server from a source you do not fully trust. Always scrutinize the first tool call from any new server. You can manage and revoke your "Always allow" permissions at any time in App Settings > Tools & Integrations.

Connect LM Studio to a Local MCP Server

While connecting to remote servers is useful, the true power of MCP for many users is the ability to run servers on their local machine. This grants an LLM access to local files, scripts, and programs, all while keeping data entirely offline.

LM Studio supports both local and remote MCP servers. To configure a local server, you add an entry to your mcp.json file that points to a local URL.

For example, if you were running a server on your machine on port 8000, your configuration might look like this:

{
  "mcpServers": {
    "my-local-server": {
      "url": "http://localhost:8000"
    }
  }
}
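
If you want to try this end to end, the official Python MCP SDK includes a FastMCP helper that makes writing such a server a few lines of code. The sketch below is illustrative: the read_notes tool is hypothetical, and the available transport options can differ between SDK versions, so treat it as a starting point rather than a drop-in implementation.

# pip install "mcp"
from mcp.server.fastmcp import FastMCP

# Name the server and bind it to the port referenced in mcp.json above.
mcp = FastMCP("my-local-server", port=8000)

@mcp.tool()
def read_notes(filename: str) -> str:
    """Return the contents of a text file from a local notes folder."""
    with open(f"./notes/{filename}", "r", encoding="utf-8") as f:
        return f.read()

if __name__ == "__main__":
    # Serve over HTTP so LM Studio can reach the server at localhost:8000.
    # Older SDK versions may use transport="sse" instead.
    mcp.run(transport="streamable-http")

One caveat: depending on the transport, the URL in mcp.json may need a path suffix (for example, http://localhost:8000/mcp for streamable HTTP); check your server's documentation for the exact endpoint.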

The Future is Local and Connected

The integration of MCP into LM Studio is more than an incremental update; it's a foundational shift. It bridges the gap between the raw intelligence of LLMs and the practical utility of software applications. This creates a future where your AI is not just a conversationalist but a personalized assistant that operates securely on your own hardware.

Imagine a writer with a local MCP server that provides custom tools for summarization, fact-checking against a private library of documents, and style analysis—all without sending a single word to the cloud. Or a developer whose LLM can interact with a local server to run tests, read compiler output, and search internal codebases.

To facilitate this vision, the LM Studio team has also made it easy for developers to share their creations. The "Add to LM Studio" button, which uses a custom lmstudio:// deeplink, allows for one-click installation of new MCP servers. This lowers the barrier to entry and paves the way for a vibrant, community-driven ecosystem of tools.

By embracing an open standard and prioritizing user control, LM Studio has provided a powerful framework for the next generation of personal AI.
