AI has become essential for streamlining workflows and gaining deeper insights. Model Context Protocol (MCP) servers are at the forefront of this shift, acting as bridges that allow AI-powered tools to interact directly with your crucial data sources.
Today, we’ll delve into how to set up the Google Search Console MCP Server—a popular choice for analytics and SEO data—and then introduce the Apidog MCP Server, a powerful, all-in-one solution designed to elevate your API development workflow.
What Is Google Search Console MCP Server?
The Google Search Console MCP Server acts as a bridge between Google Search Console and AI-powered IDEs. Exposing your site’s search analytics data to AI enables smarter, data-driven coding and reporting.
Key Features
- Search analytics data retrieval with support for custom dimensions
- Rich data analysis with flexible reporting periods
- Integration with Claude Desktop and other AI clients
How to Set Up Google Search Console MCP Server
Setting up the Google Search Console MCP Server involves several steps. Here’s a step-by-step guide:
Prerequisites
Before you begin, ensure you have:
- Node.js 18 or later
- A Google Cloud Project with the Search Console API enabled
- Service Account credentials with Search Console access
1. Install the MCP Server
You can install the server automatically via Smithery or manually with npm.
Via Smithery:

```bash
npx -y @smithery/cli install mcp-server-gsc --client claude
```

Manual Installation:

```bash
npm install mcp-server-gsc
```
2. Set Up Google Cloud Credentials
Go to the Google Cloud Console.
Create a new project or select an existing one.
Enable the Search Console API:
- Navigate to “APIs & Services” > “Library”
- Search for and enable “Search Console API”
Create credentials:
- Go to “APIs & Services” > “Credentials”
- Click “Create Credentials” > “Service Account”
- Fill in the details and create a new key in JSON format
- Download the credentials file
Grant access:
- Open Google Search Console
- Add the service account email as a property administrator
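The service-account email you need to add is stored in the `client_email` field of the JSON key file you just downloaded. A minimal sketch of pulling it out, assuming Google's standard service-account key layout (the helper name and path are placeholders, not part of any tool):

```python
import json

def service_account_email(key_path: str) -> str:
    """Return the service-account email from a downloaded JSON key file."""
    with open(key_path) as f:
        key = json.load(f)
    # Google service-account keys store the account's email under "client_email"
    return key["client_email"]
```

Paste the returned address into Search Console's user management as a property administrator.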
3. Configure the MCP Server in Your AI Client
For Claude Desktop or similar tools, add the following configuration:
```json
{
  "mcpServers": {
    "gsc": {
      "command": "npx",
      "args": ["-y", "mcp-server-gsc"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/credentials.json"
      }
    }
  }
}
```
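If your client config already lists other MCP servers, the `gsc` entry must be merged into the existing `mcpServers` object rather than replacing it. A hypothetical helper sketching that merge (the function name is mine, not part of any tool):

```python
def add_gsc_server(config: dict, credentials_path: str) -> dict:
    """Merge a 'gsc' server entry into an MCP client config dict."""
    # setdefault preserves any servers that are already configured
    servers = config.setdefault("mcpServers", {})
    servers["gsc"] = {
        "command": "npx",
        "args": ["-y", "mcp-server-gsc"],
        "env": {"GOOGLE_APPLICATION_CREDENTIALS": credentials_path},
    }
    return config
```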
4. Querying Search Analytics Data
You can now use the `search_analytics` tool to retrieve data. Example parameters:
```json
{
  "siteUrl": "https://example.com",
  "startDate": "2024-01-01",
  "endDate": "2024-01-31",
  "dimensions": "query,country",
  "type": "web",
  "rowLimit": 500
}
```
Required and Optional Parameters
| Parameter | Required | Description |
|---|---|---|
| siteUrl | Yes | Site URL (e.g., https://example.com) |
| startDate | Yes | Start date (YYYY-MM-DD) |
| endDate | Yes | End date (YYYY-MM-DD) |
| dimensions | No | Comma-separated (query, page, country, etc.) |
| type | No | Search type (web, image, video, news) |
| rowLimit | No | Max rows to return (default: 1000) |
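The table can double as a quick pre-flight check before a query is sent. A sketch, assuming the values shown above (the 25,000 row cap reflects Google's documented API maximum; the function itself is illustrative, not part of the server):

```python
from datetime import date

# Allowed search types, mirroring the table above
SEARCH_TYPES = {"web", "image", "video", "news"}

def validate_query(params: dict) -> list[str]:
    """Return a list of problems found in a search_analytics parameter dict."""
    errors = []
    for field in ("siteUrl", "startDate", "endDate"):
        if field not in params:
            errors.append(f"missing required field: {field}")
    for field in ("startDate", "endDate"):
        try:
            # fromisoformat enforces the YYYY-MM-DD shape
            date.fromisoformat(params.get(field, ""))
        except ValueError:
            errors.append(f"{field} must be YYYY-MM-DD")
    if params.get("type", "web") not in SEARCH_TYPES:
        errors.append("type must be one of web, image, video, news")
    if not 1 <= params.get("rowLimit", 1000) <= 25000:
        errors.append("rowLimit out of range")
    return errors
```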
Example AI Prompt:
@gsc use the search_analytics tool for siteUrl 'https://example.com', startDate '2024-04-01', endDate '2024-04-30', with dimensions 'query,page' and a rowLimit of 10. Show me the top queries and pages.
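Behind a prompt like this, an MCP client sends a JSON-RPC 2.0 `tools/call` request, per the MCP specification. A sketch of the message shape, with illustrative values:

```python
import json

def build_tool_call(request_id: int, arguments: dict) -> str:
    """Serialize an MCP tools/call request for the search_analytics tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        # "name" selects the tool; "arguments" carries its parameters
        "params": {"name": "search_analytics", "arguments": arguments},
    })
```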
This setup empowers your AI assistant to become a powerful SEO analyst, providing data-driven insights for better development.
Streamlining API Development: The Apidog MCP Server
While the Google Search Console MCP Server focuses on web analytics, the Apidog MCP Server is specifically engineered to enhance AI-assisted API development. It allows your AI coding assistant to directly understand and interact with your API specifications, dramatically speeding up tasks like code generation, documentation, and testing.
What Makes Apidog MCP Server Unique?
- Connect any API specification to AI: Not just analytics—connect your REST, OpenAPI, or Apidog project specs directly to AI.
- Boost productivity: Let AI generate, update, and document code based on real API specs.
- Enhance code quality: AI suggestions are grounded in your actual API, reducing errors and improving maintainability.
- Works with multiple IDEs: Integrate with Cursor, VS Code (with Cline), and more.
- Free: No costs, no vendor lock-in.
Key Features
- Local caching: API specs are cached locally for speed and privacy.
- Multiple data sources: Connect to Apidog projects, public API docs, or Swagger/OpenAPI files.
- Flexible configuration: Supports on-premise deployments and custom environments.
How to Set Up Apidog MCP Server: Step-by-Step Guide
Setting up the Apidog MCP Server involves a few straightforward steps.
Prerequisites
1. Node.js: Version 18 or later (latest LTS recommended).
2. MCP-Compatible IDE:
- Cursor
- VS Code with the Cline plugin
Configuration Based on Your Data Source
Apidog MCP Server offers flexibility by supporting various API specification sources:
1. Using an Apidog Project as the Data Source
This is ideal for teams managing their APIs within Apidog.
Obtain API Access Token & Project ID:
API Access Token: In Apidog, go to Account Settings (via your profile picture) > API Access Token. Create a new token and copy it.

Project ID: Open your target project in Apidog. Go to Project Settings (left sidebar) > Basic Settings and copy the Project ID.

Configure in Cursor (Example):
In Cursor, open MCP settings (Settings icon > MCP > "+ Add new global MCP server").

Paste the configuration into mcp.json, replacing the placeholders:
For macOS/Linux (the "MyApidogAPI" key is just a label; name it descriptively):

```json
{
  "mcpServers": {
    "MyApidogAPI": {
      "command": "npx",
      "args": [
        "-y",
        "apidog-mcp-server@latest",
        "--project=<your-project-id>"
      ],
      "env": {
        "APIDOG_ACCESS_TOKEN": "<your-access-token>"
      }
    }
  }
}
```
For Windows:

```json
{
  "mcpServers": {
    "MyApidogAPI": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "apidog-mcp-server@latest",
        "--project=<your-project-id>"
      ],
      "env": {
        "APIDOG_ACCESS_TOKEN": "<your-access-token>"
      }
    }
  }
}
```
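The only real difference between the two platforms is that Windows wraps `npx` in `cmd /c`. A hypothetical helper that emits either variant (the function name is mine; placeholders mirror the snippets above):

```python
def apidog_server_entry(project_id: str, token: str, windows: bool = False) -> dict:
    """Build the mcpServers entry for an Apidog project data source."""
    args = ["-y", "apidog-mcp-server@latest", f"--project={project_id}"]
    if windows:
        # Windows launches through cmd so that npx resolves correctly
        return {"command": "cmd", "args": ["/c", "npx"] + args,
                "env": {"APIDOG_ACCESS_TOKEN": token}}
    return {"command": "npx", "args": args,
            "env": {"APIDOG_ACCESS_TOKEN": token}}
```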
2. Using Online API Documentation Published by Apidog
Useful for public APIs or sharing specs with external developers via AI.
Obtain Documentation URL: Get the URL of the publicly shared Apidog documentation.
Configure in Cursor (Example):
For macOS/Linux:

```json
{
  "mcpServers": {
    "apidog-site-123456": {
      "command": "npx",
      "args": [
        "-y",
        "apidog-mcp-server@latest",
        "--site-id=123456"
      ]
    }
  }
}
```
For Windows:

```json
{
  "mcpServers": {
    "apidog-site-123456": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "apidog-mcp-server@latest",
        "--site-id=123456"
      ]
    }
  }
}
```
3. Using Swagger/OpenAPI Files as the Data Source
Perfect for working with local OpenAPI/Swagger files or those hosted online.
File Path/URL: Identify the local path or the direct URL of your swagger.json, openapi.json, or openapi.yaml file.
Configure in Cursor (Example):
For macOS/Linux:

```json
{
  "mcpServers": {
    "API specification": {
      "command": "npx",
      "args": [
        "-y",
        "apidog-mcp-server@latest",
        "--oas=https://petstore.swagger.io/v2/swagger.json"
      ]
    }
  }
}
```
For Windows:

```json
{
  "mcpServers": {
    "API specification": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "apidog-mcp-server@latest",
        "--oas=https://petstore.swagger.io/v2/swagger.json"
      ]
    }
  }
}
```
Verify the Configuration
After setup, test the connection by prompting your AI assistant in Agent mode. For example:
@MyApidogAPI please fetch the API specification and tell me how many endpoints exist in the project.
If the AI responds with information from your API specification, the setup is successful. Remember, API data is cached locally. If you update your specifications in Apidog, instruct the AI to refresh its context to fetch the latest changes.
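"How many endpoints exist" ultimately means counting path-and-method pairs in the specification. A minimal sketch of that count over an OpenAPI-style `paths` object (the mini-spec used to exercise it is invented for illustration):

```python
# HTTP methods that denote operation objects in an OpenAPI path item
HTTP_METHODS = {"get", "put", "post", "delete", "options", "head", "patch", "trace"}

def count_endpoints(spec: dict) -> int:
    """Count operation objects (path + HTTP method pairs) in an OpenAPI spec."""
    return sum(
        1
        for operations in spec.get("paths", {}).values()
        for method in operations
        if method.lower() in HTTP_METHODS  # skip keys like "parameters"
    )
```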
Conclusion
Integrating AI with your development workflow is no longer a luxury—it’s a game changer. By setting up MCP servers like Google Search Console and Apidog MCP, you enable your AI assistants to interact directly with critical datasets, unlocking advanced use cases across SEO analysis and API development.