Introduction
MCP (Model Context Protocol) tools connect Cursor AI to external systems. Think databases, APIs, and logging systems.
Why MCP Matters
Usually, debugging means jumping between tools:
- Check code in editor
- Look at logs in terminal
- Query database in another tool
- Check API docs in browser
- Back to editor
MCP eliminates this context switching by bringing external information into your AI assistant.
An MCP server connects Cursor's AI assistant to external systems and data sources, letting the model invoke tools that interact with APIs, databases, and more. Instead of manually explaining a project's structure, you integrate directly with your tools, giving the AI real-time, up-to-date context.
How It Works
An MCP server exposes a set of “tools” that the language model can invoke automatically based on your prompts. Cursor supports several transport methods for these servers, including stdio (for local command-line servers) and Streamable HTTP/SSE (for remote servers).
You configure an MCP server by adding a JSON configuration to a file, either globally (~/.cursor/mcp.json) or for a specific project (.cursor/mcp.json). This configuration specifies the server’s name, command/URL, and any necessary authentication like API keys.
For security, the protocol is designed to keep a “human in the loop,” meaning Cursor should provide visual indicators when a tool is invoked and may require user confirmation for sensitive operations.
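Concretely, a minimal mcp.json covering both transport styles might look like this (the server names, package name, and URL here are placeholders for illustration, not real servers):

```json
{
  "mcpServers": {
    "my-local-server": {
      "command": "npx",
      "args": ["-y", "some-mcp-server-package"],
      "env": { "API_KEY": "<your-key>" }
    },
    "my-remote-server": {
      "url": "https://example.com/mcp"
    }
  }
}
```

The top-level key is `mcpServers`; a `command` entry launches a local stdio server, while a `url` entry points at a remote Streamable HTTP/SSE server.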
Examples of Usage and Setup
Here are examples of how to set up different types of MCP servers, based on common use cases:
1. Apidog MCP Server for API Documentation
This server allows the AI assistant to fetch and interpret your API documentation in real time. This is useful for generating client-side code, validation rules, or error handling based on your API schema.
To install:
- Generate an API access token and find your Project ID in Apidog.
- Add a JSON configuration to your Cursor mcp.json file. For a Windows user, it might look like this:
```json
{
  "mcpServers": {
    "API specification": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "apidog-mcp-server@latest",
        "--project-id=<project-id>"
      ],
      "env": {
        "APIDOG_ACCESS_TOKEN": "<access-token>"
      }
    }
  }
}
```
After saving and restarting Cursor, you can use prompts like:
- “Generate TypeScript interfaces for all data models in our API documentation.”
- “Create a Python client for the authentication endpoints according to our API documentation.”
- “Add comments for each field in the Product class based on the API documentation.”
2. Stripe MCP Server for Payment Processing
The Stripe MCP server exposes tools for working with the Stripe API, such as creating customers, managing invoices, and creating payment links. This lets you build features on top of the Stripe API using natural language.
To install:
- You can often use a direct installation link provided by the server developer.
- Alternatively, you can manually add the server's configuration to your mcp.json file. For a remote server, the configuration specifies the URL:
```json
{
  "mcpServers": {
    "stripe": {
      "url": "https://mcp.stripe.com"
    }
  }
}
```
Once configured, the AI agent automatically detects the available tools, such as list_invoices or create_payment_link, and calls them when you ask a related question in the chat.
3. Figma Dev Mode MCP Server for Design-to-Code Workflows
This server connects Cursor to Figma’s Dev Mode, enabling the AI to access design information, components, and layout data. This is perfect for generating code from design files.
To install:
- In the Figma desktop app, enable “Dev Mode MCP Server” under your preferences.
- Add a new global MCP server in Cursor settings. The server runs locally inside the Figma desktop app, so the configuration points at a local URL (the exact endpoint may vary by Figma version):

```json
{
  "mcpServers": {
    "figma-dev-mode": {
      "url": "http://127.0.0.1:3845/sse"
    }
  }
}
```
With this server running, you can use prompts to generate code from a selected Figma frame, extract design context like variables and components, and ensure consistency with your codebase by using Code Connect.
Key Benefits
- Real-Time Context: The AI is always working with the most current data, whether it’s from an API, a local file, or a design document.
- Reduced Context Switching: You can keep your entire workflow within your IDE, eliminating the need to switch between different applications to find documentation or manage external services.
- Automated Workflows: The AI can perform complex, multi-step tasks by invoking a sequence of tools without explicit instructions from you at each step.
- Security: By using mcp.json files and requiring authentication, you can control which tools the AI has access to and ensure secure access to sensitive data.
Simple MCP Example
Create a database query tool. The sketch below uses the official `mcp` Python SDK (`pip install "mcp[cli]"`) to expose a single tool over the stdio transport:

```python
# tools/db_query.py
import sqlite3

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("database-query")

@mcp.tool()
def query_database(query: str) -> dict:
    """Execute a SQL query against app.db and return the rows."""
    conn = sqlite3.connect("app.db")
    cursor = conn.cursor()
    try:
        cursor.execute(query)
        results = cursor.fetchall()
        columns = [description[0] for description in cursor.description]
        return {
            "status": "success",
            "data": [dict(zip(columns, row)) for row in results],
        }
    except Exception as e:
        return {"status": "error", "message": str(e)}
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```
Register it in your .cursor/mcp.json so Cursor can launch it:

```json
{
  "mcpServers": {
    "database-query": {
      "command": "python",
      "args": ["tools/db_query.py"]
    }
  }
}
```
Now you can ask:
Use the database-query tool to show me all users created today, then check if there were authentication errors for these users
Cursor AI executes the tool and incorporates results. Pretty cool!
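If you want to sanity-check the row-to-dict conversion the tool relies on before wiring anything into Cursor, here is a standalone sketch (the table and data are made up for illustration):

```python
import sqlite3

def rows_as_dicts(conn, query):
    # Same column-pairing pattern the tool uses.
    cursor = conn.execute(query)
    columns = [d[0] for d in cursor.description]
    return [dict(zip(columns, row)) for row in cursor.fetchall()]

# Build a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ada"), (2, "linus")])
print(rows_as_dicts(conn, "SELECT * FROM users"))
# → [{'id': 1, 'name': 'ada'}, {'id': 2, 'name': 'linus'}]
```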
Did you like this post? Please let me know if you have any comments or suggestions about your experience with AI-powered development tools. I am always happy to learn from your experiences, too!