MCP integration
- MCP (Model Context Protocol) connects LLMs to your DipDup indexer data and functionality
- Built-in tools allow LLMs to query your indexer's state and perform operations
- Custom tools can be implemented for project-specific functionality
MCP is a fairly recent technology, so many tools might not work as expected. Consult the Known issues section for more information and open a ticket in GitHub Issues if you encounter any problems.
Introduction
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you're building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need.
With DipDup's MCP integration, you can:
- Allow LLMs to understand your indexer's configuration and data models
- Enable AI assistants to query your indexed blockchain data directly
- Implement custom tools for project-specific functionality
- Create more powerful AI-assisted development workflows
MCP Primitives
There are three types of MCP primitives (citing from the docs):
- Resources are application-controlled and allow servers to expose data and content that can be read by clients and used as context for LLM interactions.
- Tools are model-controlled and enable servers to expose executable functionality to clients, such as interacting with external systems, performing computations, and taking actions in the real world.
- Prompts are user-controlled and enable servers to define reusable prompt templates and workflows that clients can easily surface to users and LLMs.
DipDup provides an MCP server with several built-in primitives to help LLMs understand the context of your project and the current state of the indexer. You can also implement your own tools and resources specific to your project.
Running MCP server
The DipDup MCP server runs in a separate process and communicates via the SSE transport. To start it, run the following command:
dipdup mcp run
Make sure that you use the same database connection settings as for the indexer.
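If your project splits settings across several config files, pass the same set of -c options to the MCP server as you do to the indexer (the file paths below are illustrative, not part of any standard layout):

```shell
# Hypothetical layout: base config plus an environment-specific override
dipdup -c dipdup.yaml -c configs/dipdup.sqlite.yaml mcp run
```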
By default, the server listens on http://127.0.0.1:9999. You can change these settings in the mcp section of the configuration file.
mcp:
  host: 127.0.0.1
  port: 9999
Connecting clients
The DipDup MCP server can be connected to various MCP-compatible clients. Here's how to set up the most popular ones:
Cursor
To configure MCP servers, go to File -> Preferences -> Cursor Settings -> MCP.
Add the following configuration to ~/.cursor/mcp.json:
{
  "mcpServers": {
    "local": {
      "url": "http://127.0.0.1:9999/sse"
    }
  }
}
If your server is running, but Cursor doesn't connect, try "Reload Window" in the Command Palette.
VSCode (Copilot)
Insiders build
Initial support for MCP is available in the Insiders build of VSCode.
Add the following to the settings.json file:
{
  "mcp": {
    "inputs": [],
    "servers": {
      "dipdup": {
        "command": "pnpx",
        "args": ["supergateway", "--sse", "http://127.0.0.1:9999/sse"],
        "env": {}
      }
    }
  }
}
3rd party extensions
There are two extensions available in the repository that add MCP capabilities to the Copilot assistant. Neither of them worked out of the box at the time of writing this document.
- Copilot MCP (unmaintained, see #20)
- MCP-Client
Claude Desktop
Claude Desktop currently supports only stdio-based MCP servers. You can use the supercorp-ai/supergateway tool to connect the DipDup MCP server to Claude Desktop:
pnpx supergateway --sse http://127.0.0.1:9999/sse
There is also lightconetech/mcp-gateway tool available for the same purpose.
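For reference, a corresponding claude_desktop_config.json entry might look like this (a sketch: the server name dipdup is arbitrary, and pnpx must be available on your PATH):

```json
{
  "mcpServers": {
    "dipdup": {
      "command": "pnpx",
      "args": ["supergateway", "--sse", "http://127.0.0.1:9999/sse"]
    }
  }
}
```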
Implementing MCP primitives
To make your server more useful, you can implement custom MCP primitives specific to your project.
Custom Tools Example
Let's implement a tool that provides information about token holders:
from dipdup import mcp
from demo_evm_events import models
@mcp.tool('TopHolders', 'Get top token holders')
async def tool_holders() -> str:
    holders = await models.Holder.filter().order_by('-balance').limit(10).all()
    res = 'Top token holders by balance:\n\n'
    for i, holder in enumerate(holders, start=1):
        res += f'{i}. Address: {holder.address}\n   Balance: {holder.balance}\n'
    return res
This tool will:
- Query your database for the top 10 token holders
- Format the results in a readable numbered list
- Return the formatted string that can be displayed to the user by the LLM
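The formatting loop above can be sketched standalone with made-up data, no database required (the addresses and balances below are invented for illustration):

```python
# Stand-in for the ORM query result: (address, balance) pairs
holders = [
    ('0xabc0000000000000000000000000000000000001', 12500),
    ('0xabc0000000000000000000000000000000000002', 9800),
]

# Build the same numbered list the tool returns to the LLM
res = 'Top token holders by balance:\n\n'
for i, (address, balance) in enumerate(holders, start=1):
    res += f'{i}. Address: {address}\n   Balance: {balance}\n'

print(res)
```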
Using application context
You can use the application context the same way as in handlers and hooks. Use the mcp.get_ctx() function to get the context object.
from dipdup import mcp
from dipdup.context import McpContext
@mcp.tool(...)
async def tool():
    ctx: McpContext = mcp.get_ctx()
    ctx.logger.info('Hello from MCP tool!')
Calling other callbacks
You can use the dipdup.mcp functions to read other resources or call tools from your callbacks.
from dipdup import mcp
@mcp.tool(...)
async def tool():
    result = await mcp.call_tool('my_tool', {})
For low-level access, you can use the dipdup.mcp.server singleton to interact with the running server.
Interacting with running indexer
DipDup provides a management API to interact with the running indexer. For example, you can use it to add indexes at runtime. First, add the running indexer as an HTTP datasource:
datasources:
  indexer:
    kind: http
    # NOTE: Default for Compose stack
    url: http://api:46339
Then, call this datasource from your MCP tool:
from dipdup import mcp
@mcp.tool(...)
async def tool():
    ctx = mcp.get_ctx()
    datasource = ctx.get_http_datasource('indexer')
    response = await datasource.post(
        '/add_index',
        params={
            'name': 'my_index',
            'template': 'my_template',
            'values': {'param': 'value'},
            'first_level': 0,
            'last_level': 1000,
        },
    )
Running in Docker
You can find a deploy/compose.mcp.yaml Compose file that runs both the indexer and the MCP server in Docker containers. To use this manifest, run the following command:
make up COMPOSE=deploy/compose.mcp.yaml
Debugging and troubleshooting
To check if your tool is working correctly before using it in the client, you can use the MCP Inspector app:
- Run npx @modelcontextprotocol/inspector
- Open http://127.0.0.1:5173/ in your browser
- Choose the SSE transport and connect to http://127.0.0.1:9999/sse
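Before reaching for the Inspector, it may help to verify that the SSE endpoint is reachable at all, assuming the default address from the mcp config section:

```shell
# -N disables output buffering so events are printed as they arrive
curl -N http://127.0.0.1:9999/sse
```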
You can also enable full logging to get some insights into the MCP server:
logging:
  '': DEBUG
Known issues
- DipDup doesn't support custom prompts at the moment.
- Cursor fails to discover resources exposed by the DipDup server; tools work fine (tested with 0.47.8).
- VSCode Insiders doesn't support SSE-based MCP servers; gateway tools are required.
- VSCode community extensions for MCP don't work out-of-the-box.
- Claude Desktop doesn't support SSE-based MCP servers; gateway tools are required.
Further reading
- For Server Developers - Model Context Protocol
- Example Servers - Model Context Protocol
- qpd-v/mcp-guide - a tutorial server that helps users understand MCP concepts, provides interactive examples, and demonstrates best practices for building MCP integrations. Contains awesome-list of related projects.
- anjor/coinmarket-mcp-server - example MCP server for CoinMarketCap API.
- ample-education/cursor-resources - a collection of resources to maximize productivity with Cursor: prompts, rules, MCP servers etc.