Model Context Protocol (MCP) Overview & Setup
The Model Context Protocol (MCP) allows you to securely connect AI assistants (like Claude Desktop) to your Revelator data. By configuring these servers, your Large Language Models (LLMs) can execute specific tools to fetch, analyze, and manage your catalog, analytics, and royalty data in real-time using natural language.
Beta Feature
Our MCP Servers are currently in BETA. They are under active development and are subject to breaking changes; some tools may return a 403 Forbidden response or other errors.
Available Servers
We have logically grouped our tools into five distinct MCP servers based on business domains. Choose a server below to view its connection address and supported tools:
- Catalog MCP Server: Query and manage metadata for releases, tracks, and artists.
- Supply Chain MCP Server: Manage distribution settings, pricing, and territories.
- Analytics MCP Server: Pull performance data, Top Movers, and revenue metrics.
- Royalties MCP Server: Manage accounting, royalty runs, and payee statements.
- Blockchain MCP Server: Interact with Web3 features, royalty tokens, and smart wallets.
Remote Setup Instructions
To connect your AI agent to Revelator’s data, you must add the appropriate MCP Server to your client’s configuration file using a Streamable HTTP transport.
Step 1: Add the Server Configuration
Point your AI agent to the specific MCP Server endpoint for the domain you want to access (Catalog, Analytics, etc.).
For example, to enable the Catalog tools, add the following to your client’s JSON configuration file:
```json
{
  "mcpServers": {
    "revelator-catalog": {
      "type": "http",
      "url": "https://mcp.revelator.com/catalog"
    }
  }
}
```
Note: You can configure multiple Revelator servers simultaneously by adding additional entries to the `mcpServers` object, each with its respective URL.
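For example, a combined configuration with both the Catalog and Analytics servers might look like the following. Note that only the Catalog URL is documented in this guide; the `/analytics` path shown here is an assumption based on the same pattern, so verify the actual URL on the Analytics MCP Server page.

```json
{
  "mcpServers": {
    "revelator-catalog": {
      "type": "http",
      "url": "https://mcp.revelator.com/catalog"
    },
    "revelator-analytics": {
      "type": "http",
      "url": "https://mcp.revelator.com/analytics"
    }
  }
}
```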
Step 2: Global Setup & Authentication
Because your AI assistant executes actions on your behalf, it must be properly authenticated. Use your existing Revelator Partner API credentials (`partnerUserId` and `partnerApiKey`) when your LLM prompts you for them. This is usually required only once, at the start of a new chat or session.
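Under the hood, a Streamable HTTP client speaks JSON-RPC 2.0 to the server URL, beginning with an `initialize` handshake before any tools can be called. The sketch below builds that first request body as defined by the MCP specification; the client name, version, and protocol revision shown are illustrative placeholders, not Revelator-specific values.

```python
import json

# JSON-RPC 2.0 "initialize" request: the first message an MCP client
# POSTs to a Streamable HTTP server before listing or calling tools.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # an MCP spec revision date
        "capabilities": {},               # client-side capabilities, none here
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialize to the JSON body that would be sent over HTTP.
body = json.dumps(initialize_request)
print(body)
```

Your MCP-capable client performs this handshake for you; it is shown here only to clarify what "Streamable HTTP transport" means in the configuration above.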
Supported Clients
You can connect Revelator MCP Servers to any AI assistant, IDE, or agent framework that supports the Model Context Protocol. Below is a list of popular supported clients:
| Client | Developer | Notes |
|---|---|---|
| Claude Desktop | Anthropic | Full native MCP support. Configure via claude_desktop_config.json. |
| Claude Cloud (Web) | Anthropic | Requires an Enterprise/Team plan for MCP integrations. |
| ChatGPT Desktop | OpenAI | Supports MCP tools via local app configuration. |
| ChatGPT Cloud (Web) | OpenAI | Supported via custom GPT actions or enterprise integrations. |
| Cloudflare AI Playground | Cloudflare | Web-based environment perfect for testing MCP connections. |
| Visual Studio Code | Microsoft | Supported via extensions (e.g., Cline, Roo Code). Configure via cline_mcp_settings.json. |
| Cursor | Cursor | Add MCP servers directly in the IDE settings to code alongside your catalog data. |
| Claude Code | Anthropic | CLI-based agent for terminal usage. |
| Gemini CLI | Google | CLI-based agent. Configure via ~/.gemini/settings.json. |
| Kiro CLI | Kiro | CLI-based agent. Configure via ~/.kiro/settings/mcp.json. |