
Model Context Protocol (MCP) Overview & Setup

The Model Context Protocol (MCP) allows you to securely connect AI assistants (like Claude Desktop) to your Revelator data. By configuring these servers, your Large Language Models (LLMs) can execute specific tools to fetch, analyze, and manage your catalog, analytics, and royalty data in real-time using natural language.

Beta Feature
Our MCP Servers are currently in BETA. They are under active development and subject to breaking changes; some tools may return 403 Forbidden or other errors.

Available Servers

We have logically grouped our tools into five distinct MCP servers based on business domains. Choose a server below to view its connection address and supported tools:

  1. Catalog MCP Server: Query and manage metadata for releases, tracks, and artists.
  2. Supply Chain MCP Server: Manage distribution settings, pricing, and territories.
  3. Analytics MCP Server: Pull performance data, Top Movers, and revenue metrics.
  4. Royalties MCP Server: Manage accounting, royalty runs, and payee statements.
  5. Blockchain MCP Server: Interact with Web3 features, royalty tokens, and smart wallets.
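Each server above exposes its own endpoint. As a rough sketch, assuming every server follows the same URL scheme as the documented Catalog endpoint, a helper can map a domain to its URL; note that only the `/catalog` path is documented here, so the other paths are assumptions and may differ:

```python
# Hypothetical mapping from Revelator MCP server domain to endpoint URL.
# Only the /catalog path is documented; the other paths assume the same
# naming scheme and may not match the actual endpoints.
MCP_BASE = "https://mcp.revelator.com"

SERVER_PATHS = {
    "catalog": "catalog",            # documented
    "supply-chain": "supply-chain",  # assumed
    "analytics": "analytics",        # assumed
    "royalties": "royalties",        # assumed
    "blockchain": "blockchain",      # assumed
}

def endpoint_for(domain: str) -> str:
    """Return the endpoint URL for a known Revelator MCP server domain."""
    if domain not in SERVER_PATHS:
        raise ValueError(f"Unknown MCP server domain: {domain!r}")
    return f"{MCP_BASE}/{SERVER_PATHS[domain]}"
```

For example, `endpoint_for("catalog")` yields the Catalog server URL used in the configuration example below.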

Remote Setup Instructions

To connect your AI agent to Revelator’s data, you must add the appropriate MCP Server to your client’s configuration file using a Streamable HTTP transport.

Step 1: Add the Server Configuration

Point your AI agent to the specific MCP Server endpoint for the domain you want to access (Catalog, Analytics, etc.).

For example, to enable the Catalog tools, add the following to your client’s JSON configuration file:

{
  "mcpServers": {
    "revelator-catalog": {
      "type": "http",
      "url": "https://mcp.revelator.com/catalog"
    }
  }
}

Note: You can configure multiple Revelator servers simultaneously by adding additional entries to the mcpServers object using their respective URLs.
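Extending the Step 1 example, a configuration with two Revelator servers might look like the following. Only the /catalog URL is documented above; the /analytics URL here is an assumed placeholder following the same pattern:

```json
{
  "mcpServers": {
    "revelator-catalog": {
      "type": "http",
      "url": "https://mcp.revelator.com/catalog"
    },
    "revelator-analytics": {
      "type": "http",
      "url": "https://mcp.revelator.com/analytics"
    }
  }
}
```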

Step 2: Global Setup & Authentication

Because your AI assistant is executing actions on your behalf, it must be properly authenticated.

Use your existing Revelator Partner API credentials (partnerUserId and partnerApiKey) when your LLM prompts you for them. This is typically required only once, at the start of a new chat or session.
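One way to keep these credentials at hand (and out of source files) is to store them in environment variables and verify they are set before starting a session. The credential names come from the Partner API above; the environment-variable names and the helper itself are illustrative, not an official mechanism:

```python
import os

def load_partner_credentials() -> dict:
    """Read Revelator Partner API credentials from the environment.

    The environment-variable names below are illustrative conventions,
    not official Revelator names.
    """
    user_id = os.environ.get("REVELATOR_PARTNER_USER_ID")
    api_key = os.environ.get("REVELATOR_PARTNER_API_KEY")
    if not user_id or not api_key:
        raise RuntimeError(
            "Set REVELATOR_PARTNER_USER_ID and REVELATOR_PARTNER_API_KEY "
            "before starting an MCP session."
        )
    return {"partnerUserId": user_id, "partnerApiKey": api_key}
```

When the assistant asks for credentials, you can paste the values this helper returns rather than hunting for them mid-session.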


Supported Clients

You can connect Revelator MCP Servers to any AI assistant, IDE, or agent framework that supports the Model Context Protocol. Below is a list of popular supported clients:

| Client | Developer | Notes |
| --- | --- | --- |
| Claude Desktop | Anthropic | Full native MCP support. Configure via claude_desktop_config.json. |
| Claude Cloud (Web) | Anthropic | Requires an Enterprise/Team plan for MCP integrations. |
| ChatGPT Desktop | OpenAI | Supports MCP tools via local app configuration. |
| ChatGPT Cloud (Web) | OpenAI | Supported via custom GPT actions or enterprise integrations. |
| Cloudflare AI Playground | Cloudflare | Web-based environment well suited to testing MCP connections. |
| Visual Studio Code | Microsoft | Supported via extensions (e.g., Cline, Roo Code). Configure via cline_mcp_settings.json. |
| Cursor | Cursor | Add MCP servers directly in the IDE settings to code alongside your catalog data. |
| Claude Code | Anthropic | CLI-based agent for terminal usage. |
| Gemini CLI | Google | CLI-based agent. Configure via ~/.gemini/settings.json. |
| Kiro CLI | Kiro | CLI-based agent. Configure via ~/.kiro/settings/mcp.json. |