

The Kleros documentation site exposes an MCP (Model Context Protocol) server that lets AI coding assistants — Claude, Cursor, VS Code Copilot, and others — query these docs directly as context when you are building a Kleros integration. This means your AI assistant can answer questions like “what is the correct extraData encoding for V2?” or “show me the IArbitrableV2 interface” by reading the authoritative docs rather than relying on training data that may be outdated.

MCP Server URL

https://docs.kleros.io/mcp
The MCP server exposes the full content of the Kleros documentation in a format that MCP clients can query with natural language.
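Under the hood, MCP clients talk to an HTTP endpoint like this with JSON-RPC 2.0 messages. As a rough sketch of the wire format (not Kleros-specific — the tool name "search" and its arguments below are placeholders; a real client discovers the server's actual tool names with a "tools/list" request first):

```python
import json

def mcp_request(method: str, params: dict, req_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request body of the kind MCP clients POST over HTTP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# "search" is a placeholder tool name -- real names come from "tools/list".
body = mcp_request(
    "tools/call",
    {"name": "search", "arguments": {"query": "IArbitrableV2 interface"}},
    req_id=1,
)
```

In practice your MCP client (Claude Code, Cursor, etc.) handles this exchange for you; the sketch only illustrates what "query with natural language" means at the protocol level.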

Setup Instructions

Run this from your project directory (by default the server is added at local scope; pass --scope user to make it available in every project):
claude mcp add kleros-docs --transport http https://docs.kleros.io/mcp
Or add it manually to ~/.claude/settings.json:
{
  "mcpServers": {
    "kleros-docs": {
      "type": "http",
      "url": "https://docs.kleros.io/mcp"
    }
  }
}
Once connected, Claude Code can answer Kleros integration questions inline while you code.
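If you prefer to script the manual edit rather than paste JSON by hand, a minimal sketch (assuming the settings file layout shown above; `add_mcp_server` is a hypothetical helper, not part of any Kleros or Claude tooling):

```python
import json
import tempfile
from pathlib import Path

def add_mcp_server(settings_path: Path, name: str, url: str) -> None:
    """Merge an HTTP MCP server entry into a Claude settings file,
    keeping whatever configuration is already there."""
    settings = json.loads(settings_path.read_text()) if settings_path.exists() else {}
    settings.setdefault("mcpServers", {})[name] = {"type": "http", "url": url}
    settings_path.write_text(json.dumps(settings, indent=2))

# Demonstrated on a throwaway file; point it at ~/.claude/settings.json
# to apply the edit for real.
demo = Path(tempfile.mkdtemp()) / "settings.json"
demo.write_text(json.dumps({"theme": "dark"}))
add_mcp_server(demo, "kleros-docs", "https://docs.kleros.io/mcp")
```

The merge-then-write approach matters: overwriting the file wholesale would discard any other servers or settings already configured.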

What You Can Ask

Once connected, your AI assistant has access to the full Kleros documentation. Useful prompts when building integrations:
What is the correct extraData encoding for Kleros V2?
Show me the IArbitrableV2 interface.
How do I register a dispute template?
What are the data mapping types for dispute templates?
How do I handle ruling 0 (refuse to arbitrate)?
What are the KlerosCore contract addresses on Arbitrum One?
How do I integrate Curate V2 into my contract?
What subgraph queries can I use to fetch dispute data?
How do I set up cross-chain arbitration with Vea?

LLM-Friendly Endpoints

In addition to the MCP server, the docs site exposes two flat-file endpoints for LLM consumption:
Endpoint | Description
https://docs.kleros.io/llms.txt | Index of all pages with titles and descriptions
https://docs.kleros.io/llms-full.txt | Full text content of all pages concatenated
These are useful for one-shot context loading in systems that don’t support MCP, such as custom LLM pipelines or RAG systems.
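For example, a RAG pipeline might fetch llms.txt once and parse it into a page index. A minimal sketch, assuming the common llms.txt convention of markdown link lines ("- [Title](url): description" — the sample text below is illustrative, not the real index):

```python
import re

# Matches llms.txt index lines of the form: - [Title](url): description
LINE = re.compile(r"-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?")

def parse_llms_index(text: str) -> list[tuple[str, str, str]]:
    """Extract (title, url, description) triples from an llms.txt index."""
    pages = []
    for line in text.splitlines():
        m = LINE.match(line.strip())
        if m:
            pages.append((m["title"], m["url"], m["desc"] or ""))
    return pages

# Hypothetical sample in the llms.txt link-line format.
sample = """# Kleros Docs
- [Arbitration](https://docs.kleros.io/arbitration): How disputes are resolved
- [Integrations](https://docs.kleros.io/integrations)
"""
pages = parse_llms_index(sample)
```

From there, a pipeline would fetch each page URL it deems relevant, or fall back to llms-full.txt when it simply wants everything in one request.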

Contextual AI on Every Page

Every documentation page includes a context-aware AI button in the top-right corner. Click the Ask AI button (or press the relevant keyboard shortcut) to open a chat that has the current page as context. The page-level AI understands the specific section you are reading and can answer follow-up questions without you needing to copy-paste content.
The MCP server content is rebuilt on every documentation deploy. You always get the latest version of the docs, including the most recent contract addresses, API changes, and integration patterns.
If you find the AI giving outdated answers (e.g., wrong contract addresses or deprecated patterns like arbitrumGoerliToChiadoDevnet), the most likely cause is that it is answering from training data rather than from the MCP server. Verify MCP is connected and ask the AI to re-check using the docs.