The Kleros documentation site exposes an MCP (Model Context Protocol) server that lets AI coding assistants (Claude, Cursor, VS Code Copilot, and others) query these docs directly as context when you are building a Kleros integration. This means your AI assistant can answer questions like “what is the correct extraData encoding for V2?” or “show me the IArbitrableV2 interface” by reading the authoritative docs rather than relying on training data that may be outdated.

Documentation Index
Fetch the complete documentation index at: https://kleros-mintlify-changelog-2026-05-12-1778458371.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
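By convention, an llms.txt index lists one page per line as a markdown link, optionally followed by a colon-separated description. As an illustrative sketch of how an agent could consume the index (the sample entries below are made up for demonstration, not taken from the live file):

```python
import re

# An llms.txt index conventionally lists one page per line as a markdown
# link, optionally followed by ": description".
LINK_RE = re.compile(
    r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$"
)

def parse_llms_index(text: str) -> list[dict]:
    """Extract (title, url, description) entries from an llms.txt index."""
    pages = []
    for line in text.splitlines():
        m = LINK_RE.match(line.strip())
        if m:
            pages.append({
                "title": m.group("title"),
                "url": m.group("url"),
                "description": m.group("desc") or "",
            })
    return pages

# Illustrative sample in the conventional llms.txt shape (not the real file).
sample = """\
# Kleros Docs

- [Arbitration Overview](https://docs.kleros.io/arbitration): How disputes are resolved
- [IArbitrableV2](https://docs.kleros.io/contracts/iarbitrablev2): Interface reference
"""

print([p["title"] for p in parse_llms_index(sample)])
```

An agent can then fetch individual page URLs from the parsed index instead of pulling the whole corpus at once.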
MCP Server URL
Setup Instructions
- Claude Code
- Claude Desktop
- Cursor
- VS Code
- Windsurf
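The per-client tabs carry the exact setup snippets. As a rough illustration only: registering a docs MCP server in Claude Code typically amounts to pointing `~/.claude/settings.json` (or the `claude mcp add` command) at the server URL. Every key name and the URL in the fragment below are assumptions for the sketch, since this page does not reproduce the real snippet:

```json
{
  "mcpServers": {
    "kleros-docs": {
      "type": "http",
      "url": "https://docs.kleros.io/mcp"
    }
  }
}
```

Check the client-specific instructions above for the authoritative configuration shape.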
Run this from any directory (installs globally):

Or add it manually to ~/.claude/settings.json:

Once connected, Claude Code can answer Kleros integration questions inline while you code.

What You Can Ask
Once connected, your AI assistant has access to the full Kleros documentation. Useful prompts when building integrations:

LLM-Friendly Endpoints
In addition to the MCP server, the docs site exposes two flat-file endpoints for LLM consumption:

| Endpoint | Description |
|---|---|
| https://docs.kleros.io/llms.txt | Index of all pages with titles and descriptions |
| https://docs.kleros.io/llms-full.txt | Full text content of all pages concatenated |
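Because llms-full.txt concatenates every page, it can exceed a model's context window. A minimal sketch of budget-aware chunking, assuming pages in the dump begin with top-level `# ` headings (an assumption about the file's layout, not something this page specifies):

```python
def chunk_docs(full_text: str, max_chars: int = 8000) -> list[str]:
    """Split a concatenated docs dump into chunks, breaking at top-level
    markdown headings and whenever the character budget is exceeded."""
    chunks: list[str] = []
    current: list[str] = []
    size = 0
    for line in full_text.splitlines(keepends=True):
        # Flush the current chunk at a page heading or when over budget.
        if current and (line.startswith("# ") or size + len(line) > max_chars):
            chunks.append("".join(current))
            current, size = [], 0
        current.append(line)
        size += len(line)
    if current:
        chunks.append("".join(current))
    return chunks

# Demo on a tiny made-up dump: prints the heading of each chunk.
sample = "# Page One\nShort body.\n# Page Two\nAnother body.\n"
for chunk in chunk_docs(sample, max_chars=4000):
    print(chunk.splitlines()[0])
```

For small models, feed chunks one at a time; for large-context models, the whole file may fit directly.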
Contextual AI on Every Page
Every documentation page includes a context-aware AI button in the top-right corner. Click the Ask AI button (or press the relevant keyboard shortcut) to open a chat that has the current page as context. The page-level AI understands the specific section you are reading and can answer follow-up questions without you needing to copy-paste content.

The MCP server content is rebuilt on every documentation deploy. You always get the latest version of the docs, including the most recent contract addresses, API changes, and integration patterns.