# Kwery MCP Server
30 tools covering Polymarket, Kalshi, Hyperliquid, Binance, and Chainlink. Works in any MCP-compatible AI tool.
## What is MCP?
Model Context Protocol lets AI assistants call external tools natively. kwery-mcp exposes all Kwery data endpoints as MCP tools so Claude and other agents can fetch live market data without writing code. You describe what you want in plain language; the agent calls the right tool and returns structured data directly into the conversation.
## Quick Start
Step 1 — Install:
`claude mcp add kwery npx kwery-mcp@latest`
Step 2 — Set your API key in your shell environment or pass it via --env:
`export KWERY_API_KEY=your_api_key_here`
Step 3 — Use it. Example queries to try:
- "What's the current Polymarket probability on BTC hitting 100k?"
- "Pull Kalshi prices for BTC at 1h intervals for the last 30 days"
- "Get Binance funding rates for ETH — annualize them"
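For that last prompt, the annualization the agent applies is simple scaling. A minimal sketch, assuming the usual Binance perp schedule of one funding settlement every 8 hours (3 per day); `annualize_funding` is an illustrative helper, not a kwery-mcp tool:

```python
def annualize_funding(rate_per_period: float, periods_per_day: int = 3) -> float:
    """Annualize a per-period perp funding rate by simple (non-compounded)
    scaling. Binance perps typically settle funding every 8 hours."""
    return rate_per_period * periods_per_day * 365

# 0.01% per 8h period is roughly 10.95% per year
annual = annualize_funding(0.0001)
```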
## Multi-Platform Setup
`claude mcp add kwery --env KWERY_API_KEY=your_api_key_here npx kwery-mcp@latest`
Run `claude mcp list` to confirm `kwery` appears in the server list.
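For MCP clients other than Claude Code, register the server in the client's JSON config instead. A sketch of a standard stdio server entry, mirroring the command above (the config file's location and exact schema vary by client):

```json
{
  "mcpServers": {
    "kwery": {
      "command": "npx",
      "args": ["kwery-mcp@latest"],
      "env": { "KWERY_API_KEY": "your_api_key_here" }
    }
  }
}
```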
## All 30 Tools
| Category | Tool | What it does |
|---|---|---|
| Discovery | `kwery_sources` | List all available data sources |
| | `kwery_limits` | Show plan limits, credit balance, rate limits |
| | `kwery_status` | Health check for the Kwery API |
| Markets | `kwery_markets` | Search markets across all platforms |
| | `kwery_market` | Get metadata for a single market |
| Polymarket | `polymarket_markets` | List/search Polymarket markets |
| | `polymarket_market` | Get a single Polymarket market |
| | `polymarket_candles` | OHLCV candles for a Polymarket token |
| | `polymarket_trades` | Raw trade history |
| | `polymarket_snapshots` | Order book snapshot series |
| | `polymarket_snapshot_at` | Order book at a specific point in time |
| Kalshi | `kalshi_markets` | List/search Kalshi markets |
| | `kalshi_prices` | Price history (cents, 0–100) |
| | `kalshi_orderbook` | Current order book |
| | `kalshi_snapshots` | Order book snapshot series |
| | `kalshi_snapshot_at` | Order book at a specific point in time |
| Hyperliquid | `hyperliquid_markets` | List Hyperliquid perp markets |
| | `hyperliquid_candles` | OHLCV candles |
| | `hyperliquid_trades` | Raw trade history |
| | `hyperliquid_funding` | Funding rate history |
| | `hyperliquid_oi` | Open interest history |
| | `hyperliquid_snapshots` | Order book snapshot series |
| | `hyperliquid_snapshot_at` | Order book at a specific point in time |
| Binance / Chainlink | `binance_candles` | Binance spot/futures OHLCV |
| | `chainlink_candles` | Chainlink oracle price candles |
| | `binance_ticker` | 1-second ticker |
| | `binance_flow` | Spot buy/sell flow (`buy_ratio`) |
| | `binance_funding` | Futures funding rate history |
| | `binance_oi` | Open interest history |
| | `binance_liquidations` | Liquidation history with side filter |
- Kalshi prices are in cents (0–100). Divide by 100 before comparing to Polymarket probabilities (0–1).
- Always pass `token_id` to `polymarket_candles` for backtesting; without it you get multiple rows per bar (different legs/conditions).
- The `snapshot_at` tools require either `time` (an exact UTC timestamp) or `interval` (the current bar for that interval). Omitting both returns an error.
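The cents-to-probability conversion is a straight divide-by-100; a minimal sketch (`kalshi_cents_to_prob` is an illustrative helper, not a kwery-mcp tool):

```python
def kalshi_cents_to_prob(price_cents: float) -> float:
    """Convert a Kalshi price in cents (0-100) to a probability (0-1),
    the scale Polymarket prices already use."""
    if not 0 <= price_cents <= 100:
        raise ValueError("Kalshi prices are quoted in cents, 0-100")
    return price_cents / 100.0

# A Kalshi YES at 62 cents matches a Polymarket price of 0.62
prob = kalshi_cents_to_prob(62)  # 0.62
```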
## Large Responses
Some tools (`kalshi_prices`, `polymarket_candles`) can return hundreds of thousands of rows. Claude will save the output to a file and ask you to read it in chunks. Use the `limit` and `after` pagination params to fetch smaller slices.
Example — fetch Kalshi prices in pages:
Get kalshi_prices with limit=500 and after=<cursor> for the next page
## Authentication
Set `KWERY_API_KEY` in your shell environment, or pass it via the `env` block in your MCP config.
Get a key at kwery.xyz/dashboard.
| Error | Meaning |
|---|---|
| 401 | Invalid or missing API key |
| 403 | Plan limit reached; upgrade at kwery.xyz/pricing |
| 429 | Rate limited; back off and retry |
## Pagination
All list tools support cursor-based pagination. The response includes `meta.next_cursor`. Pass it as the `after` parameter to fetch the next page. When `meta.next_cursor` is `null`, you've reached the last page.
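The loop an agent runs for you can be sketched in plain Python. Here `fetch_page` is a stand-in for whatever call backs a list tool, not part of kwery-mcp; the demo uses a fake two-page source:

```python
def fetch_all(fetch_page, limit=200):
    """Drain a cursor-paginated list tool: pass each response's
    meta.next_cursor back as `after` until it comes back None (null)."""
    results, cursor = [], None
    while True:
        page = fetch_page(limit=limit, after=cursor)
        results.extend(page["data"])
        cursor = page["meta"]["next_cursor"]
        if cursor is None:
            return results

# Demo with a fake two-page source keyed by cursor
pages = {
    None: {"data": [1, 2], "meta": {"next_cursor": "c1"}},
    "c1": {"data": [3], "meta": {"next_cursor": None}},
}
print(fetch_all(lambda limit, after: pages[after]))  # [1, 2, 3]
```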
Example prompt:
Fetch the first page of polymarket_trades with limit=200, then continue fetching pages using the next_cursor until it's null. Aggregate all results.