
MCP Server

The Querri MCP server lets AI assistants like Claude, ChatGPT, Cursor, Windsurf, and VS Code Copilot talk to your Querri data on your behalf. Once connected, you can ask questions about your data, run multi-step analyses, and reference dashboards — all from inside the AI tool you already use.

MCP (Model Context Protocol) is an open standard that lets AI assistants connect to external systems — databases, SaaS tools, internal APIs — through a single, consistent interface. You install or enable an MCP “server” inside your AI tool, log in once, and the assistant gains a set of tools it can call when you ask questions.

For Querri, the MCP server gives your AI assistant the same powers you have: it can list your data sources, run analyses, summarize results, and link to dashboards. You stay in your favorite AI tool; it pulls data from Querri as it needs to.

After connecting, you can prompt your AI assistant naturally. It picks the right tool automatically. Examples:

  • “What sources are available in Querri?” — lists your data sources
  • “Look at our Q3 sales data and tell me which regions are growing fastest.” — kicks off a Querri analysis and returns the answer
  • “Continue the analysis we started last time and break it out by product line.” — picks up an existing project conversation
  • “Show me the dashboard structure for our revenue dashboard.” — pulls dashboard widgets and links
  • “Give me the raw step output from that join we ran earlier.” — fetches the rows from a specific analysis step

The assistant decides when to call Querri. You don’t need to remember tool names — just ask.

The Querri MCP server is hosted at:

https://app.querri.com/mcp

It uses OAuth authentication: when you connect, you’ll be redirected to Querri’s normal SSO login. Once approved, your assistant has access scoped to whatever you have access to in Querri.

Claude

  1. Open Settings → Connectors
  2. Click Add custom connector
  3. Enter the MCP server URL: https://app.querri.com/mcp
  4. Complete the SSO login when prompted
  5. Start a new conversation — Querri tools appear automatically when relevant

ChatGPT

  1. Open Settings → Connectors → Advanced
  2. Add a new MCP server with URL https://app.querri.com/mcp
  3. Complete the OAuth flow
  4. Mention Querri in your prompt to invoke the tools

Cursor, Windsurf, VS Code Copilot, and other MCP clients

Most MCP-compatible IDEs and assistants accept a remote MCP server URL in their settings. Add https://app.querri.com/mcp as a remote MCP server. The exact menu varies — look for Settings → MCP, Extensions → MCP servers, or similar.

For tools that read a configuration file, point the entry at the URL above and choose OAuth as the auth method.
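As a rough illustration, many MCP clients accept a JSON entry shaped something like the following. The key names (`mcpServers`, the `querri` label) vary by client and are assumptions here, not something Querri specifies — check your tool's MCP documentation for the exact schema:

```json
{
  "mcpServers": {
    "querri": {
      "url": "https://app.querri.com/mcp"
    }
  }
}
```

Clients that support OAuth for remote servers will typically detect it from the server and open the login flow automatically.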

Authentication uses Querri’s standard SSO. The first time you connect, your assistant opens a browser window where you log in normally. The assistant stores a refresh token so you don’t have to log in again on subsequent sessions.

Your AI assistant inherits your role and permissions:

  • Members can browse and analyze data they have access to.
  • Admins can additionally see organization-wide projects and dashboards.

Row-level security and access policies are enforced exactly as they are in the web app — your assistant cannot see anything you can’t.

If you belong to multiple organizations, the assistant uses the org tied to your current login. To switch, you’ll need to log out of Querri and log back in to the target org, then reconnect the assistant.

The Querri MCP server exposes these tools to your AI assistant:

| Tool | What it does |
| --- | --- |
| ask_project | Kick off a multi-step analysis on your data. The default for any analytical question — handles joins, filtering, grouping, figures. Querri’s planner discovers the right sources automatically. Returns a chat ID; the assistant then polls get_chat for the answer. |
| ask_source | Shortcut for questions clearly answerable from a single, named data source with no joins needed. Returns the answer directly. |
| get_chat | Reads back the latest answer in a project chat. Used by ask_project to wait for results, and also for catching up on a project’s prior conversation. |
| Tool | What it does |
| --- | --- |
| list_sources | Lists data sources with column counts and row counts. |
| get_source | Returns a source’s column schema (names and types). |
| list_projects | Lists analysis projects you have access to. |
| get_project | Returns a project as a Mermaid diagram of its analysis steps, plus a step-ID table. |
| list_project_steps | Lists the steps in a project. |
| list_views | Lists SQL views defined in your organization. |
| get_view | Returns a view’s SQL definition and metadata. |
| list_dashboards | Lists dashboards. |
| get_dashboard | Returns dashboard metadata, widgets, and links. |
| list_files | Lists uploaded files. |
| Tool | What it does |
| --- | --- |
| query_source | Runs a SQL query against a single source. The source is exposed as a view named `data`. |
| get_step_data | Returns the rows produced by a specific step inside a project. Useful when you want to inspect intermediate output. |
| Tool | What it does |
| --- | --- |
| create_project | Creates an empty project. Most of the time ask_project is what you want — it creates the project and runs the analysis in one call. |
| list_organizations | Lists organizations you belong to. |
| switch_organization | Returns instructions for switching to a different org (requires re-authentication). |
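The ask_project / get_chat handoff described above amounts to a simple poll loop: start the analysis, then read the chat until an answer appears. A minimal Python sketch of that pattern, where `call_tool` is a hypothetical stand-in that simulates tool responses (real calls go through your MCP client, and the argument names here are illustrative):

```python
import time

def call_tool(name, arguments, _state={"polls": 0}):
    """Hypothetical stand-in for an MCP tool call, with canned responses."""
    if name == "ask_project":
        return {"chat_id": "chat-123"}        # ask_project returns a chat ID
    if name == "get_chat":
        _state["polls"] += 1
        if _state["polls"] < 3:               # first polls: still running
            return {"status": "running"}
        return {"status": "done", "answer": "The West region grew fastest."}
    raise ValueError(f"unknown tool: {name}")

# Start the analysis; ask_project returns immediately with a chat ID.
start = call_tool("ask_project", {"question": "Which regions are growing fastest?"})
chat_id = start["chat_id"]

# Poll get_chat until the analysis finishes.
while True:
    result = call_tool("get_chat", {"chat_id": chat_id})
    if result["status"] == "done":
        break
    time.sleep(0.1)  # real clients long-poll for up to ~25 s per call

print(result["answer"])
```

Your assistant runs this loop for you; the sketch is only to show why a multi-step analysis produces a pause followed by a single final answer.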
  • Be specific about the question, not the steps. Say “compare year-over-year revenue by region” rather than “join sales and regions, then group by region.” Querri’s planner picks the steps.
  • Reference projects to continue analysis. If your assistant ran an analysis earlier, mentioning it (or letting it pass project_id) keeps the same project’s context.
  • Long-running analyses are normal. Multi-step analyses take 30 seconds to a few minutes. The assistant will wait and report when done.
  • Ask for figures explicitly. “Plot revenue over time as a line chart” — Querri returns the figure inline.
  • Use SQL for known-shape questions. When you already know which source and just need to slice it a particular way, query_source is faster than ask_project.
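To make the query_source contract concrete (the selected source is exposed as a view named `data`), here is a local sqlite3 sketch that mimics that shape. The table contents are made-up sample rows; only the SQL itself resembles what you would pass to query_source:

```python
import sqlite3

# Mimic query_source's contract: the chosen source appears as `data`.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (region TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO data VALUES (?, ?)",
    [("East", 120.0), ("West", 250.0), ("East", 80.0)],
)

# The kind of SQL you would hand to query_source, querying the `data` view.
sql = "SELECT region, SUM(revenue) AS total FROM data GROUP BY region ORDER BY total DESC"
rows = conn.execute(sql).fetchall()
print(rows)  # [('West', 250.0), ('East', 200.0)]
```

Because the view name is always `data`, queries stay portable across sources with compatible schemas.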

The assistant says it can’t connect to Querri. Re-run the OAuth flow from your assistant’s connector settings. Tokens expire and need to be refreshed via login.

Tools don’t appear in conversation. Some assistants only surface MCP tools when the prompt clearly relates to the connector. Try mentioning Querri or your data explicitly: “Use Querri to look at…”

An analysis seems stuck. Long analyses long-poll get_chat for up to ~25 seconds at a time. The assistant should be calling it repeatedly. If a single analysis takes more than ~5 minutes, ask for a simpler version of the question.

Wrong organization showing. Log out of Querri, log back in to the right org, then reconnect the MCP server.

If you’re building a Python application that talks to Querri directly — without going through an AI assistant — use the Python SDK instead. The SDK gives you typed sync and async clients for the same data.

To use Querri from a terminal or shell script, see the CLI.