Data API
The Data API lets external tools create and manage data sources in Querri programmatically. Use it to push data from Zapier, sync records from internal systems, or build custom integrations that keep Querri up to date automatically.
Overview
The Data API provides full CRUD operations on data sources:
| Operation | Method | Endpoint | Scope |
|---|---|---|---|
| List sources | GET | /data/sources | data:read |
| Get schema | GET | /data/sources/{id} | data:read |
| Read data | GET | /data/sources/{id}/data | data:read |
| Query data | POST | /data/query | data:read |
| Create source | POST | /data/sources | data:write |
| Append rows | POST | /data/sources/{id}/rows | data:write |
| Replace data | PUT | /data/sources/{id}/data | data:write |
| Delete source | DELETE | /data/sources/{id} | data:write |
Authentication
All requests require a qk_ API key with the appropriate scope. See Authentication for details.
```sh
curl -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id" \
  -H "Content-Type: application/json" \
  https://app.querri.com/api/v1/data/sources
```

Choosing the right scope
| Use case | Scopes needed |
|---|---|
| Read-only integration (BI tool, dashboard) | data:read |
| Write-only ingestion (Zapier, webhook) | data:write |
| Full integration (read + write) | data:read, data:write |
Keys with only data:read cannot create, append, replace, or delete sources. Keys with only data:write cannot read data or list sources.
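As a sanity check before wiring up an integration, the scope rules above can be expressed as a small lookup. This is a sketch for your own client code, mirroring the operations table; the operation labels are hypothetical names, not API endpoints:

```python
# Illustrative helper (not part of the Querri API): maps each Data API
# operation from the table above to the scope it requires.
REQUIRED_SCOPE = {
    "list_sources": "data:read",
    "get_schema": "data:read",
    "read_data": "data:read",
    "query_data": "data:read",
    "create_source": "data:write",
    "append_rows": "data:write",
    "replace_data": "data:write",
    "delete_source": "data:write",
}

def key_can(key_scopes, operation):
    """Return True if a key with the given scopes may perform the operation."""
    return REQUIRED_SCOPE[operation] in key_scopes

# A read-only key cannot append; a full integration key can do both.
assert not key_can({"data:read"}, "append_rows")
assert key_can({"data:read", "data:write"}, "append_rows")
```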
Creating a source
Create a new data source by providing a name and an array of row objects:
```sh
curl -X POST https://app.querri.com/api/v1/data/sources \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Zapier Leads",
    "rows": [
      {"name": "Alice", "email": "alice@example.com", "score": 85},
      {"name": "Bob", "email": "bob@example.com", "score": 92}
    ]
  }'
```

Response (201 Created):
```json
{
  "id": "src_a1b2c3d4",
  "name": "Zapier Leads",
  "columns": ["name", "email", "score"],
  "row_count": 2,
  "updated_at": "2026-03-08T15:30:00.000000"
}
```

Column types are automatically inferred from the data — strings, numbers, dates, and booleans are all detected.
Reading data
List sources
```sh
curl https://app.querri.com/api/v1/data/sources \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id"
```

Get schema
```sh
curl https://app.querri.com/api/v1/data/sources/{source_id} \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id"
```

Returns column names, types, and row count.
Read paginated data
```sh
curl "https://app.querri.com/api/v1/data/sources/{source_id}/data?page=1&page_size=100" \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id"
```

Response:
```json
{
  "data": [
    {"name": "Alice", "email": "alice@example.com", "score": 85},
    {"name": "Bob", "email": "bob@example.com", "score": 92}
  ],
  "total_rows": 2,
  "page": 1,
  "page_size": 100
}
```

Maximum page_size is 10,000.
SQL query
```sh
curl -X POST https://app.querri.com/api/v1/data/query \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id" \
  -H "Content-Type: application/json" \
  -d '{
    "sql": "SELECT name, score FROM data WHERE score > 80",
    "source_id": "src_a1b2c3d4",
    "page": 1,
    "page_size": 100
  }'
```

The query runs in DuckDB with row-level security (RLS) applied automatically. Only SELECT queries are allowed.
Appending rows
Add rows to an existing source. Columns are matched by name — new columns are added, missing columns are filled with null values.
```sh
curl -X POST https://app.querri.com/api/v1/data/sources/{source_id}/rows \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id" \
  -H "Content-Type: application/json" \
  -d '{
    "rows": [
      {"name": "Charlie", "email": "charlie@example.com", "score": 78},
      {"name": "Diana", "email": "diana@example.com", "score": 95}
    ]
  }'
```

Response:
```json
{
  "id": "src_a1b2c3d4",
  "name": "Zapier Leads",
  "columns": ["name", "email", "score"],
  "row_count": 4,
  "updated_at": "2026-03-08T16:00:00.000000"
}
```

This is ideal for Zapier triggers or webhook-based ingestion where new records arrive over time.
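The column-matching behavior described above — match by name, add new columns, null-fill missing ones — can be sketched locally. This is an illustration of the documented semantics, not the server's actual implementation:

```python
def preview_append(existing_columns, new_rows):
    """Preview the merged column set and null-filled rows after an append.

    Columns are matched by name: columns new to the source are added,
    and columns absent from a given row are filled with None (null).
    """
    columns = list(existing_columns)
    for row in new_rows:
        for col in row:
            if col not in columns:
                columns.append(col)  # new column added to the source
    # Every stored row carries every column; missing values become null
    return columns, [{col: row.get(col) for col in columns} for row in new_rows]

cols, rows = preview_append(
    ["name", "email", "score"],
    [{"name": "Frank", "source_campaign": "webinar"}],
)
# "source_campaign" is appended as a new column, while Frank's row gets
# None for "email" and "score".
```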
Replacing data
Replace all data in a source. The existing data is removed and the new rows take its place.
```sh
curl -X PUT https://app.querri.com/api/v1/data/sources/{source_id}/data \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id" \
  -H "Content-Type: application/json" \
  -d '{
    "rows": [
      {"name": "Eve", "email": "eve@example.com", "score": 100}
    ]
  }'
```

Use this when your integration produces a complete dataset on each sync (e.g., a nightly export from another system).
Deleting a source
Remove a source and all its data permanently.
```sh
curl -X DELETE https://app.querri.com/api/v1/data/sources/{source_id} \
  -H "Authorization: Bearer qk_your_key_here" \
  -H "X-Tenant-ID: your_org_id"
```

Response:
```json
{
  "id": "src_a1b2c3d4",
  "deleted": true
}
```

This deletes the source metadata, QDF metadata, and the underlying data file.
Limits
| Limit | Value |
|---|---|
| Rows per request | 100,000 |
| Payload size | 50 MB |
| Page size (reads) | 10,000 |
| SQL query length | 10,000 characters |
| Rate limit | Per-key setting (default 60/min) |
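To stay under the per-request row limit, large uploads can be split client-side into multiple append calls. A minimal sketch (your own helper, not an API feature), assuming each chunk also stays under the 50 MB payload cap:

```python
MAX_ROWS_PER_REQUEST = 100_000  # documented per-request row limit

def chunk_rows(rows, size=MAX_ROWS_PER_REQUEST):
    """Yield successive slices of rows, each small enough for one append call."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

# 250,000 rows would be sent as three append requests: 100k, 100k, 50k.
batches = list(chunk_rows([{"n": i} for i in range(250_000)]))
```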
Source scope
If your API key uses explicit source scope (only specific sources are accessible), new sources created via the API are not automatically added to the key’s scope. Use an API key with "mode": "all" source scope for integrations that create sources, or update the key’s scope after creation.
Row-level security
Read operations (data:read) enforce the API key’s row-level security (RLS) automatically. If the key has a bound_user_id, RLS is evaluated as that user. If the key has access_policy_ids, those policies are applied.
Write operations (data:write) do not evaluate RLS — the key either has write access or it doesn’t.
Setting up API keys for integrations
Before building an integration, create the right API key in Querri. Go to Settings → API Keys (/settings/api) and click Create Key.
Recommended key configurations
| Integration type | Key name | Scopes | Source scope | Notes |
|---|---|---|---|---|
| Zapier (create + append) | Zapier - CRM Sync | data:read, data:write | All sources | Needs both scopes to create sources and verify data |
| Zapier (append only) | Zapier - Lead Capture | data:write | All sources | Write-only if you never need to read back |
| Nightly batch sync | Nightly Sync Bot | data:read, data:write | All sources | Needs read to verify, write to replace |
| BI tool / dashboard | Tableau Read-Only | data:read | All sources (or explicit) | Read-only, optionally locked to specific sources |
| Restricted integration | Partner API | data:read | Explicit: [src_abc, src_def] | Can only see listed sources |
Key properties
- Bound user: If set, read operations evaluate row-level security (RLS) as that user. Leave unset for full data access.
- Source scope: "All sources" lets the key access any source. "Explicit" restricts it to a specific list of source IDs.
- Expiration: Keys expire after the set period (default 90 days, max 1 year). Rotate keys before they expire.
- Rate limit: Default 60 requests/minute per key. Increase for high-volume integrations.
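With the default 60 requests/minute limit, bursty integrations should back off when throttled. A minimal exponential-backoff sketch — a client-side convention, not an API feature; the 429 status checked here is an assumption based on the common rate-limit response:

```python
import time

def backoff_delays(max_retries=5, base=1.0, cap=60.0):
    """Exponential backoff schedule: 1s, 2s, 4s, ... capped at 60s."""
    return [min(cap, base * (2 ** attempt)) for attempt in range(max_retries)]

def call_with_backoff(do_request, max_retries=5):
    """Retry a request while it is rate-limited, sleeping between attempts."""
    for delay in backoff_delays(max_retries):
        resp = do_request()
        if resp.status_code != 429:  # assumed rate-limit status code
            return resp
        time.sleep(delay)
    return do_request()  # final attempt; let the caller handle failure
```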
Key security best practices
- Use the minimum scopes needed. A Zapier webhook that only appends rows needs data:write — not data:read.
- Create separate keys per integration. Don’t share a key between Zapier and your BI tool.
- Set IP allowlists for server-to-server integrations where the source IPs are known.
- Store the secret securely. The qk_ secret is shown once at creation — save it in your integration’s secrets manager (Zapier’s vault, AWS Secrets Manager, etc.).
Integration patterns
Section titled “Integration patterns”Zapier: Append rows on new records
This is the most common pattern — a Zap fires whenever a new record appears in another app and appends it to a Querri source.
Prerequisites:
- A qk_ key with data:write scope (add data:read if you want the Zap to verify data)
- An existing source in Querri (create one manually or via the API first)
Step-by-step Zapier setup:
1. Create the source first (one-time setup via curl or the API):

   ```sh
   curl -X POST https://app.querri.com/api/v1/data/sources \
     -H "Authorization: Bearer qk_your_key_here" \
     -H "X-Tenant-ID: your_org_id" \
     -H "Content-Type: application/json" \
     -d '{
       "name": "HubSpot Contacts",
       "rows": [
         {"name": "Seed Record", "email": "seed@example.com", "company": "Example Inc"}
       ]
     }'
   ```

   Save the returned id — you’ll use it in the Zap.

2. Create a new Zap in Zapier:
   - Trigger: Choose your source app (HubSpot, Salesforce, Google Sheets, etc.) → “New Contact” / “New Row” / etc.
   - Action: Choose “Webhooks by Zapier” → “Custom Request”

3. Configure the webhook action:
   - Method: POST
   - URL: https://app.querri.com/api/v1/data/sources/{source_id}/rows
   - Headers:
     - Authorization: Bearer qk_your_key_here
     - X-Tenant-ID: your_org_id
     - Content-Type: application/json
   - Body:

     ```json
     {
       "rows": [
         {
           "name": "{{name}}",
           "email": "{{email}}",
           "company": "{{company}}",
           "created_at": "{{created_date}}"
         }
       ]
     }
     ```

     Replace {{name}}, {{email}}, etc. with Zapier’s dynamic field mapping from your trigger.

4. Test the Zap — send a test record and verify it appears in Querri.
Zapier: Create a new source per report
For integrations that produce entire datasets (e.g., a weekly report), create a new source each time:
- Method: POST
- URL: https://app.querri.com/api/v1/data/sources
- Headers:
  - Authorization: Bearer qk_your_key_here
  - X-Tenant-ID: your_org_id
  - Content-Type: application/json
- Body:

  ```json
  {
    "name": "Weekly Report - {{zap_meta_human_now}}",
    "rows": {{report_data_as_json_array}}
  }
  ```

Scheduled full sync (Python)
Replace all data on a schedule (e.g., nightly cron job or Zapier Schedule trigger):
```python
import requests

API_KEY = "qk_your_key_here"
ORG_ID = "your_org_id"
SOURCE_ID = "src_a1b2c3d4"
BASE_URL = "https://app.querri.com/api/v1"

HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "X-Tenant-ID": ORG_ID,
    "Content-Type": "application/json",
}

# Step 1: Fetch fresh data from your system
rows = fetch_all_records()  # Returns list of dicts

# Step 2: Replace all data in Querri
resp = requests.put(
    f"{BASE_URL}/data/sources/{SOURCE_ID}/data",
    headers=HEADERS,
    json={"rows": rows},
)
resp.raise_for_status()
result = resp.json()
print(f"Synced {result['row_count']} rows, columns: {result['columns']}")

# Step 3 (optional): Verify by reading back
verify = requests.get(
    f"{BASE_URL}/data/sources/{SOURCE_ID}",
    headers=HEADERS,
)
verify.raise_for_status()
schema = verify.json()
assert schema["row_count"] == len(rows), "Row count mismatch!"
```

Read-only BI integration (Python)
Create a key with only data:read scope for external BI tools:
```python
import requests

API_KEY = "qk_readonly_key"
ORG_ID = "your_org_id"
BASE_URL = "https://app.querri.com/api/v1"

HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "X-Tenant-ID": ORG_ID,
}

# List all accessible sources
sources = requests.get(f"{BASE_URL}/data/sources", headers=HEADERS).json()

for source in sources["data"]:
    print(f"\n{source['name']} ({source['id']}): {source['row_count']} rows")
    print(f"  Columns: {source['columns']}")

    # Read first page of data
    data = requests.get(
        f"{BASE_URL}/data/sources/{source['id']}/data?page=1&page_size=1000",
        headers=HEADERS,
    ).json()

    # Paginate through all data if needed
    all_rows = data["data"]
    total = data["total_rows"]
    page = 2
    while len(all_rows) < total:
        next_page = requests.get(
            f"{BASE_URL}/data/sources/{source['id']}/data?page={page}&page_size=1000",
            headers=HEADERS,
        ).json()
        all_rows.extend(next_page["data"])
        page += 1

    print(f"  Loaded {len(all_rows)} / {total} rows")
```

Complete integration lifecycle (Node.js)
A full create → append → read → replace → delete flow:
```js
const BASE_URL = "https://app.querri.com/api/v1";
const API_KEY = "qk_your_key_here";
const ORG_ID = "your_org_id";

const headers = {
  Authorization: `Bearer ${API_KEY}`,
  "X-Tenant-ID": ORG_ID,
  "Content-Type": "application/json",
};

// 1. Create a source
const createResp = await fetch(`${BASE_URL}/data/sources`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    name: "CRM Contacts",
    rows: [
      { name: "Alice", email: "alice@corp.com", deal_stage: "qualified" },
    ],
  }),
});
const source = await createResp.json();
const sourceId = source.id;
console.log(`Created source ${sourceId}`);

// 2. Append rows (as new records arrive)
await fetch(`${BASE_URL}/data/sources/${sourceId}/rows`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    rows: [
      { name: "Bob", email: "bob@corp.com", deal_stage: "proposal" },
      { name: "Carol", email: "carol@corp.com", deal_stage: "closed_won" },
    ],
  }),
});

// 3. Read data back
const readResp = await fetch(
  `${BASE_URL}/data/sources/${sourceId}/data?page=1&page_size=100`,
  { headers },
);
const data = await readResp.json();
console.log(`${data.total_rows} contacts loaded`);

// 4. Replace with fresh export (nightly sync)
await fetch(`${BASE_URL}/data/sources/${sourceId}/data`, {
  method: "PUT",
  headers,
  body: JSON.stringify({
    rows: freshCRMExport, // Your complete dataset
  }),
});

// 5. Delete when integration is decommissioned
await fetch(`${BASE_URL}/data/sources/${sourceId}`, {
  method: "DELETE",
  headers,
});
```

Error codes
| Code | HTTP | Description |
|---|---|---|
| source_not_found | 404 | Source ID does not exist |
| no_data | 400/404 | Source has no data (append requires existing data) |
| source_not_in_scope | 403 | API key’s source scope does not include this source |
| insufficient_scope | 403 | API key does not have the required scope |
| too_many_rows | 400 | Request exceeds the 100,000 row limit |
| empty_data | 400 | Rows array contains no columns |
| payload_too_large | 413 | Request body exceeds 50 MB |
| invalid_sql | 400 | SQL query contains disallowed statements |
| query_failed | 400 | SQL query execution failed |
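The codes above split into problems you fix by changing the request and problems you fix by changing the key. A sketch of client-side triage — the function below is your own helper, and it assumes your error-handling layer has already extracted the error code string from the response:

```python
# Hypothetical triage for the documented Data API error codes.
SCOPE_ERRORS = {"source_not_in_scope", "insufficient_scope"}
REQUEST_ERRORS = {"too_many_rows", "empty_data", "payload_too_large",
                  "invalid_sql", "query_failed", "no_data"}

def classify_error(code):
    """Map a Data API error code to a coarse remediation category."""
    if code == "source_not_found":
        return "check_source_id"        # source was deleted or never existed
    if code in SCOPE_ERRORS:
        return "fix_key_configuration"  # key needs a scope or source-scope change
    if code in REQUEST_ERRORS:
        return "fix_request"            # shrink, split, or correct the payload/SQL
    return "unknown"

assert classify_error("too_many_rows") == "fix_request"
assert classify_error("insufficient_scope") == "fix_key_configuration"
```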
Next steps
Section titled “Next steps”- Authentication — API key setup and scopes
- API Reference — Complete endpoint listing
- API Keys — Creating and managing
qk_keys