Connectors

Querri connects to a wide variety of data sources through its connector system. This reference provides a complete list of available connectors, their capabilities, and authentication requirements.

Connectors allow Querri to:

  • Import data from external systems
  • Sync data automatically
  • Query live data sources
  • Export results to external platforms

Each connector has specific authentication requirements and capabilities.

Connector categories:

  • Databases: relational and data warehouse databases for structured data
  • Cloud storage: file storage platforms for documents and datasets
  • Business applications: SaaS platforms for business data (CRM, accounting, CMMS)
  • File uploads: direct file imports in various formats


PostgreSQL

Category: Relational Database
Status: Generally Available (GA)
Authentication: Username/Password

Capabilities:

  • Live query execution
  • Full SQL support
  • Read and write operations
  • Schema introspection
  • Table and view access

Connection Requirements:

  • Host/IP address
  • Port (default: 5432)
  • Database name
  • Username
  • Password
  • SSL support (optional)

Use Cases:

  • Production databases
  • Data warehouses
  • Application databases
  • Analytics databases

Documentation: PostgreSQL Integration Guide
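The connection fields above combine into the standard libpq-style connection URL; a minimal sketch (the host, database, and user names are illustrative, and Querri collects these fields through its UI rather than as a URL):

```python
from urllib.parse import quote_plus

def postgres_url(host, port, database, user, password, sslmode=None):
    """Build a standard PostgreSQL connection URL from the fields above.

    The password is percent-encoded so special characters survive."""
    url = f"postgresql://{user}:{quote_plus(password)}@{host}:{port}/{database}"
    if sslmode:
        # Maps to the optional SSL support listed in the requirements.
        url += f"?sslmode={sslmode}"
    return url

postgres_url("db.example.com", 5432, "analytics", "querri_ro", "p@ss/word", sslmode="require")
# → "postgresql://querri_ro:p%40ss%2Fword@db.example.com:5432/analytics?sslmode=require"
```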


MySQL

Category: Relational Database
Status: Generally Available (GA)
Authentication: Username/Password

Capabilities:

  • Live query execution
  • Full SQL support
  • Read and write operations
  • Schema introspection
  • Table and view access

Connection Requirements:

  • Host/IP address
  • Port (default: 3306)
  • Database name
  • Username
  • Password
  • SSL support (optional)

Use Cases:

  • Web application databases
  • E-commerce platforms
  • Content management systems
  • Legacy systems

Documentation: MySQL Integration Guide


Microsoft SQL Server (MSSQL)

Category: Relational Database
Status: Generally Available (GA)
Authentication: Username/Password or Windows Authentication

Capabilities:

  • Live query execution
  • T-SQL support
  • Read and write operations
  • Schema introspection
  • Stored procedure execution

Connection Requirements:

  • Host/IP address
  • Port (default: 1433)
  • Database name
  • Username
  • Password
  • Instance name (if applicable)

Use Cases:

  • Enterprise databases
  • Microsoft ecosystem integration
  • Business intelligence
  • Corporate data warehouses

Documentation: MSSQL Integration Guide
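For reference, the fields above map onto a standard ODBC connection string; a sketch assuming ODBC Driver 18 (server and database names are illustrative, not Querri's API):

```python
def mssql_conn_str(host, database, user, password, port=1433, instance=None):
    """Assemble an ODBC-style SQL Server connection string.

    With a named instance, the SQL Server Browser service resolves the
    port, so the instance name replaces the explicit port."""
    server = f"{host}\\{instance}" if instance else f"{host},{port}"
    return (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={server};DATABASE={database};UID={user};PWD={password}"
    )
```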


BigQuery

Category: Data Warehouse
Status: Generally Available (GA)
Authentication: OAuth 2.0

Capabilities:

  • Large-scale data warehouse queries
  • Standard SQL support
  • Dataset and table introspection
  • Real-time analytics
  • Petabyte-scale data processing

Connection Requirements:

  • OAuth authentication (automatic)
  • Project selection
  • Dataset and table selection

Use Cases:

  • Large-scale analytics
  • Data warehouse queries
  • Business intelligence
  • Machine learning datasets
  • Log analytics

Permissions Required:

  • BigQuery Data Viewer
  • BigQuery Job User
  • Read access to specific datasets

Documentation: BigQuery Integration Guide
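Querri handles the OAuth flow automatically; the project, dataset, and table you select combine into the fully-qualified reference used in BigQuery standard SQL (all names below are illustrative):

```python
def bq_table_id(project, dataset, table):
    """Fully-qualified table reference for BigQuery standard SQL,
    backtick-quoted because project IDs may contain hyphens."""
    return f"`{project}.{dataset}.{table}`"

query = f"SELECT COUNT(*) AS n FROM {bq_table_id('my-project', 'web_logs', 'events')}"
# query == "SELECT COUNT(*) AS n FROM `my-project.web_logs.events`"
```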


Oracle

Category: Relational Database
Status: Development
Authentication: Username/Password

Capabilities:

  • Live query execution
  • PL/SQL support
  • Read and write operations
  • Schema introspection
  • Complex query support

Connection Requirements:

  • Host/IP address
  • Port (default: 1521)
  • Service name or SID
  • Username
  • Password

Use Cases:

  • Enterprise databases
  • Legacy system integration
  • Complex transactions
  • Large-scale OLTP systems
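The service-name-versus-SID distinction in the requirements above changes the shape of the connect string; a sketch of the common shorthand forms (host and names illustrative, not Querri's API):

```python
def oracle_dsn(host, port=1521, service_name=None, sid=None):
    """Build an Oracle connect-string shorthand.

    EZConnect addresses a service as host:port/service_name; a legacy
    SID is commonly written with the older host:port:sid form."""
    if service_name:
        return f"{host}:{port}/{service_name}"
    if sid:
        return f"{host}:{port}:{sid}"
    raise ValueError("either service_name or sid is required")
```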

Snowflake

Category: Data Warehouse
Status: Development
Authentication: Username/Password

Capabilities:

  • Cloud data warehouse queries
  • Standard SQL support
  • Multi-cluster compute
  • Data sharing
  • Time travel queries

Connection Requirements:

  • Account identifier
  • Username
  • Password
  • Warehouse name
  • Database name
  • Schema name (optional)

Use Cases:

  • Cloud data warehousing
  • Analytics at scale
  • Data lake integration
  • BI and reporting
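The account identifier in the requirements above determines the hostname Snowflake connects to; a minimal sketch (the identifier and parameter values are illustrative):

```python
def snowflake_host(account_identifier):
    """Snowflake derives the connection hostname from the account identifier."""
    return f"{account_identifier}.snowflakecomputing.com"

# The remaining required fields travel as ordinary connection parameters:
params = {
    "user": "QUERRI_RO",          # illustrative
    "warehouse": "ANALYTICS_WH",  # warehouse name
    "database": "SALES",          # database name
    "schema": "PUBLIC",           # optional
}
```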

Amazon Redshift

Category: Data Warehouse
Status: Development
Authentication: Username/Password

Capabilities:

  • PostgreSQL-compatible queries
  • Columnar storage optimization
  • Massive parallel processing
  • Data warehouse analytics

Connection Requirements:

  • Cluster endpoint
  • Port (default: 5439)
  • Database name
  • Username
  • Password

Use Cases:

  • AWS data warehouse
  • Large-scale analytics
  • BI dashboards
  • Data lake queries

Google Drive

Category: Cloud Storage
Status: Generally Available (GA)
Authentication: OAuth 2.0

Capabilities:

  • File browsing
  • File download
  • Folder access
  • Google Sheets import
  • Google Docs export
  • Automatic format detection

Supported File Types:

  • CSV, Excel, JSON, Parquet
  • Google Sheets (auto-converted)
  • Text files
  • Compressed files (ZIP)

Use Cases:

  • Team shared files
  • Google Workspace integration
  • Spreadsheet data
  • Collaborative datasets

Permissions Required:

  • Read access to files
  • Drive file metadata

Note: The Google Drive connector uses the drive.file scope, which grants access only to files the user explicitly selects through the Google Picker UI.


QuickBooks

Category: Accounting
Status: Generally Available (GA)
Authentication: OAuth 2.0

Capabilities:

  • Financial data access
  • Report generation
  • Customer and vendor data
  • Invoice and payment data
  • Account balances

Available Data:

  • Profit & Loss statements
  • Balance sheets
  • Sales reports
  • Expense reports
  • Customer lists
  • Invoice details
  • Payment history

Use Cases:

  • Financial reporting
  • Revenue analysis
  • Expense tracking
  • Customer analytics

Documentation: QuickBooks Integration Guide


HubSpot

Category: CRM/Marketing
Status: Generally Available (GA)
Authentication: OAuth 2.0 or API Key

Capabilities:

  • Contact data
  • Company records
  • Deal pipeline
  • Marketing analytics
  • Email performance
  • Form submissions

Available Data:

  • Contacts and companies
  • Deals and opportunities
  • Marketing campaigns
  • Email metrics
  • Landing page performance
  • Sales pipeline

Use Cases:

  • Sales analytics
  • Marketing ROI
  • Lead tracking
  • Customer journey analysis

Documentation: HubSpot Integration Guide


Fluke eMaint

Category: CMMS (Computerized Maintenance Management System)
Status: Generally Available (GA)
Authentication: API Key

Capabilities:

  • Work order management
  • Asset tracking
  • Assignment data
  • Maintenance records
  • Equipment history

Connection Requirements:

  • API Key
  • Base URL (eMaint instance URL)

Available Data:

  • Work orders
  • Assets and equipment
  • Maintenance schedules
  • Technician assignments
  • Work order history

Use Cases:

  • Maintenance tracking
  • Asset lifecycle management
  • Work order analytics
  • Equipment performance analysis
  • Preventive maintenance scheduling

Documentation: Fluke eMaint Integration Guide


CSV Upload

Category: File Upload
Status: Generally Available (GA)
Authentication: None (direct upload)

Capabilities:

  • Automatic type inference
  • Header detection
  • Delimiter detection (comma, tab, semicolon)
  • Large file support
  • Encoding detection (UTF-8, Latin-1)

Features:

  • Handles quoted fields
  • Multi-line values support
  • Custom delimiter specification
  • Skip rows option

Best Practices:

  • Include header row
  • Consistent data types per column
  • Use standard delimiters
  • UTF-8 encoding recommended
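Querri's detection logic is its own, but the delimiter-detection idea described above can be illustrated with Python's stdlib csv.Sniffer (the sample data is made up):

```python
import csv
import io

# A semicolon-delimited export with decimal commas, as a European
# spreadsheet might produce (illustrative data).
sample = "id;name;amount\n1;Alice;10,5\n2;Bob;20,0\n"

# Restricting the candidate set keeps the sniffer from mistaking the
# decimal comma for the field delimiter.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t")
rows = list(csv.reader(io.StringIO(sample), dialect))
# dialect.delimiter == ";", rows[0] == ["id", "name", "amount"]
```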

Excel Upload

Category: File Upload
Status: Generally Available (GA)
Authentication: None (direct upload)

Capabilities:

  • Multi-sheet support
  • Header detection
  • Format preservation
  • Formula evaluation
  • Date/number format detection

Features:

  • Select specific sheets
  • Skip header rows
  • Handle merged cells
  • Preserve formatting

Supported Versions:

  • Excel 2007+ (XLSX)
  • Excel 97-2003 (XLS)

Limitations:

  • Formulas evaluated to values
  • Macros not supported
  • Charts not imported

JSON Upload

Category: File Upload
Status: Generally Available (GA)
Authentication: None (direct upload)

Capabilities:

  • Nested object handling
  • Array of objects import
  • Schema inference
  • Flattening options

Supported Formats:

  • JSON array of objects
  • JSON Lines (JSONL)
  • Nested JSON with flattening

Features:

  • Automatic structure detection
  • Nested field access
  • Array expansion

Best Practices:

  • Use array of objects format
  • Consistent object structure
  • Avoid deep nesting (>3 levels)
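Flattening nested objects into tabular columns, as described above, can be sketched like this (the dotted column-naming scheme is an assumption for illustration, not necessarily Querri's):

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts into dotted column names, one dot per
    level of nesting, as a tabular import might."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(value, name))
        else:
            out[name] = value
    return out

record = json.loads('{"id": 1, "customer": {"name": "Acme", "region": "EU"}}')
flatten(record)
# → {"id": 1, "customer.name": "Acme", "customer.region": "EU"}
```

Deeply nested input multiplies column names quickly, which is why the best practices above suggest staying under three levels.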

Parquet Upload

Category: File Upload
Status: Generally Available (GA)
Authentication: None (direct upload)

Capabilities:

  • Columnar format support
  • Schema preservation
  • Compressed file handling
  • Large file support
  • Type preservation

Features:

  • Fast import performance
  • Efficient storage
  • Full schema metadata
  • Partition support

Use Cases:

  • Data warehouse exports
  • Analytics datasets
  • Large-scale data transfer
  • Machine learning data

Connector Status Levels

Generally Available (GA):

  • Fully supported and tested
  • Production-ready
  • Full feature set available
  • Covered by support SLA

Development:

  • Functional but under active development
  • May have limited features
  • Use in production with caution
  • Subject to changes

Coming Soon:

  • Planned for future release
  • Not yet available
  • Contact support for early access

| Connector      | Read | Write | Live Query | Sync | Auth Type   |
| -------------- | ---- | ----- | ---------- | ---- | ----------- |
| PostgreSQL     | ✓    | ✓     | ✓          | -    | Credentials |
| MySQL          | ✓    | ✓     | ✓          | -    | Credentials |
| MSSQL          | ✓    | ✓     | ✓          | -    | Credentials |
| BigQuery       | ✓    | -     | ✓          | -    | OAuth       |
| Oracle         | ✓    | ✓     | ✓          | -    | Credentials |
| Snowflake      | ✓    | -     | ✓          | -    | Credentials |
| Redshift       | ✓    | -     | ✓          | -    | Credentials |
| Google Drive   | ✓    | -     | -          | -    | OAuth       |
| QuickBooks     | ✓    | -     | -          | ✓    | OAuth       |
| HubSpot        | ✓    | -     | -          | ✓    | OAuth       |
| Fluke eMaint   | ✓    | -     | -          | -    | API Key     |
| CSV Upload     | ✓    | -     | -          | -    | None        |
| Excel Upload   | ✓    | -     | -          | -    | None        |
| JSON Upload    | ✓    | -     | -          | -    | None        |
| Parquet Upload | ✓    | -     | -          | -    | None        |

OAuth Connectors (Google Drive, BigQuery, QuickBooks, HubSpot)

  1. Navigate to Settings → Connectors
  2. Select the connector type
  3. Click “Connect” or “Authorize”
  4. Sign in to the external service in the popup window
  5. Review and grant requested permissions
  6. The popup closes automatically
  7. Connector is now active

Note: OAuth tokens are securely stored and automatically refreshed.

Database Connectors (Username/Password)

  1. Click “Add Connector” in Data Sources
  2. Select database type
  3. Enter connection details:
    • Host/IP address
    • Port
    • Database name
    • Username
    • Password
  4. Test connection
  5. Save connector

Security: Credentials are encrypted at rest and in transit.

File Uploads

  1. Click “Upload File” or drag-and-drop
  2. Select file(s) from computer
  3. Querri auto-detects format and structure
  4. Confirm or adjust settings
  5. Import data

Limits: Individual file size up to 500MB (configurable)
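A pre-upload check against the 500MB default limit can be sketched as (the function name is illustrative, not Querri's API):

```python
import os

MAX_UPLOAD_BYTES = 500 * 1024 * 1024  # the default 500MB limit noted above

def within_upload_limit(path, limit=MAX_UPLOAD_BYTES):
    """Check a file against the upload size limit before sending it."""
    return os.path.getsize(path) <= limit
```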


Test database connectors before using them:

  • Click “Test Connection” during setup
  • Verifies credentials and network access
  • Checks permissions
  • Returns success/error message
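The network-access part of such a test can be approximated with a plain TCP check; a sketch (this verifies reachability only, not credentials or permissions, and is not how Querri itself tests):

```python
import socket

def can_reach(host, port, timeout=3.0):
    """Rough network-level check: can we open a TCP socket to the
    database host and port?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```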

Update credentials without recreating connector:

  1. Go to Data Sources
  2. Find connector
  3. Click “Edit” or settings icon
  4. Update credentials
  5. Test and save

Remove connectors you no longer need:

  1. Go to Data Sources
  2. Find connector
  3. Click “Delete” or trash icon
  4. Confirm removal

Note: Projects using the connector will need to be updated.


Credential Storage:

  • All credentials encrypted at rest (AES-256)
  • Encrypted in transit (TLS 1.2+)
  • Stored in secure credential vault
  • Not accessible to other users

OAuth Tokens:

  • Securely stored per user
  • Automatically refreshed
  • Revocable from external service
  • Scoped to minimum required permissions

Access Control:

  • Connectors owned by creating user
  • Shareable within organization (optional)
  • Admin oversight available
  • Audit logging for compliance

Network Security:

  • Database connections use SSL/TLS when available
  • IP Whitelisting: whitelist Querri’s IP address (18.189.33.77) for enhanced security
  • VPN/SSH tunnel support (enterprise)
  • No credentials logged or exposed
  • Encrypted connections for all OAuth flows

For Databases:

  • Use read-only credentials when possible
  • Create dedicated database user for Querri
  • Limit access to specific schemas/tables

For Cloud Storage:

  • Organize files in dedicated folders
  • Use consistent file naming
  • Regular cleanup of old files

For Business Apps:

  • Review required permissions
  • Use service accounts for shared access
  • Monitor API usage limits

Database Queries:

  • Use indexed columns in filters
  • Limit result set size
  • Schedule heavy queries during off-peak

File Uploads:

  • Compress large files (ZIP)
  • Use Parquet for large datasets
  • Clean data before upload

API Connections:

  • Cache frequently accessed data
  • Respect rate limits
  • Use pagination for large datasets

Ongoing Maintenance:

  • Regularly test connector health
  • Update credentials before expiration
  • Remove unused connectors
  • Monitor usage and performance
  • Review security logs
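The pagination advice above can be sketched as a generic offset loop (fetch_page is a hypothetical stand-in for whatever API call a connector makes):

```python
def paginate(fetch_page, page_size=100):
    """Generic offset pagination: keep requesting pages until a short
    page signals the end of the dataset.

    fetch_page(offset, limit) must return a list of records."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        yield from page
        if len(page) < page_size:
            break
        offset += page_size
```

Fetching in fixed-size pages keeps each request small, which also makes it easier to stay under an API's rate limits.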

Connection Failed (Databases):

  • Verify host/IP and port are correct
  • Check firewall rules allow connection
  • Confirm credentials are correct
  • Verify database user has required permissions
  • Check SSL/TLS requirements

OAuth Authorization Failed:

  • Clear browser cache and retry
  • Check popup blockers
  • Verify service account has permissions
  • Re-authorize from external service
  • Contact administrator for org-level permissions

File Upload Errors:

  • Check file size limits
  • Verify file format is supported
  • Ensure file is not corrupted
  • Check for special characters in filename
  • Try different browser

API Connection Issues:

  • Verify endpoint URL is correct
  • Check API key/token is valid
  • Confirm API is not rate-limited
  • Review required headers
  • Check request format

Don’t see the connector you need?

Contact Support:

  • Email: support@querri.com
  • In-app: Help → Request Feature
  • Provide: Service name, use case, API documentation

Enterprise Custom Connectors:

  • Contact sales for custom connector development
  • Internal API integration
  • Legacy system connections
  • Specialized data sources