System Requirements
This reference outlines the minimum and recommended system requirements for deploying and running Querri. Requirements vary based on deployment size, usage patterns, and expected user load.
Quick Reference
Minimum Requirements (Small Deployment)
- CPU: 4 cores
- RAM: 8 GB
- Storage: 50 GB
- OS: Linux (Ubuntu 20.04+, Debian 11+, or RHEL 8+)
- Docker: 24.0+
- Docker Compose: 2.20+
Recommended Requirements (Production)
- CPU: 8+ cores
- RAM: 16+ GB
- Storage: 100+ GB SSD
- OS: Linux (Ubuntu 22.04 LTS recommended)
- Docker: Latest stable
- Network: Dedicated server with static IP
Hardware Requirements
CPU (Processor)
Minimum
- Cores: 4 physical cores or 8 vCPUs
- Architecture: x86_64 (AMD64)
- Clock Speed: 2.0 GHz+
Recommended for Production
- Cores: 8-16 physical cores or 16-32 vCPUs
- Architecture: x86_64 (AMD64)
- Clock Speed: 2.5 GHz+
- Type: Modern Intel Xeon or AMD EPYC processors
Scaling Considerations
- Small deployment (1-10 users): 4-8 cores
- Medium deployment (10-50 users): 8-16 cores
- Large deployment (50-200 users): 16-32 cores
- Enterprise (200+ users): 32+ cores with horizontal scaling
CPU-Intensive Operations:
- Data processing and transformations
- AI model inference
- Chart rendering
- Background job processing
- Database queries
Memory (RAM)
Minimum
- Size: 8 GB
- Type: DDR4 or newer
- Use Case: Development, testing, very light production
Recommended for Production
- Size: 16-32 GB
- Type: DDR4 or DDR5
- ECC: Recommended for production reliability
Scaling Considerations
- Small deployment (1-10 users): 8-16 GB
- Medium deployment (10-50 users): 16-32 GB
- Large deployment (50-200 users): 32-64 GB
- Enterprise (200+ users): 64+ GB
Memory Breakdown (Typical Medium Deployment):
- MongoDB: 4-8 GB
- Redis: 2-4 GB
- Server API (4 replicas): 4-8 GB
- Hub Service: 1-2 GB
- Web App: 1-2 GB
- Supporting Services: 1-2 GB
- Operating System: 2-4 GB
- Buffer: 2-4 GB
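To see how a running deployment compares against this breakdown, per-container memory usage can be snapshotted with docker stats. A minimal sketch; container names depend on your compose project and service names:
# Snapshot of per-container memory usage (names vary with your compose setup)
docker stats --no-stream --format "table {{.Name}}\t{{.MemUsage}}\t{{.MemPerc}}"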
Memory-Intensive Operations:
- Large dataset processing
- In-memory caching
- Concurrent user sessions
- AI model operations
- Database indices
Storage
Minimum
- Size: 50 GB
- Type: SSD preferred
- IOPS: 1,000+
Recommended for Production
- Size: 100+ GB SSD (NVMe preferred)
- IOPS: 3,000+ sustained
- Throughput: 100+ MB/s
- Redundancy: RAID 1 or RAID 10
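To verify a volume meets these IOPS targets before go-live, a quick random-read benchmark with fio is one option. A minimal sketch, assuming fio is installed and /data is the mount point you plan to use (both assumptions; adjust paths and sizes for your environment):
# 4K random-read benchmark against the data volume (creates temporary 1 GB test files)
fio --name=iops-test --directory=/data --rw=randread --bs=4k --size=1g \
    --numjobs=4 --iodepth=32 --direct=1 --runtime=30 --time_based --group_reporting
# Remove the test files afterwards
rm -f /data/iops-test.*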
Storage Breakdown
System and Application (10-20 GB):
- Docker images
- Application code
- Operating system
- Logs
Database (Variable):
- MongoDB data files
- Database indices
- Growth rate depends on usage
File Storage (Variable):
- Uploaded files (if FILE_STORAGE=LOCAL)
- Temporary files
- Exports and reports
Logs and Backups (10-20 GB):
- Application logs
- Database backups
- Archive data
Storage Scaling Guidelines
Expected Data Growth:
- Light usage: 5-10 GB/month
- Moderate usage: 20-50 GB/month
- Heavy usage: 100+ GB/month
Storage Recommendations by User Count:
- 1-10 users: 50-100 GB
- 10-50 users: 100-500 GB
- 50-200 users: 500 GB - 2 TB
- 200+ users: 2+ TB
Storage Type Recommendations:
- Development: Standard SSD
- Production: NVMe SSD
- High-performance: NVMe RAID 10
- Archive: Standard HDD (secondary storage)
External Storage: For production deployments, consider Amazon S3 or equivalent:
- Unlimited scalability
- Reduced local storage requirements
- Better disaster recovery
- Cost-effective for large datasets
Operating System
Supported Operating Systems
Linux (Recommended)
Ubuntu:
- Minimum: Ubuntu 20.04 LTS
- Recommended: Ubuntu 22.04 LTS
- Notes: Best tested, most documentation available
Debian:
- Minimum: Debian 11 (Bullseye)
- Recommended: Debian 12 (Bookworm)
Red Hat Enterprise Linux / CentOS:
- Minimum: RHEL 8 / Rocky Linux 8
- Recommended: RHEL 9 / Rocky Linux 9
Amazon Linux:
- Supported: Amazon Linux 2023
- Notes: Optimized for AWS deployments
macOS (Development Only)
Supported:
- macOS 12 (Monterey) or newer
- Docker Desktop for Mac required
Notes:
- Suitable for development and testing
- Not recommended for production
- Performance may vary
Windows (Development Only)
Supported:
- Windows 10/11 Pro or Enterprise
- Windows Server 2019/2022
- WSL 2 (Windows Subsystem for Linux) required
Notes:
- Requires Docker Desktop for Windows
- WSL 2 backend required
- Development and testing only
- Not recommended for production
OS Configuration
Kernel Version:
- Linux kernel 5.4+ (5.15+ recommended)
File System:
- ext4 (recommended)
- XFS (for large databases)
- Avoid NFS for database storage
System Limits:
# Increase file descriptor limits
ulimit -n 65536
# In /etc/security/limits.conf
* soft nofile 65536
* hard nofile 65536
Swap:
- Minimum: 2 GB
- Recommended: Equal to RAM for systems with <16GB RAM
- Recommended: 8-16 GB for systems with >16GB RAM
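If the host has no swap configured, a swap file can be added without repartitioning. A minimal sketch for an 8 GB swap file; size it according to the guidance above:
# Create and enable an 8 GB swap file
sudo fallocate -l 8G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
# Persist the swap file across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab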
Docker Requirements
Docker Engine
Minimum Version: 24.0.0
Recommended Version: Latest stable (25.0+)
Installation:
# Ubuntu/Debian
curl -fsSL https://get.docker.com | sh
# Verify installation
docker --version
Configuration:
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  },
  "storage-driver": "overlay2"
}
Docker Compose
Minimum Version: 2.20.0
Recommended Version: Latest stable (2.24+)
Installation:
# Usually included with Docker Desktop
# For Linux servers:
sudo apt-get install docker-compose-plugin
# Verify installation
docker compose version
Docker Resource Allocation
Minimum:
- CPU: 4 cores allocated to Docker
- Memory: 8 GB allocated to Docker
- Disk: 50 GB
Recommended:
- CPU: 8+ cores allocated
- Memory: 16+ GB allocated
- Disk: 100+ GB
Docker Desktop Settings (macOS/Windows):
- Settings → Resources
- Allocate at least 8 GB RAM
- Allocate at least 4 CPU cores
- Allocate at least 50 GB disk
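On any platform, you can confirm how much CPU and memory the Docker engine actually sees with docker info:
# CPUs and total memory visible to the Docker engine
docker info --format 'CPUs: {{.NCPU}}  Memory: {{.MemTotal}} bytes'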
Network Requirements
Required Ports (must be open):
| Port | Service | Purpose | Exposure |
|---|---|---|---|
| 80 | HTTP | Web application | Public |
| 443 | HTTPS | Secure web application | Public |
| 8080 | Traefik Dashboard | Reverse proxy admin | Internal only |
Internal Ports (Docker network only):
- 27017: MongoDB
- 6379: Redis
- 8000: FastAPI server
- 5173: SvelteKit dev server (development)
Firewall Configuration
Inbound Rules:
# Allow HTTP
sudo ufw allow 80/tcp
# Allow HTTPS
sudo ufw allow 443/tcp
# Allow SSH (for management)
sudo ufw allow 22/tcp
# Enable firewall
sudo ufw enable
Outbound Rules:
- Allow all outbound (default)
- Required for:
- Package updates
- AI API calls (OpenAI/Azure)
- External integrations
- Email sending
- License validation
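A quick way to confirm outbound HTTPS works before deployment is to probe the external endpoints the platform depends on. A minimal sketch; the exact endpoint list depends on which providers you enable (api.workos.com and api.openai.com are covered later in this reference):
# Check outbound HTTPS reachability to required external services
for host in api.workos.com api.openai.com; do
  curl -sS -o /dev/null -w "$host: HTTP %{http_code}\n" "https://$host" || echo "$host: unreachable"
done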
Bandwidth
Minimum:
- Download: 10 Mbps
- Upload: 5 Mbps
Recommended:
- Download: 100 Mbps
- Upload: 50 Mbps
Enterprise:
- Download: 1 Gbps
- Upload: 500 Mbps
Bandwidth Usage Estimates:
- Per user session: 1-5 Mbps
- File uploads: Variable (depends on file size)
- AI requests: 0.1-1 Mbps per request
- Background sync: 1-10 Mbps
Domain Requirements
Required:
- Fully qualified domain name (FQDN)
- DNS A record pointing to server IP
- SSL/TLS certificate (Let’s Encrypt recommended)
Examples:
- app.yourcompany.com
- querri.yourcompany.com
- analytics.yourcompany.com
Wildcard Support (optional):
- *.querri.yourcompany.com for multi-tenant deployments
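Once the A record and certificate are in place, both can be verified from the command line. A sketch using the example hostname above:
# Confirm the A record resolves to your server's public IP
dig +short app.yourcompany.com
# Inspect the certificate's validity window
echo | openssl s_client -connect app.yourcompany.com:443 -servername app.yourcompany.com 2>/dev/null \
  | openssl x509 -noout -dates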
External Service Dependencies
Required Services
WorkOS (Authentication)
Purpose: User authentication and SSO
Requirement: Active WorkOS account
Pricing: Varies by plan
Setup: https://workos.com
Network Requirements:
- Outbound HTTPS to api.workos.com
- OAuth callback URL accessible
AI Provider (OpenAI or Azure OpenAI)
Purpose: AI-powered data analysis and chat
Option 1: OpenAI
- Requirement: OpenAI API key
- Pricing: Pay-per-use
- Setup: https://platform.openai.com
Option 2: Azure OpenAI
- Requirement: Azure subscription with OpenAI access
- Pricing: Varies by region
- Setup: Azure Portal
Network Requirements:
- Outbound HTTPS to api.openai.com (OpenAI)
- Outbound HTTPS to your Azure endpoint (Azure OpenAI)
API Rate Limits:
- Consider your usage tier
- Monitor quota and rate limits
- Plan for peak usage
Optional Services
Stripe (Billing)
Purpose: Subscription billing and payments
Requirement: Stripe account (if using billing)
Setup: https://stripe.com
SendGrid (Email)
Purpose: Transactional emails and notifications
Requirement: SendGrid account (if using email)
Alternative: SMTP server
Setup: https://sendgrid.com
Amazon S3 (File Storage)
Purpose: Scalable file storage
Requirement: AWS account (if using S3 storage)
Alternative: Local storage
Setup: AWS Console
Monitoring Services (Optional)
- Sentry: Error tracking
- Segment: Analytics
- Userflow: User onboarding
Browser Compatibility
Supported Browsers
Recommended:
- Google Chrome 120+
- Microsoft Edge 120+
- Safari 17+
- Firefox 121+
Minimum:
- Chrome 100+
- Edge 100+
- Safari 15+
- Firefox 100+
Mobile Browsers:
- Chrome Mobile (Android)
- Safari Mobile (iOS 15+)
Not Supported:
- Internet Explorer (any version)
- Chrome <100
- Safari <15
Browser Features Required
- JavaScript enabled (required)
- Cookies enabled (required)
- WebSockets support
- Local storage support
- Modern CSS (Grid, Flexbox)
Database Sizing
MongoDB Sizing
Working Set:
- Should fit in RAM for optimal performance
- Estimate: (Number of projects × 10 MB) + (Number of files × average file size)
Index Size:
- Approximately 10-20% of data size
- Must fit in RAM for best performance
Example Sizing:
- Small (100 projects, 1,000 files): 2-4 GB database
- Medium (1,000 projects, 10,000 files): 20-40 GB database
- Large (10,000 projects, 100,000 files): 200-400 GB database
Recommendations:
- 8 GB RAM for small deployments
- 16 GB RAM for medium deployments
- 32+ GB RAM for large deployments
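To compare actual data and index sizes against available RAM, db.stats() can be run through mongosh. A minimal sketch, assuming MongoDB is reachable on the default port and the database is named querri (both assumptions; adjust the connection string for your deployment), then check the dataSize and indexSize fields in the output:
# Report database statistics scaled to GB (database name and port are assumptions)
mongosh --quiet "mongodb://localhost:27017/querri" --eval "db.stats(1024 * 1024 * 1024)"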
Scaling Guidelines
Vertical Scaling
When to scale vertically:
- Single server not fully utilized
- Simpler management
- Cost-effective for small-medium deployments
Upgrade path:
- Add more RAM (easiest)
- Add faster storage (SSD → NVMe)
- Add more CPU cores
- Upgrade to faster CPUs
Horizontal Scaling
When to scale horizontally:
- Single server at capacity
- High availability required
- Geographic distribution needed
- Very large deployments (200+ users)
Architecture:
- Multiple server-api replicas (already supported)
- Load balancer (Traefik or external)
- Separate database server
- Shared storage (S3)
- Redis cluster (optional)
Deployment patterns:
- Database on dedicated server
- API servers on multiple instances
- Load balancer distributing traffic
- Shared file storage (S3 required)
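As one example of this pattern, additional API replicas can be started without editing the compose file by using --scale. A sketch, assuming the API service is named server-api in your compose project and that no fixed container name or published host port prevents multiple replicas:
# Run four API replicas behind the reverse proxy (service name is an assumption)
docker compose up -d --scale server-api=4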
Performance Benchmarks
Expected Performance
Small Deployment (4 cores, 8 GB RAM):
- Concurrent users: 10-20
- Projects: Up to 500
- Query response: <2 seconds
- File upload: 10 MB/s
Medium Deployment (8 cores, 16 GB RAM):
- Concurrent users: 50-100
- Projects: Up to 5,000
- Query response: <1 second
- File upload: 50 MB/s
Large Deployment (16 cores, 32 GB RAM):
- Concurrent users: 200-500
- Projects: Up to 50,000
- Query response: <500ms
- File upload: 100 MB/s
High Availability Considerations
For Mission-Critical Deployments
Database:
- MongoDB replica set (3+ nodes)
- Automated failover
- Geographic distribution
Application:
- Multiple server instances
- Load balancing
- Health checks
- Automatic restart
Storage:
- RAID for local storage
- S3 for high availability
- Regular backups
Network:
- Redundant internet connections
- DDoS protection
- CDN for static assets
Monitoring:
- Uptime monitoring
- Performance monitoring
- Alerting
- Log aggregation
Disaster Recovery
Backup Requirements
Database Backups:
- Frequency: Daily minimum
- Retention: 30 days
- Storage: Off-site
- Testing: Monthly restore tests
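A daily database backup can be as simple as a mongodump archive written to compressed, dated files. A minimal sketch, assuming the MongoDB service is named mongodb in your compose project and backups land in /backups (both assumptions):
# Dump the database to a dated, gzipped archive (service name and paths are assumptions)
docker compose exec -T mongodb mongodump --archive --gzip > /backups/mongo-$(date +%F).archive.gz
# Prune archives older than 30 days to match the retention policy
find /backups -name 'mongo-*.archive.gz' -mtime +30 -delete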
File Backups:
- Frequency: Daily (if LOCAL storage)
- Retention: 30 days
- Note: S3 has built-in redundancy
Configuration Backups:
- Version control for .env-prod
- Secure storage
- Document all customizations
Recovery Time Objectives
RTO (Recovery Time Objective):
- Small deployment: 4 hours
- Medium deployment: 2 hours
- Large deployment: 1 hour
- Enterprise: 15 minutes
RPO (Recovery Point Objective):
- Standard: 24 hours (daily backups)
- Enhanced: 1 hour (continuous replication)
- Critical: Near-zero (synchronous replication)
Compliance Considerations
Data Residency
- Choose appropriate AWS region (if using S3)
- Configure database location
- Ensure services comply with regulations
- Document data flows
Security Standards
- SOC 2 compliance (if required)
- GDPR compliance (for EU users)
- HIPAA compliance (for healthcare data)
- PCI DSS (if processing payments)
Audit Requirements
- Enable comprehensive logging
- Log retention policies
- Access control logging
- Regular security audits
Pre-Deployment Checklist
Hardware
- CPU meets minimum requirements (4+ cores)
- RAM meets minimum requirements (8+ GB)
- Storage meets minimum requirements (50+ GB SSD)
- Network bandwidth adequate (10+ Mbps)
Software
- Operating system installed and updated
- Docker Engine 24.0+ installed
- Docker Compose 2.20+ installed
- Required ports open (80, 443)
- Firewall configured
External Services
- Domain name registered and configured
- SSL certificate obtained (Let’s Encrypt recommended)
- WorkOS account created and configured
- AI provider configured (OpenAI or Azure)
- Optional services configured (Stripe, SendGrid, etc.)
Configuration
- .env-prod file created with all required variables
- Credentials generated and secured
- File permissions set correctly
- Docker volumes configured
Testing
- Test deployment in staging environment
- Verify all services start correctly
- Test authentication flow
- Test AI functionality
- Test file uploads
- Load testing completed (if production)
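After the stack is up, a quick pass from the command line helps confirm the basics before inviting users. A sketch, assuming the example hostname used earlier in this reference:
# All services should report a running/healthy state
docker compose ps
# Scan recent logs for startup errors
docker compose logs --tail=50
# The web application should answer over HTTPS
curl -I https://app.yourcompany.com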
Next Steps
- Complete installation following these requirements
- Configure environment variables
- Set up monitoring for production deployments
- Plan backup strategy for data protection
- Review security best practices