This section provides performance data and benchmark comparisons for our core platform and components. Our goal is to be transparent about these numbers and to improve them continuously.
| Endpoint | Average Latency (ms) | p95 Latency (ms) | Requests/sec |
|---|---|---|---|
| /api/auth/login | 45 | 120 | 1,500 |
| /api/data/query | 80 | 250 | 800 |
| /api/automation/run | 150 | 400 | 500 |
| /api/users/profile | 35 | 95 | 2,000 |
All benchmarks were run on our standard production environment with a simulated load of 10,000 concurrent users.
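To illustrate how latency figures like these can be collected, here is a minimal sketch of a serial probe in TypeScript (Node 18+, which provides global `fetch` and `performance`). The `BASE_URL`, sample count, and probed path are placeholders, and a serial loop does not reproduce the 10,000-user concurrent load used above; our actual benchmarks use a dedicated load generator.

```ts
// Minimal latency probe: hits one endpoint SAMPLES times, reports avg and p95.
// BASE_URL and the probed path are placeholders, not our production values.
const BASE_URL = "https://api.example.com";
const SAMPLES = 200;

async function probe(path: string): Promise<void> {
  const latencies: number[] = [];
  for (let i = 0; i < SAMPLES; i++) {
    const start = performance.now();
    const res = await fetch(`${BASE_URL}${path}`);
    await res.arrayBuffer(); // drain the body so we time the full response
    latencies.push(performance.now() - start);
  }
  latencies.sort((a, b) => a - b);
  const avg = latencies.reduce((sum, v) => sum + v, 0) / latencies.length;
  const p95 = latencies[Math.floor(latencies.length * 0.95)];
  console.log(`${path}: avg=${avg.toFixed(1)}ms p95=${p95.toFixed(1)}ms`);
}

await probe("/api/users/profile");
```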
We use Lighthouse to measure the performance of our front-end components.
| Component | Performance Score | First Contentful Paint (FCP) | Time to Interactive (TTI) |
|---|---|---|---|
| Data Grid | 92 | 1.1s | 2.5s |
| Dashboard | 88 | 1.4s | 3.1s |
| Forms | 95 | 0.9s | 2.1s |
| Marketing Page | 98 | 0.7s | 1.8s |
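Lighthouse exposes these same metrics programmatically through its Node API. The sketch below assumes the `lighthouse` and `chrome-launcher` npm packages and a placeholder URL; it is a minimal example of collecting the performance score, FCP, and TTI for a single page, not our full test harness.

```ts
// Run a headless Lighthouse performance audit against one page.
// The URL is a placeholder; requires `lighthouse` and `chrome-launcher`.
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
const result = await lighthouse("https://app.example.com/dashboard", {
  port: chrome.port,
  output: "json",
  onlyCategories: ["performance"],
});

if (result) {
  const { lhr } = result;
  // Lighthouse reports the category score on a 0-1 scale; scale to 100.
  console.log("Performance score:", (lhr.categories.performance.score ?? 0) * 100);
  console.log("FCP (ms):", lhr.audits["first-contentful-paint"].numericValue);
  console.log("TTI (ms):", lhr.audits["interactive"].numericValue);
}

await chrome.kill();
```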
The table below compares our automation engine against other popular open-source solutions.
| Workflow Type | Our Platform (avg. time) | Competitor A (avg. time) | Competitor B (avg. time) |
|---|---|---|---|
| Data Sync (10k records) | 1.2 min | 1.8 min | 2.1 min |
| ETL Process | 3.5 min | 4.2 min | 5.0 min |
| Scheduled Report | 0.8 min | 1.1 min | 1.3 min |
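Workflow timings of this kind can be gathered with a simple wall-clock harness, sketched below in TypeScript. The `runWorkflow` function is a hypothetical stand-in for an engine's trigger-and-await entry point, not a real API of our platform or the competitors.

```ts
// Wall-clock benchmark harness: averages workflow duration over several runs.
async function runWorkflow(name: string): Promise<void> {
  // Hypothetical placeholder: trigger the named workflow and await completion.
}

async function benchmark(name: string, runs = 5): Promise<number> {
  let totalMs = 0;
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    await runWorkflow(name);
    totalMs += performance.now() - start;
  }
  return totalMs / runs / 60_000; // average duration, in minutes
}

console.log("Data Sync avg (min):", await benchmark("data-sync-10k"));
```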
Our benchmarks are automated and run nightly against the main branch. The infrastructure is provisioned on AWS, using c5.large instances for our backend services and a standard browser environment for front-end tests.
We are committed to improving these numbers and will update this page as new optimizations are released. For more details on our testing methodology, please see our testing repository.