Enterprise Data Synchronization Made Simple
Automate data transfer between databases, APIs, Google Sheets, and more with real-time monitoring and enterprise security.
Limited beta spots available · Free credits for early adopters · Production-ready platform
See Kusi in Action
Production-ready platform with enterprise-grade monitoring and real-time observability
Recent Job Runs
| Job Name | Status | Rows | Duration | Started |
|---|---|---|---|---|
| Salesforce → PostgreSQL Contacts sync | Running | 1,247 / 2,500 | 00:02:34 | 2 min ago |
| Google Sheets → BigQuery Marketing data | Success | 8,432 | 00:01:12 | 15 min ago |
| Typeform → PostgreSQL Survey responses | Success | 234 | 00:00:45 | 1 hour ago |
| MySQL → Snowflake Orders data | Failed | 0 | 00:00:08 | 2 hours ago |
| API → PostgreSQL Customer events | Success | 15,678 | 00:03:21 | 3 hours ago |
Real-time monitoring with WebSocket updates, job metrics, and system health
Real-Time Updates
Watch jobs execute row-by-row with WebSocket monitoring
Comprehensive Metrics
Track success rates, performance, and credit consumption
Data Observability
Monitor freshness, detect drift, and ensure data quality
Everything you need for data synchronization
Powerful features to automate your data pipelines with confidence
Multi-Source Integration
Connect to PostgreSQL, MySQL, Redshift, Snowflake, BigQuery, Google Sheets, Salesforce, Typeform, REST APIs, CSV, Excel, and more. SSH tunnel support for secure connections to private databases.
Real-Time Monitoring
Track job execution with live WebSocket updates showing row-by-row progress. Monitor performance metrics, credit consumption, and system health from a comprehensive dashboard with detailed analytics.
Enterprise Security
Production-grade security with JWT authentication, optional 2FA (TOTP), comprehensive audit logs, role-based access control (RBAC), security headers, and rate limiting.
Automated Scheduling
Schedule jobs with flexible cron expressions (every minute to yearly). Execute multiple tasks in parallel with automatic retry, dependency management, field mapping, and conflict detection.
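Cron matching of this kind can be sketched in a few lines; the function below handles only the minute field (`*`, `*/N` steps, and comma lists) and is an illustration, not Kusi's actual scheduler:

```python
def minute_matches(field: str, minute: int) -> bool:
    """Check whether a minute (0-59) matches one cron minute field.

    Supports "*", step values like "*/15", and comma lists like "0,30".
    A simplified sketch, not Kusi's real scheduler.
    """
    for part in field.split(","):
        if part == "*":
            return True
        if part.startswith("*/"):
            if minute % int(part[2:]) == 0:
                return True
        elif minute == int(part):
            return True
    return False

# "*/15" fires at minutes 0, 15, 30, 45
print([m for m in range(60) if minute_matches("*/15", m)])  # [0, 15, 30, 45]
```

A full cron expression adds fields for hour, day of month, month, and day of week, each matched the same way.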
How Data Flows Through Kusi
Understand the complete journey of your data from source to destination
Data Sources
Kusi Engine
Data Destinations
Flexible Transfer Modes
Full Sync
Replace all data at destination with fresh copy from source. Perfect for snapshots and reports.
Incremental
Only sync new or modified records based on timestamp. Minimal credit usage for large datasets.
Upsert
Insert new records and update existing ones based on primary key. Best for maintaining live data.
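The upsert mode maps directly onto SQL's insert-or-update support. A minimal sketch using SQLite's `ON CONFLICT` clause as a stand-in for the destination database (table and column names are hypothetical):

```python
import sqlite3

# Upsert sketch: insert new rows, update existing rows matched by primary key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO contacts VALUES (1, 'old@example.com')")

# Incoming batch: row 1 already exists, row 2 is new
incoming = [(1, "new@example.com"), (2, "fresh@example.com")]
conn.executemany(
    """INSERT INTO contacts (id, email) VALUES (?, ?)
       ON CONFLICT (id) DO UPDATE SET email = excluded.email""",
    incoming,
)
print(conn.execute("SELECT * FROM contacts ORDER BY id").fetchall())
# [(1, 'new@example.com'), (2, 'fresh@example.com')]
```

Row 1 is updated in place and row 2 is inserted, so the destination stays live without ever being truncated.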
Enterprise-Grade Data Pipeline Features
Understanding Jobs & Tasks
Learn how Kusi organizes and executes your data synchronization workflows
What is a Job?
A Job is a collection of related data synchronization tasks that run together on a schedule. Think of it as a workflow container that orchestrates multiple data movements.
Scheduled Execution
Jobs run on cron schedules (e.g., every hour or daily at 2 am) or can be triggered manually
Multiple Tasks
Each job contains one or more tasks that can run in parallel or sequentially
Run History
Every execution creates a Job Run record with metrics and logs
What is a Task?
A Task is a single data transfer operation from one source to one destination. Tasks define what data to move, how to transform it, and where it goes.
Task Components:
Task Settings:
Job Execution Lifecycle
Example: Daily Sales Pipeline Sync
Execution Results:
Credit Usage:
Parallel Execution
Tasks within a job run simultaneously for maximum speed and efficiency
Smart Failure Handling
Individual task failures don't stop other tasks. Auto-retry with exponential backoff.
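Exponential backoff is straightforward to sketch; the function and defaults below are illustrative assumptions, not Kusi's internal retry API:

```python
import time

def run_with_retry(task, max_attempts=4, base_delay=1.0):
    """Retry a failing task with exponential backoff.

    Delays double on each failure: base_delay, 2x, 4x, ...
    Names and defaults here are assumptions for illustration.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...
```

Because each task retries independently, a transient failure in one transfer never blocks its siblings.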
Real-Time Progress
WebSocket updates show live progress, row counts, and status for each task
Transparent Usage Tracking
Built-in credit system tracks resource consumption. See exactly what each job costs.
Credit Formula: Credits = (Rows × 0.00001) + (Seconds × 0.001) + (Bytes × 0.00000001)
Rows Processed
Each data row transferred between sources and destinations
Compute Time
Total job execution time measured in seconds
Data Transferred
Total data size transferred in bytes
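The formula is easy to sanity-check yourself. A small sketch applying it to a hypothetical run (the numbers below are made up for illustration):

```python
ROW_RATE = 0.00001      # credits per row transferred
SECOND_RATE = 0.001     # credits per second of compute
BYTE_RATE = 0.00000001  # credits per byte transferred

def credits(rows: int, seconds: float, size_bytes: int) -> float:
    """Apply the published credit formula to one job run."""
    return rows * ROW_RATE + seconds * SECOND_RATE + size_bytes * BYTE_RATE

# Hypothetical run: 10,000 rows, 60 s of compute, 5 MB transferred
print(round(credits(10_000, 60, 5_000_000), 4))  # 0.21
```

Here rows contribute 0.1 credits, compute 0.06, and transfer 0.05, so row count usually dominates the bill.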
Example: Salesforce to BigQuery Sync
Job Details:
Credit Calculation:
Resource Tracking
Automatic tracking of rows processed, compute time, and data transferred per job.
Cost Visibility
See exactly what each job costs in credits before and after execution.
Usage Analytics
Built-in dashboard shows credit consumption trends and top resource consumers.
Powerful API Integration
Trigger jobs, monitor progress, and manage data pipelines programmatically with our REST API
Get Started with API Keys
Generate API keys from your dashboard to authenticate requests. Keys can be scoped to specific organizations and permissions.
cURL

```shell
curl -X POST http://localhost:8000/api/v1/jobs/run \
  -H "Authorization: Bearer your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "job_id": "your-job-id",
    "trigger": "manual"
  }'
```

Python

```python
import requests

api_key = "your-api-key-here"
base_url = "http://localhost:8000/api/v1"

# Trigger a job
response = requests.post(
    f"{base_url}/jobs/run",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"job_id": "your-job-id", "trigger": "manual"},
)
print(f"Job started: {response.json()}")
```

JavaScript

```javascript
const apiKey = "your-api-key-here";
const baseUrl = "http://localhost:8000/api/v1";

// Trigger a job
const response = await fetch(`${baseUrl}/jobs/run`, {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${apiKey}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    job_id: "your-job-id",
    trigger: "manual"
  })
});
const data = await response.json();
console.log("Job started:", data);
```

Complete API Coverage
Job Management
Create, update, delete, and trigger jobs programmatically
Real-Time Status
Query job run status, progress, and metrics via REST or WebSocket
Connection CRUD
Manage data sources and destinations through the API
Usage Analytics
Track credit consumption, performance metrics, and execution history
Full OpenAPI specification at http://localhost:8000/api/v1/openapi.json
Interactive docs available at /docs (Swagger) and /redoc
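A client polling run status over REST needs to know which states are terminal. A minimal sketch of that decision, assuming a `{"status": ...}` response shape (consult the OpenAPI spec above for the real schema):

```python
# Terminal states assumed from the job statuses shown on the dashboard
TERMINAL_STATES = {"success", "failed", "cancelled"}

def should_keep_polling(run: dict) -> bool:
    """Return True while a job run is still in flight.

    The response shape is an assumption for illustration only.
    """
    return run.get("status", "").lower() not in TERMINAL_STATES

print(should_keep_polling({"status": "running"}))  # True
print(should_keep_polling({"status": "Success"}))  # False
```

For live progress, the WebSocket channel avoids polling entirely; this helper is only needed on the REST path.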
Webhook Notifications
Configure webhooks to receive real-time notifications when jobs complete, fail, or hit specific thresholds.
Job Events
job.started, job.completed, job.failed, job.cancelled
Alert Triggers
Threshold alerts, credit warnings, connection failures
Retry Logic
Exponential backoff with configurable retry attempts
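A receiving endpoint typically verifies each delivery and dispatches on the event name. In the sketch below, the HMAC-SHA256 signature scheme and the handler actions are assumptions for illustration; only the event names come from the list above:

```python
import hashlib
import hmac
import json

def handle_webhook(payload: bytes, signature: str, secret: str) -> str:
    """Verify a webhook delivery and dispatch on its event name.

    The signature scheme and handler actions are illustrative assumptions.
    """
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        raise ValueError("bad signature")  # reject forged deliveries
    event = json.loads(payload)["event"]
    handlers = {
        "job.completed": lambda: "notify success channel",
        "job.failed": lambda: "page the on-call engineer",
    }
    return handlers.get(event, lambda: "ignore")()

# Example delivery with a hypothetical shared secret
body = json.dumps({"event": "job.failed"}).encode()
sig = hmac.new(b"shh", body, hashlib.sha256).hexdigest()
print(handle_webhook(body, sig, "shh"))  # page the on-call engineer
```

Verifying the signature before parsing the body keeps forged or replayed deliveries from triggering alerts.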