When working with large amounts of data (thousands or tens of thousands of rows), SyncSheets uses its bulk processing engine to handle the sync efficiently without timing out or overloading your server.
How Bulk Processing Works
Instead of processing all data in a single request, the bulk processor breaks the job into smaller batches:
- Batch Size: 500 rows per batch by default.
- Max Execution: Each batch runs for up to 25 seconds before pausing.
- Async Processing: Batches are processed in the background using Action Scheduler or loopback HTTP requests.
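The batching behavior described above can be sketched as follows. The function names and the row-handling callback are illustrative, not part of the SyncSheets codebase; only the 500-row batch size and the 25-second time budget come from the defaults listed here:

```python
import time

BATCH_SIZE = 500    # default rows per batch (see above)
TIME_BUDGET = 25.0  # seconds a batch may run before pausing

def make_batches(rows, batch_size=BATCH_SIZE):
    """Split a list of rows into fixed-size batches."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

def process_batch(batch, handle_row, time_budget=TIME_BUDGET):
    """Process rows until done or the time budget is exhausted.

    Returns the rows that were NOT processed, so the caller can
    reschedule them in a later background run.
    """
    start = time.monotonic()
    for i, row in enumerate(batch):
        if time.monotonic() - start > time_budget:
            return batch[i:]  # pause: hand the remainder back
        handle_row(row)
    return []  # batch finished within the budget
```

Returning the unprocessed remainder is what lets the background runner (Action Scheduler or a loopback request) pick the job up again without redoing work.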
When to Use Bulk Processing
Bulk processing is automatically used when:
- You are exporting more than 1,000 rows of data.
- You trigger a bulk export or import via the REST API.
- Your data provider returns a large dataset that exceeds normal processing limits.
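The conditions above amount to a simple predicate. The function and parameter names below are illustrative only; the 1,000-row threshold is the one stated in this section:

```python
BULK_ROW_THRESHOLD = 1000  # rows above which bulk processing kicks in

def should_use_bulk(row_count, via_bulk_api=False, exceeds_provider_limit=False):
    """Return True when a job should go through the bulk processor."""
    return (
        row_count > BULK_ROW_THRESHOLD
        or via_bulk_api                # bulk export/import via the REST API
        or exceeds_provider_limit     # provider dataset over normal limits
    )
```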
Monitoring Progress
You can monitor bulk job progress through:
- The admin UI, which shows the current processing status.
- The REST API endpoint:
GET /wp-json/syncsheets/v1/bulk/status/{job_id}
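A minimal status check might look like the sketch below. The site URL, the lack of authentication, and the shape of the JSON response are assumptions for illustration; only the endpoint path comes from this documentation:

```python
import json
import urllib.request

def status_url(site, job_id):
    """Build the bulk status endpoint URL for a given job."""
    return f"{site.rstrip('/')}/wp-json/syncsheets/v1/bulk/status/{job_id}"

def fetch_status(site, job_id, opener=urllib.request.urlopen):
    """GET the job status and decode the JSON body.

    The fields in the response (e.g. status, processed, total) are a
    hypothetical shape, not a documented contract.
    """
    with opener(status_url(site, job_id)) as resp:
        return json.load(resp)
```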
Canceling a Bulk Job
To cancel a running bulk job, send a POST request to the cancel endpoint:
POST /wp-json/syncsheets/v1/bulk/cancel/{job_id}
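A cancel request can be built as shown below. Authentication is site-specific; the Bearer token header is only an example of how credentials might be attached, not a documented requirement:

```python
import urllib.request

def cancel_request(site, job_id, token=None):
    """Build a POST request for the bulk cancel endpoint.

    The Authorization header here is a placeholder; use whatever
    authentication your WordPress REST API setup requires.
    """
    url = f"{site.rstrip('/')}/wp-json/syncsheets/v1/bulk/cancel/{job_id}"
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    return urllib.request.Request(url, method="POST", headers=headers)
```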
Configuration
You can adjust the batch size in the job settings (Step 5 of the wizard). A smaller batch size reduces server load per request but takes longer to complete overall. The default of 500 rows per batch works well for most hosting environments.
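The tradeoff can be quantified: fewer rows per batch means more background batches (and more scheduling overhead), while larger batches raise per-request load. A quick way to estimate the batch count for a job:

```python
import math

def batch_count(total_rows, batch_size):
    """Number of background batches a job will be split into."""
    return math.ceil(total_rows / batch_size)
```

For example, a 10,000-row export is split into 20 batches of 500 rows each, but 100 batches if the batch size is lowered to 100.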