Hi everyone,
I'm looking for advice on implementing a nightly data sync workflow.
My Setup:
- CSV file with 25,000 records that needs to sync to an Airtable table
- Currently using the Sync API, which has a 10,000 record limit
- Need to perform a complete refresh (delete all + re-sync) every night
The Challenge: Since my dataset is 2.5x the sync limit, I need to send the data in at least 3 batches. However, I'm struggling to find the right approach to:
- Clear the entire table before syncing
- Successfully sync all 25,000 records in multiple batches
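For context, here is roughly the flow I have in mind, sketched against the regular Records API rather than the Sync API (the Records API has no 10,000-record sync cap, just a 10-records-per-write limit and a 5-requests-per-second rate limit). The base ID, table name, and token are placeholders for my real setup:

```python
"""Sketch of a nightly full refresh via Airtable's Records API
(delete every record, then re-create from the fresh CSV)."""
import json
import time
import urllib.parse
import urllib.request

API_ROOT = "https://api.airtable.com/v0"
BATCH_SIZE = 10          # Airtable accepts at most 10 records per write call
REQUEST_INTERVAL = 0.25  # stay under the 5-requests/second base rate limit


def chunked(items, size=BATCH_SIZE):
    """Yield successive fixed-size batches from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def _request(url, token, method="GET", payload=None):
    """Send one authenticated JSON request and return the decoded body."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(url, data=data, method=method, headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def list_record_ids(base, table, token):
    """Page through the table (100 records per page) and collect every ID."""
    ids, offset = [], None
    while True:
        query = {"pageSize": 100, **({"offset": offset} if offset else {})}
        url = f"{API_ROOT}/{base}/{table}?{urllib.parse.urlencode(query)}"
        page = _request(url, token)
        ids += [rec["id"] for rec in page["records"]]
        offset = page.get("offset")
        if not offset:
            return ids


def full_refresh(base, table, token, rows):
    """Delete every existing record, then re-create from fresh CSV rows."""
    # 1. Clear the table, 10 record IDs per DELETE request.
    for batch in chunked(list_record_ids(base, table, token)):
        qs = urllib.parse.urlencode([("records[]", rid) for rid in batch])
        _request(f"{API_ROOT}/{base}/{table}?{qs}", token, method="DELETE")
        time.sleep(REQUEST_INTERVAL)
    # 2. Re-create all rows, 10 records per POST request.
    for batch in chunked([{"fields": row} for row in rows]):
        _request(f"{API_ROOT}/{base}/{table}", token,
                 method="POST", payload={"records": batch})
        time.sleep(REQUEST_INTERVAL)
```

At 10 records per request and 5 requests/second, the 25,000-record re-create alone is 2,500 requests (roughly 10 minutes with the sleep above), which is why I'm not sure raw batching is the right tool here.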
What I've Tried:
- Various sync configuration settings
- Setting up a webhook-triggered automation to delete all records before syncing
- Different batching strategies
My Question: Has anyone successfully implemented a nightly sync workflow for CSV data that exceeds the 10,000 record sync limit? I'm open to alternative approaches if the Sync API isn't the right tool for this use case.
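One alternative I've been weighing instead of delete-all + re-create: the Web API's upsert support (PATCH with performUpsert), which merges on a key field so the nightly job only writes changes. Here's the request-body shape as I understand it, assuming the CSV and the table share a unique "Key" column (that field name is hypothetical; upsert requests are also capped at 10 records each):

```python
def upsert_payload(batch, key_field="Key"):
    """Build the body for PATCH /v0/{baseId}/{tableIdOrName} with
    performUpsert: records matching on key_field are updated in place,
    unmatched records are created. batch must hold at most 10 rows."""
    return {
        "performUpsert": {"fieldsToMergeOn": [key_field]},
        "records": [{"fields": row} for row in batch],
    }
```

This avoids the delete step entirely, though it wouldn't remove rows that disappear from the CSV, so I'd still need some cleanup pass.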
Any guidance or examples would be greatly appreciated!
Thanks in advance!
