No, I wouldn’t recommend using the Sync API for that, because it has a ton of limitations. For example, you can’t manually edit or delete any of the records after they are imported into Airtable.
For your needs, I would highly recommend combining Make’s advanced CSV automations with Make’s Airtable automations.
I demonstrate how simple this is to set up in this Airtable podcast episode.
If you’ve never used Make before, I’ve assembled a bunch of Make training resources in this thread.
However, Make caps each automation run at 40 minutes, and I don’t think you’ll be able to get through all 25,000 records in a single run.
So you’ll probably need to create 3 or more automations that all do the exact same thing, but in each automation, you would use Make’s “filtering” function to only process a certain range of row numbers.
So let’s say you had 4 automations. The 1st automation would only process rows 1 through 6,250. The 2nd automation would only process rows 6,251 through 12,500. And so on.
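If you end up needing a different number of automations, the row ranges are just simple division. Here’s a quick throwaway sketch of the arithmetic (the 25,000 and 4 are just the numbers from my example above):

```python
# Work out the row-number ranges for splitting a CSV across Make automations.
# TOTAL_ROWS and NUM_AUTOMATIONS are just the example numbers from above.
TOTAL_ROWS = 25_000
NUM_AUTOMATIONS = 4

rows_per_automation = TOTAL_ROWS // NUM_AUTOMATIONS  # 6,250 rows each
for i in range(NUM_AUTOMATIONS):
    start = i * rows_per_automation + 1
    end = (i + 1) * rows_per_automation
    print(f"Automation {i + 1}: filter rows {start} through {end}")
```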
Make takes a little bit of time to learn, so you might not figure it all out on your first attempt, but check out my training resources above, and you’ll get the hang of it in no time.
Hope this helps!
If you have a budget and you’d like to hire the best Airtable consultant to help you with this or anything else that is Airtable-related, please feel free to contact me through my website: Airtable consultant — ScottWorld
Hm, I don’t think that’s actually possible with the Sync API because of the 10k-row limit you mentioned; you’d either end up unable to delete the old records or unable to upload the new ones.
How automated does this need to be? If you’re okay with running a bash script once a day, you could hire someone to write one for you; it probably wouldn’t take more than half an hour, and you could just let it run in the background. ChatGPT can sort you out real quick too, if you’re comfortable with that. The end result would be a script (see the sketch below) that you’d run in a folder with the 25k-row CSV file, and it would:
- Delete all the records in the table
- Chunk the 25k rows into API calls of 10 records each (Airtable’s batch limit). The rate limit is 5 requests per second; assuming you run at a safer 4 requests per second, that’s 40 records per second, so the upload would take about 11 minutes.
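For what it’s worth, here’s a rough sketch of what that script could look like. I’ve used Python rather than bash since the CSV and JSON wrangling is less painful there; the base ID, table name, CSV filename, and environment variable are all placeholders you’d swap for your own:

```python
# pip install requests
import csv
import os
import time

import requests

# Placeholders: swap in your own base ID, table name, and token
# (a personal access token with read + write scopes on the base).
API_TOKEN = os.environ["AIRTABLE_TOKEN"]
BASE_ID = "appXXXXXXXXXXXXXX"
TABLE_NAME = "MyTable"
URL = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_NAME}"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

PAUSE = 0.25  # 4 requests/second, safely under Airtable's 5 req/s limit


def delete_all_records():
    """List every record in the table, then delete them 10 at a time."""
    record_ids, params = [], {}
    while True:  # the list endpoint pages through the table via "offset"
        resp = requests.get(URL, headers=HEADERS, params=params)
        resp.raise_for_status()
        data = resp.json()
        record_ids += [r["id"] for r in data["records"]]
        if "offset" not in data:
            break
        params["offset"] = data["offset"]
        time.sleep(PAUSE)

    for i in range(0, len(record_ids), 10):  # 10 IDs max per delete call
        chunk = record_ids[i:i + 10]
        resp = requests.delete(
            URL, headers=HEADERS, params=[("records[]", rid) for rid in chunk]
        )
        resp.raise_for_status()
        time.sleep(PAUSE)


def upload_csv(path):
    """Create records from the CSV, 10 per request (Airtable's batch limit)."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))  # CSV headers must match field names

    for i in range(0, len(rows), 10):
        payload = {
            "records": [{"fields": row} for row in rows[i:i + 10]],
            "typecast": True,  # let Airtable coerce CSV strings into number/date fields
        }
        resp = requests.post(URL, headers=HEADERS, json=payload)
        resp.raise_for_status()
        time.sleep(PAUSE)


if __name__ == "__main__":
    delete_all_records()
    upload_csv("data.csv")  # the 25k-row CSV sitting in the same folder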
Could make it more automated but then you’re looking at a lot more complexity / cost