No, I wouldn’t recommend using the Sync API for that, because the Sync API has a ton of limitations. For example, you can’t manually edit or delete any of the records after they are imported into Airtable.
For your needs, I would highly recommend using Make’s advanced CSV automations alongside Make’s Airtable automations.
I demonstrate how simple this is to set up in this Airtable podcast episode.
If you’ve never used Make before, I’ve assembled a bunch of Make training resources in this thread.
However, Make has a limit of 40 minutes for each one of its automations, and I don’t think you’ll be able to get through all 25,000 records in 40 minutes.
So you’ll probably need to create 3 or more automations that all do the exact same thing, but in each automation, you would use Make’s “filtering” function to only process certain row numbers.
So let’s say you had 4 automations. The 1st automation would only process rows 1 through 6,250. The 2nd automation would only process rows 6,251 through 12,500. And so on.
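If it helps to sanity-check the split, here’s a quick sketch of the arithmetic. This is plain TypeScript, nothing Make-specific, and the function name is just for illustration:

```typescript
// Hypothetical helper: split a row count evenly across N automations.
// The numbers below reproduce the 4-automation example above.
function rowRanges(totalRows: number, parts: number): [number, number][] {
  const size = Math.ceil(totalRows / parts);
  return Array.from({ length: parts }, (_, i): [number, number] => [
    i * size + 1,
    Math.min((i + 1) * size, totalRows),
  ]);
}

console.log(rowRanges(25_000, 4));
// -> [[1, 6250], [6251, 12500], [12501, 18750], [18751, 25000]]
```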
Make takes a little bit of time to learn, so you might not figure it all out on the first setup, but check out my training resources above, and you’ll be able to figure it out in no time.
Hope this helps!
If you have a budget and you’d like to hire the best Airtable consultant to help you with this or anything else that is Airtable-related, please feel free to contact me through my website: Airtable consultant — ScottWorld
Hm, I don’t think that’s actually possible with the Sync API because of the 10k row limit you mentioned: you either end up being unable to delete the old records or being unable to upload new ones.
How automated does this need to be? If you’re okay with running a bash script once a day, you could hire someone to write that for you; it probably wouldn’t take more than half an hour, and you could just let it run in the background. ChatGPT can sort you out real quick too if you’re comfortable with that. The end result would be a script that you run in a folder containing the 25k-row CSV file, and it would:
- Delete all the records in the table
- Chunk the 25k-row CSV file into batches of 10 records per API call. The rate limit is 5 requests per second; assume you use 4 of those, so 40 records per second, and 25,000 ÷ 40 ≈ 625 seconds, or about 11 minutes to run (rough sketch below)
You could make it more automated, but then you’re looking at a lot more complexity/cost.
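For what it’s worth, here’s a rough sketch of what that script could look like. I’ve written it as Node/TypeScript (Node 18+ for the global `fetch`) rather than bash since the chunking is easier to read that way, but it’s the same idea. The base ID, table ID, and file name are placeholders, and the CSV parsing is deliberately naive; don’t treat this as production code:

```typescript
// delete-and-reload.ts — sketch of the once-a-day script described above.
// Assumes the CSV's header row matches the Airtable field names.
import { readFileSync } from "node:fs";

const TOKEN = process.env.AIRTABLE_TOKEN!;
const BASE = "appXXXXXXXXXXXXXX"; // placeholder base ID
const TABLE = "tblXXXXXXXXXXXXXX"; // placeholder table ID
const API = `https://api.airtable.com/v0/${BASE}/${TABLE}`;
const HEADERS = { Authorization: `Bearer ${TOKEN}`, "Content-Type": "application/json" };

// Stay under the 5 req/s rate limit by pausing ~250 ms between calls (~4 req/s).
const pause = () => new Promise<void>((r) => setTimeout(r, 250));

async function listAllRecordIds(): Promise<string[]> {
  const ids: string[] = [];
  let offset: string | undefined;
  do {
    const url = new URL(API);
    url.searchParams.set("pageSize", "100");
    if (offset) url.searchParams.set("offset", offset);
    const page = await (await fetch(url, { headers: HEADERS })).json();
    ids.push(...page.records.map((r: { id: string }) => r.id));
    offset = page.offset;
    await pause();
  } while (offset);
  return ids;
}

async function deleteAll(): Promise<void> {
  const ids = await listAllRecordIds();
  // The REST API deletes at most 10 records per request.
  for (let i = 0; i < ids.length; i += 10) {
    const url = new URL(API);
    for (const id of ids.slice(i, i + 10)) url.searchParams.append("records[]", id);
    await fetch(url, { method: "DELETE", headers: HEADERS });
    await pause();
  }
}

async function createFromCsv(path: string): Promise<void> {
  // Naive CSV parsing — fine for simple files; use a real CSV library
  // if your data contains quoted commas or newlines.
  const [header, ...rows] = readFileSync(path, "utf8").trim().split("\n");
  const fields = header.split(",");
  const records = rows.map((row) => {
    const cells = row.split(",");
    return { fields: Object.fromEntries(fields.map((f, i) => [f, cells[i]])) };
  });
  // The REST API creates at most 10 records per request.
  for (let i = 0; i < records.length; i += 10) {
    await fetch(API, {
      method: "POST",
      headers: HEADERS,
      body: JSON.stringify({ records: records.slice(i, i + 10) }),
    });
    await pause();
  }
}

await deleteAll();
await createFromCsv("data.csv");
```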
Thank you both.
Adam, I think you’re correct. The Sync API docs are ambiguous to me, but it reads like you can’t sync more than 10k records unless you use the setting to retain previous records. The issue, of course, is that there’s no way to delete those existing records via the API.
To confirm, the only approach you think will work is:
- Use `Create records` endpoint to write records (10 recs at a time)
- Use `Delete multiple records` endpoint to delete (10 recs at a time)
Yeah, with the deletes also going 10 records per call, that’s ~5,000 requests total at 4 per second, so it’d actually take more like 22 minutes to run, sorry
Actually, you could also try a Run a Script action that runs via an Airtable automation to delete the records for you. Trigger it via a webhook that the bash script hits, and then:
- Run a Script action to grab all the records in the table
- Chunk them into arrays of 1k records each
- Use a repeating group on that output with a Run a Script step to delete them, 1k per iteration (we have to do this to get around the 30-second time limit on each Run a Script step; see the sketches below)
The downside to this is that you’d need to experiment with how long it takes. I reckon just give it a 5-minute window before the bash script starts creating records and you should be good?
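Here’s roughly what those two Run a Script steps could look like. The table name and the `idChunk` input variable are placeholders, and note that the scripting API’s `deleteRecordsAsync` only accepts 50 record IDs per call, so each 1k chunk gets worked through in slices of 50:

```typescript
// Run a Script #1: grab every record ID and output it in chunks.
// "Imports" is a placeholder table name — swap in your own.
const table = base.getTable("Imports");
const query = await table.selectRecordsAsync({ fields: [] }); // IDs only
const ids = query.records.map((r) => r.id);

// Comma-join each 1k chunk so the output is a plain list of strings,
// which the repeating group can iterate over.
const chunks: string[] = [];
for (let i = 0; i < ids.length; i += 1000) {
  chunks.push(ids.slice(i, i + 1000).join(","));
}
output.set("chunks", chunks);
```

And then each repeating-group iteration would run something like:

```typescript
// Run a Script #2: runs once per repeating-group iteration.
// "idChunk" is an input variable mapped to the current list item.
const idChunk = input.config().idChunk as string;
const ids = idChunk.split(",");

const table = base.getTable("Imports");
// deleteRecordsAsync accepts at most 50 record IDs per call.
for (let i = 0; i < ids.length; i += 50) {
  await table.deleteRecordsAsync(ids.slice(i, i + 50));
}
```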