Jun 29, 2018 01:37 PM
Hello,
I’m trying to sanely bulk update a set of records from a source to Airtable. Mechanically, I have it working, but I’m thinking that I need to employ some better practices for managing these updates.
Here’s the current state:
Data
Process/Script
The problem
As written, a number of async requests get issued and we run into conflicts, and then data doesn’t get written. I want the sync to be bulletproof so we don’t run into data-loss issues or have to do a lot of additional QA. I mitigated this by setting a timeout for the first request. I’m thinking of two ways to improve this:
Do other folks have code samples or examples of doing a similar bulk process where you have a lot of updates on records? My main goal is to have confidence that data is loading to Airtable.
Additionally, as an aside, a bulk option for updates would be great so that we can just send the record payload or send it in chunks and get back a summary response when the operation completes.
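For context, the kind of serialization I have in mind looks roughly like this. This is a dependency-free sketch, not the actual script; `updateRecord` is a hypothetical stand-in for the Airtable client’s update call, and the chunk size is arbitrary:

```javascript
// Hypothetical stand-in for the Airtable client's update call.
async function updateRecord(record) {
  // In the real script this would be an API request.
  return { id: record.id, updated: true };
}

// Process records one chunk at a time instead of firing every
// request at once, so only `chunkSize` requests are ever in flight.
async function bulkUpdate(records, chunkSize = 10) {
  const results = [];
  for (let i = 0; i < records.length; i += chunkSize) {
    const chunk = records.slice(i, i + chunkSize);
    // Wait for the whole chunk to settle before starting the next one.
    const settled = await Promise.all(chunk.map(updateRecord));
    results.push(...settled);
  }
  return results;
}
```

That bounds concurrency, but it doesn’t enforce a requests-per-second cap on its own, which is where a dedicated rate limiter helps.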
Jun 29, 2018 01:54 PM
Check out this comment regarding how to work around the API limits:
Jun 29, 2018 01:59 PM
Oh, I’m going to give that a shot; it looks like it may solve the main problem for me! Thanks.
Update: I was able to quickly refactor using bottleneck and added some logging via Winston for good measure. It seems to be working great! Thanks again for the tip. I poked around but missed that post.
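For anyone landing here later, the core pattern bottleneck gives you is a scheduler that spaces calls a minimum interval apart (with Airtable’s 5-requests-per-second limit, roughly 200 ms). This is a simplified dependency-free sketch of that idea, not bottleneck’s actual implementation or API:

```javascript
// Simplified sketch of min-interval scheduling: each scheduled call
// waits `minTime` ms after the previous one settles before running.
function makeLimiter(minTime) {
  let last = Promise.resolve();
  return function schedule(fn) {
    const run = last
      .then(() => new Promise((resolve) => setTimeout(resolve, minTime)))
      .then(fn);
    // Chain regardless of success so one failed call doesn't stall the queue.
    last = run.catch(() => {});
    return run;
  };
}

// Usage sketch: wrap each update call in the limiter, e.g.
//   const schedule = makeLimiter(200);
//   await Promise.all(ids.map((id) => schedule(() => updateRecord(id))));
```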