Bulk uploading causes tons of automation failure emails

Topic Labels: Scripting extensions
5 - Automation Enthusiast


I have a few automations that do things like set default values for new records, and one that randomly generates a value. They work fine when I add new records manually, but if I import a bulk CSV with 10,000 records I get loads of emails complaining that the automation failed. Yet when I check a few hours later, all the defaults are set and everything that needed generating has been. I think what happens is that the automations see so many records at once that, while they're working through the backlog, Airtable complains that there are too many to do.

This isn't too annoying since everything still works; I just get a load of emails to archive. The real problem is that if an automation genuinely breaks, I don't realise for a while, because I assume the failure emails were just from someone else doing a bulk upload.

In short: could Airtable make automation failure emails fire only when an automation actually fails, not just when there's a lot to do? And if possible, could the error emails contain a bit more debugging info about what went wrong?

Thanks in advance

7 Replies

Are you aware of the limitations of the CSV importer? There are some other relevant links below that article; I think they might help you confirm what exactly is happening.

5 - Automation Enthusiast

Yeah, we're already using the CSV import app instead of the standard CSV import, and I get that we're trying to add a lot of records. But it seems to be an issue that occurs whenever an automation has a lot of work to do at once, rather than with CSV imports specifically. The same thing happens when we bulk-update a field to a certain value and that update triggers an automation on each of those records.

Is it possible these automations are timing out on the server side, i.e. Airtable's side? Have you checked their logs?

5 - Automation Enthusiast

None of them use external requests, so they shouldn't be failing that way. And it's all simple stuff, like updating a record's field to X. The most complex one randomly generates two numbers and then updates a record to contain them, but that shouldn't be intensive enough to cause a timeout.

What is the trigger for your automation(s)? If it’s something like “When record updated”, it may be multi-firing depending on how each record is updating as a result of the CSV import process. It’s nice to envision a clean all-at-once update where all fields are populated simultaneously, but it’s possible that several micro-updates are happening in succession. This would lead to multiple automation triggers, with some automations being fed incomplete record data, and only one being fed the fully-updated record.

It's an "on record created" trigger. Automation 1 sets some defaults for the record; one of those defaults is a strategy. Whenever the strategy is set, another automation is triggered to populate another randomly generated field. So it shouldn't be getting invalid data in.

Sorry for the delayed reply. My gut feeling on this is to revise how the system operates. Instead of triggering each record separately, I’d be more inclined to use a script to batch-process incoming records. In fact, the script could even handle parsing of the CSV file and creation of new records (rather than using other CSV import methods). The biggest benefit would be that it would only need to run once to process everything vs an automation that runs thousands of times. If you’d like to explore scripting options, message me.
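To illustrate the batch-processing idea, here's a minimal TypeScript sketch. The field names (`Status`, `Strategy`) and the table name in the comments are placeholders I've made up, not from the original post; the one concrete Airtable fact relied on is that `createRecordsAsync` accepts at most 50 records per call, so larger imports need to be chunked.

```typescript
// Batch-create imported rows in chunks of 50 (Airtable's per-call
// limit for createRecordsAsync), so one script run replaces
// thousands of per-record automation runs.

type Fields = Record<string, unknown>;

// Split an array into chunks of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Apply the defaults that the per-record automations used to set.
// These field names and values are illustrative placeholders.
function withDefaults(row: Fields): Fields {
  return {
    Status: "New",                             // example default value
    Strategy: Math.random() < 0.5 ? "A" : "B", // example random choice
    ...row,                                    // imported CSV values win
  };
}

// Inside an Airtable scripting block you would then do roughly:
//
//   const table = base.getTable("Imports"); // hypothetical table name
//   for (const batch of chunk(rows.map(withDefaults), 50)) {
//     await table.createRecordsAsync(batch.map(fields => ({ fields })));
//   }
```

Because the defaults are applied before the records are ever created, there's no second "set defaults" automation to fire (and fail) thousands of times.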

In the meantime, I recommend talking to Airtable support directly (if you haven’t done so already) about the volume of error emails that you’re receiving using your current process.