Precisely what we see all the time.
Hi, thanks for your answer.
So you don’t know of any workaround? I was thinking of testing the records a few seconds after creation and deleting/recreating them if incomplete. However, this won’t always work. It looks like there is some cache involved, because even on a second try Airtable seems to download a corrupted file even though the original one is correct.
Any info on the correction of this bug?
For one client in the UK I developed a process that waits several minutes and retries up to 6 times, and even this has generally failed. Occasionally it works, but the success rate is simply not good enough.
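For reference, that retry loop can be sketched roughly like this. It is a minimal, API-agnostic sketch: `attach` and `verify` are hypothetical callables standing in for the PATCH call and the follow-up read that checks whether the attachment actually survived — they are not real Airtable SDK functions.

```python
import time

def attach_with_retry(attach, verify, max_attempts=6, wait_seconds=60):
    """Try to attach a file, re-checking afterwards, up to max_attempts times.

    `attach` performs the upload (e.g. a PATCH to the Airtable REST API);
    `verify` re-reads the record and returns True if the attachment survived.
    Both are passed in as callables so this sketch stays API-agnostic.
    Returns the attempt number that succeeded, or None if all attempts failed.
    """
    for attempt in range(1, max_attempts + 1):
        attach()
        time.sleep(wait_seconds)  # give Airtable time to fetch and store the file
        if verify():
            return attempt
    return None  # gave up; the upload never stuck
```

As the thread notes, even a loop like this fails often in practice — retrying only helps when the failure is transient.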
Airtable has been quiet about this issue.
This issue is still a problem today. Does the Airtable team have any info on the roadmap to fix it?
Well, I have been struggling with this issue since December 2019.
I have exchanged dozens of emails with Airtable support and still have no visibility on a roadmap to solve it. I even reproduced the error with Postman and shared it with the support team.
It’s really breaking one of our Airtable applications, as it is impossible to reliably attach files through the API.
Airtable is an excellent product for manually editing databases, but if you work on automation through the API, you may want to look for another solution.
Yep - my findings as well. Here’s more detail on my tests…
I was having an issue with this. If you give the API an invalid attachment url… rather than returning an error, the API accepts the request. Then when you look at the history of the record, it confusingly shows you uploading and deleting an attachment.
Yep - this is also a poor aspect of the API. But, even accounting for bad URLs, good URLs fail as well from time-to-time.
It’s more nuanced than this. The history is reflecting a process that failed to make a copy of the document at the provided URL. To be clear: the process takes your URL, makes a temporary copy of the document, and then makes a permanent copy, which amounts to pushing it into the Airtable CDN (as far as I know).
It is this final step that seems to fail for some good URLs and all bad URLs.
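Because the API returns success even when that final copy fails, the only way I know to detect it is to re-read the record afterwards and inspect the attachment objects. A rough health check, assuming the attachment format the REST API returns (objects with `id`, `size`, `type`, and for images a `thumbnails` key — treat those key names as assumptions):

```python
def attachment_ok(record_fields, field_name):
    """Return True if the attachment field looks healthy after an upload.

    Expects the `fields` dict of a record as returned by the Airtable REST
    API. A processed attachment object carries an Airtable-issued `id` and a
    `size`; a missing field or empty list means the upload was silently
    dropped, and a missing size is one hint the copy never completed.
    """
    attachments = record_fields.get(field_name) or []
    if not attachments:
        return False  # the upload was silently dropped
    return all(a.get("id") and a.get("size") for a in attachments)
```

A check like this is only a heuristic, but it at least turns the silent failure into something a script can react to.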
I am facing the same problem while uploading files and this is very annoying.
Before blindly trying every possible solution, I would like to check whether someone has found a workaround for this:
- Does the file location make a difference? (I am using Google Drive but could host the files on S3 or self-host if that solves the problem.)
- Are custom apps, scripts, and the API all subject to this intermittent failure?
- Any other suggestions?
@Bill.French, I know you ran a lot of experiments on this problem. Are you still using your retry strategy to handle this intermittent error?
Yes, but retries typically fail as well. I have 30 clients who regularly see this issue bubble up from time-to-time.
On my side I tried reducing the size of the pictures before upload, but this made no significant difference.
However, I observed a strange behaviour: when I add a bunch of records with attachments through the API, many of them end up with broken attachments (a picture icon without a thumbnail and a broken data link).
If I delete these records and retry adding the same records through the API, most of them will be OK. I suspect that once Airtable has cached the link, things go better.
Retries could be a good strategy, but the silence from Airtable on this issue is really disappointing.
For the record, this wasn’t an issue a while ago; it became totally broken in December 2019 (I have Airtable bases showing this chronology…)
As do I. I reported this initially on Dec 19, 2019 and to this day, the errors are intermittent and difficult to reproduce unless the platform is in a state of mind to fail.
@Valerian_Lebert, @Bill.French, @Florian_Verdonck, I can report that I do also observe similar very strange behavior with Airtable attachments seeming to be uploaded, and then automatically being deleted seconds later. This happens regardless of which application is being used to drive the API, whether third party integrators like Zapier/Integromat, Airtable automation or Airtable script.
I tried @Florian_Verdonck’s method with Airtable automations (thanks for the tip, @Florian_Verdonck!), and it suffers from the same issue. Worse, it gets stuck in an infinite loop: the automation keeps adding the attachment, the attachment gets deleted seconds later, and so it just keeps going and going.
What I have found does work, though, is to wait for some time, e.g. half an hour or an hour, and then to process the records in bulk using the Convert URLs to attachments script. For some reason there seems to be some throttling of attachment upload bandwidth, which is why one needs to wait a while.
Any further updates from Airtable on this issue?
On a side note, I am curious as to whether you all have found a way to add in the file names when doing the uploads either via automations or via Airtable scripts.
I am not sure whether this issue occurs regardless of the source of the file.
My feeling is that it happens more often with Google URLs, but that is just a feeling; I never ran real tests on it.
You could try creating a function that “proxies” the source file, to check whether it increases the success rate.
I would suggest two options for this:
Without code: create an Integromat scenario with a webhook as trigger, a Download File module, and then return a webhook response with the file in the body (so you return the file FROM Integromat, not from the original source).
Low-code: create a Pipedream (or AWS Lambda, etc.) function that does the same thing (probably cheaper at high volumes).
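The low-code option can be sketched as a small handler that fetches the source file and returns it in the response body, so Airtable downloads it from your function’s URL instead of the original host. This is only a sketch: the `event`/response shape below mimics an AWS Lambda behind an HTTP API and would need adapting for Pipedream or other platforms.

```python
import base64
import urllib.request

def handler(event):
    """Minimal proxy sketch: fetch the file at event['url'] and return it
    in the response body. The response shape (statusCode / headers /
    isBase64Encoded / body) follows the AWS Lambda HTTP API convention;
    other platforms will want a different return format.
    """
    with urllib.request.urlopen(event["url"]) as resp:
        data = resp.read()
        content_type = resp.headers.get("Content-Type", "application/octet-stream")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": content_type},
        "isBase64Encoded": True,
        "body": base64.b64encode(data).decode("ascii"),
    }
```

You would then give Airtable the function’s public URL (with the real source URL as a parameter) instead of the original link.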
Side note: yes, it is possible in scripts and in automations (with a script block). The “Update record” action does not allow specifying the filename (it always takes the original filename, I think).
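On the filename point: as far as I can tell, each attachment object in a REST API update accepts an optional `filename` next to the `url`, which overrides the name Airtable would otherwise derive from the URL (the scripting equivalent passes the same objects to `table.updateRecordAsync`). A small helper to build that PATCH body — the field name is yours, everything else follows the documented attachment write format:

```python
import json

def attachment_payload(field_name, url, filename=None):
    """Build the JSON PATCH body for attaching a file via the Airtable
    REST API. Each attachment object takes a `url`; adding a `filename`
    overrides the name derived from the URL.
    """
    attachment = {"url": url}
    if filename:
        attachment["filename"] = filename
    return json.dumps({"fields": {field_name: [attachment]}})
```

So a custom filename should be possible wherever you control the attachment object yourself, even if the built-in automation action doesn’t expose it.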
Thanks for the information. I didn’t know the Convert URLs to attachments trick.
I just tested it, and I get the same errors. See the attached screenshot: some empty attachment fields and some “broken pictures”:
From my point of view, this bug has improved in general (more attachments get uploaded without errors than before). But maybe that is also because I changed part of my process (resizing the pictures to smaller files).
However, it is still a pain that attachment upload is not 100% reliable.
Missing attachments can be fixed with your script, as the script will retry uploading them. But the “broken” ones (where you don’t see the thumbnail) are really difficult to handle.
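To separate the retryable cases from the hard ones, a script can roughly classify each record’s attachment state. This is a sketch based on the attachment format the REST API returns; the heuristic that a processed image without a `thumbnails` key is “broken” is an assumption drawn from this thread, not documented behaviour:

```python
def classify(record_fields, field_name):
    """Classify a record's attachment state:
    - 'missing': no attachment objects at all -> safe to re-attach the URL
    - 'broken':  an image attachment is present but has no thumbnails ->
                 likely needs deleting before re-attaching
    - 'ok':      everything looks processed
    Key names assume the Airtable REST API attachment format.
    """
    attachments = record_fields.get(field_name) or []
    if not attachments:
        return "missing"
    if any(a.get("type", "").startswith("image/") and not a.get("thumbnails")
           for a in attachments):
        return "broken"
    return "ok"
```

A repair script could then re-attach URLs for the “missing” records and clear the field first for the “broken” ones.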
I did - it’s clearly more likely with Google-based resources, but it is not limited to Google URLs. I’ve proven this with a variety of endpoints suggesting the flaw is in Airtable.
My approach has an additional retry test to ensure it doesn’t go on forever. This sort of API behaviour will negatively impact only your users, not Airtable themselves.
I have seen this issue with the Airtable API, but have performed only a few tests with the SDK, so that avenue is inconclusive for me.
Indeed, but let’s be clear - waiting does not remedy the issue in some unknown way. Waiting only influences when attempts are made. Running these processes earlier is no different than running them later. It’s like a slot machine - do you think waiting a half hour will change the bet outcome?
Attachment download inconsistent
And BTW, the polarity of this title seems inverted. The issue is really “uploading”; ergo, creating a new attachment is an upload process into Airtable’s CDN. I tend to think of my local file system in terms of “downloads”, in the same manner that browsers place files retrieved from the Interwebs in a “Downloads” folder.
I think this is not accurate. We have mounds of evidence in log files showing that retries rarely work.
Just to be clear: We are only talking about uploads via API/Script, not manual uploads?
So, not sure if this helps, but: in my base I am crawling/indexing corporate annual reports from 1000 companies. I noticed that the PDF upload to the attachment field sometimes fails. Weirdly enough, it seems to fail only for a specific set of companies, i.e. domain names. But for those, it fails for multiple files…
Could there be a systemic issue at the level of the domain, in the sense that the domain doesn’t allow Airtable to download the file unless the request comes through the frontend? In what way could it make a difference that Airtable tries to download the file when triggered by the API, in contrast to when a frontend user enters the URL of the attachment? It doesn’t sound very plausible, but isn’t it weird that out of hundreds of URLs, I have 40 that don’t work, and these 40 are all related to only about 20 different domains?