Feb 26, 2020 03:02 AM
Hi
We use a script to create Airtable records through the API.
Attachments (pictures) are stored on a public web server. The URL is passed in the API request, and Airtable downloads the attachment to its own cloud storage so that it becomes available in the attachment field.
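For reference, the request pattern looks roughly like this (a simplified sketch, not our exact script; the base ID, table, field names, and URLs are placeholders):

```python
# Minimal sketch of creating a record whose attachment field is populated
# from public URLs via the Airtable REST API. All identifiers are placeholders.
import requests

BASE_ID = "appXXXXXXXXXXXXXX"
TABLE_NAME = "Products"
API_KEY = "YOUR_API_KEY"

resp = requests.post(
    f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_NAME}",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "fields": {
            "Name": "Example record",
            # Only the URLs are sent; Airtable fetches each file
            # asynchronously and copies it to its own storage.
            "Pictures": [
                {"url": "https://example.com/images/photo1.jpg"},
                {"url": "https://example.com/images/photo2.jpg"},
            ],
        }
    },
)
resp.raise_for_status()
```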
This solution worked properly for about a year.
Since December 2019 we get some “broken files”. It seems that Airtable fails to download the attachment, and the attachment appears with a generic icon instead of a picture preview. The picture can’t be downloaded properly.
It is very inconsistent: we have records with, for example, 4 pictures in the attachment field, and one of them appears broken. Deleting the record and recreating it the same way may fix it. The picture is always available on our web server.
As we have a long history, I could determine that this problem began to occur last December.
I read about similar issues on the forum that were fixed by support “patching” the base.
Can I get any help with this issue?
Oct 25, 2021 04:46 PM
I think this is not accurate. We have mounds of evidence in log files showing that retries rarely work.
Dec 09, 2021 05:16 AM
Just to be clear: We are only talking about uploads via API/Script, not manual uploads?
Dec 09, 2021 05:41 AM
So, not sure if this helps, but: in my base I am crawling/indexing corporate annual reports from 1000 companies. I noticed that the PDF upload to the attachment field sometimes fails. Weirdly enough, it seems to fail only for a specific set of companies, i.e. domain names. But for those, it fails for multiple files…
Could there be a systemic issue at the level of the domain, in the sense that the domain doesn’t allow Airtable to download except through the frontend? In what way could it make a difference whether Airtable tries to download the file when triggered by the API versus when a frontend user enters the URL of the attachment? It sounds not very plausible, but isn’t it weird that out of hundreds of URLs, I have 40 that don’t work, and these 40 URLs all belong to only about 20 different domains?
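One way to probe this theory is to fetch the same URL with a browser-like User-Agent and with a generic non-browser one, then compare the responses (a rough sketch; I don’t know what User-Agent Airtable’s fetcher actually sends, so this only shows whether the server discriminates by User-Agent at all):

```python
# Probe whether a server treats non-browser clients differently.
# The URL is a placeholder; Airtable's real fetcher UA is unknown here.
import requests

url = "https://example.com/reports/annual-2021.pdf"

candidates = {
    "browser-like": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "generic": "python-requests/2.x",
}

for label, ua in candidates.items():
    r = requests.get(url, headers={"User-Agent": ua},
                     allow_redirects=True, timeout=30)
    print(label, r.status_code, r.headers.get("Content-Type"), len(r.content))
```

If the generic client gets a 403 or an HTML error page while the browser-like one returns the PDF, that would support the theory.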
Dec 14, 2021 05:37 PM
Hi all, thanks to everyone who has commented on this topic and shared their analyses and fixes. I figured out a workaround for those interested.
First, our use case: we’re building a database with 50+ fields and 8k records, and we rely on the display of thousands of images for review/assessment. There are multiple images for each record, all small in size.
What did NOT work:
What DID work:
I’m not overpromising here - this is a workaround, but maybe it can help others. I created a new table within the same base and synced it to Google Drive: select your folder, and be sure to sync a thumbnail. The thumbnails and metadata populated immediately (what a relief after so much trouble!). I then used Attachments-to-URLs to turn the thumbnails into URLs. Just a note that the thumbnail won’t display as a proper image in full size.
Next, linking the tables. I had to extract the filenames (using a formula) in order to link to my original table, but once that was set up I could populate the original table with the right URL links in the right place. Finally, I used Convert URLs to Attachments to convert the URL links to actual attachments. For some reason, they actually populated as full images (don’t ask, I dunno). The great benefit of this for our needs is the ability to include multiple images in the same attachment field.
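If anyone prefers to script that last conversion step instead of using Convert URLs to Attachments, the same thing can be done through the REST API. A sketch under assumptions: “Image URL” and “Attachments” are hypothetical field names, and the base/table identifiers are placeholders.

```python
# Sketch: copy a URL field into an attachment field for every record,
# paging through the table with the Airtable REST API.
import time
import requests

BASE_ID = "appXXXXXXXXXXXXXX"
TABLE = "Images"
API_KEY = "YOUR_API_KEY"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}
ENDPOINT = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"

# 1. Fetch all records (the API returns pages of up to 100 with an offset).
records, offset = [], None
while True:
    params = {"fields[]": "Image URL"}
    if offset:
        params["offset"] = offset
    page = requests.get(ENDPOINT, headers=HEADERS, params=params).json()
    records.extend(page["records"])
    offset = page.get("offset")
    if not offset:
        break

# 2. Write each URL back as an attachment; Airtable downloads the file itself.
for rec in records:
    url = rec["fields"].get("Image URL")
    if not url:
        continue
    requests.patch(
        f"{ENDPOINT}/{rec['id']}",
        headers={**HEADERS, "Content-Type": "application/json"},
        json={"fields": {"Attachments": [{"url": url}]}},
    )
    time.sleep(0.25)  # stay under the 5 requests/second limit
```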
Hope that might be helpful to some!
UPDATE - the final converted images are indeed thumbnails and not full images. There is a link to view the full image available as metadata, so that might scratch our itch for the time being. My optimism got the best of me!
Jan 20, 2022 06:01 PM
How long did this script take? I have around 400 URLs to get.
Jan 31, 2022 02:16 PM
Apologies for the delayed response - you’ve probably sorted this out by now. The Convert URLs to Attachments script should take on the order of minutes for 400 URLs.
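As a rough sanity check on that estimate (assuming a script hitting the REST API under Airtable’s documented limit of 5 requests per second per base, with up to 10 records per batched update - I can’t speak to that script’s internals):

```python
# Back-of-the-envelope timing for converting 400 URLs via batched API updates.
import math

n_urls = 400
records_per_request = 10  # max batch size for a single PATCH to the table
rate_limit = 5            # requests per second per base

requests_needed = math.ceil(n_urls / records_per_request)  # 40 requests
seconds = requests_needed / rate_limit                     # ~8 seconds minimum
print(f"{requests_needed} requests, at least {seconds:.0f}s")
```

Even one record per request would stay under a couple of minutes, so minutes is the right order of magnitude.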