
Breaking change: URLs to attachments have been modified

Topic Labels: API
Solved

Scott_Bowler
5 - Automation Enthusiast

Hi guys,

It looks like a change has happened overnight - URLs to attachments have changed from this structure:

https://dl.airtable.com/.attachments/d656d5c54fbc38ad9195c2f62cd02dae/5eb64051/Supreme_Apex-1120x740.jpg

To this structure:

https://v5.airtableusercontent.com/v1/10/10/1668182400000/q_tvhAt9ivi_oCjJ5LvWQQ/ghcs73y0Vma8Rz6c7EJaen3sRTVMOCrqmPpkS38kxl2SOiMEE5boOfVOorY5nfQ9PZkrQCYfehg76B4CcI8-MA/1wi2qtg7XnHCVhZhtKjxg_BXwwegwsGxphbPlJRYlao

This has broken our API integrations for saving images to WordPress / WooCommerce as WordPress requires a file extension to exist.

Has there been an announcement anywhere about this major change?

10 Replies
ScottWorld
18 - Pluto

Yes, we were alerted to this about 7 months ago, and all of their support documentation & API documentation was updated 7 months ago to reflect this change. They also sent out an email about this to customers many months ago.

But they probably could’ve done a better job of reminding everybody about the change this week, which went into effect on November 8th.

In short, attachment URLs now expire after 2 hours.

They also recently added the ability to temporarily opt out of this change for an additional 3 months, but beyond that timeframe, you would need to change your workflow.

Scott_Bowler
5 - Automation Enthusiast

Thanks for sharing that info - we’ve opted out for now whilst we adjust to the new approach.

Greg_F

Hi @Scott_Bowler,

This has broken our API integrations for saving images to WordPress / WooCommerce as WordPress requires a file extension to exist.

If you need the API to return the file with a file extension, you can build a wrapper on top of the Airtable API using a custom webhook that returns a custom response with the header Content-Disposition: attachment; filename=FILENAME. The Airtable API response for attachments contains a filename property, so you will have the file extension there. A service like Make, Pipedream, Zapier, etc. can do that.
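
To make the idea concrete, here is a minimal sketch of such a wrapper in TypeScript (Node 18+). The ?record= query parameter, the "Images" field name and the environment variables are assumptions for illustration, not details from this thread; the only thing taken from the Airtable API is that attachment objects include url and filename properties.

```ts
import http from "node:http";

const AIRTABLE_TOKEN = process.env.AIRTABLE_TOKEN ?? ""; // placeholder credentials
const BASE_ID = process.env.AIRTABLE_BASE_ID ?? "";
const TABLE = process.env.AIRTABLE_TABLE ?? "Products";

http.createServer(async (req, res) => {
  const recordId = new URL(req.url ?? "/", "http://localhost").searchParams.get("record");
  if (!recordId) {
    res.writeHead(400).end("missing ?record= parameter");
    return;
  }

  // Fetch the record; attachment fields come back as [{ id, url, filename, ... }]
  const record = await fetch(
    `https://api.airtable.com/v0/${BASE_ID}/${TABLE}/${recordId}`,
    { headers: { Authorization: `Bearer ${AIRTABLE_TOKEN}` } }
  ).then((r) => r.json());

  const attachment = record.fields?.Images?.[0];
  if (!attachment) {
    res.writeHead(404).end("no attachment on this record");
    return;
  }

  // Re-serve the (short-lived) attachment, restoring the original filename --
  // and therefore the extension -- via Content-Disposition.
  const file = await fetch(attachment.url);
  res.writeHead(200, {
    "Content-Type": file.headers.get("content-type") ?? "application/octet-stream",
    "Content-Disposition": `attachment; filename="${attachment.filename}"`,
  });
  res.end(Buffer.from(await file.arrayBuffer()));
}).listen(3000);
```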

Here is a video I made some time ago that shows this - it uses Make, plus adds some extra controls to set an expiry date or turn off access via a custom link:

Thanks for posting this @Greg_F; it’s a very helpful approach for those who are comfortable with adding a fair degree of complexity to overcome Airtable’s desire to lower their CDN costs and hosting obligations while adding more security. A few observations and questions come to mind…

API Feels the Burden

The first is the burden-shifting that occurs with this approach. Does it not simply move all attachment requests from the CDN service to your Airtable instance's API? As we know, API requests impact each Airtable instance and will therefore impact user performance, even for users doing usual and customary activities.

This approach has the undesirable effect of burdening the API for every external image requested, which could be significant for some customers. And it could be a subliminal burden.

Imagine three dozen crawlers on a website, all requesting images previously served by a CDN that uses highly optimized caching and conditional GETs to mitigate load. Now every request goes through Make, which repeats the entire request->response process without caching or conditional GETs, all while tallying a new operating cost in Make itself.
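
For anyone less familiar with the mechanics, a tiny illustration of the difference (TypeScript; the URL and ETag are placeholders, nothing here is from Airtable or Make):

```ts
// CDN-style: a conditional GET lets the origin answer "304 Not Modified",
// so no image bytes are transferred when nothing has changed.
async function cdnStyleFetch(cachedETag: string): Promise<boolean> {
  const res = await fetch("https://cdn.example.test/image.jpg", {
    headers: { "If-None-Match": cachedETag },
  });
  return res.status === 304; // true -> serve the already-cached bytes
}

// Proxy-style: every request re-downloads the full body from the origin
// (and, in Make/Zapier, also counts as a billable operation).
async function proxyStyleFetch(): Promise<ArrayBuffer> {
  const res = await fetch("https://cdn.example.test/image.jpg");
  return res.arrayBuffer();
}
```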

I fear that replacing a CDN with a REST-response server has some significant risks and disadvantages, which is why I recommended this.

Thoughts?

Have You Tested This?

Given the recent cutover to the new signed attachment URLs, have you tested this approach with Bases that have not opted out of the cutover? My curiosity is on two vectors:

  1. What is the response time? Images are the one thing that browsers try to optimize in terms of rendering performance, but I’m wondering what the response time to a fully rendered image request is when the browser cache is not primed.

  2. What (if any) testing has been done to ensure that the request is being made to a URL that has not yet expired? Is there a way for the process to get it wrong in any context, thus revealing a broken image?

Airtable has done a good job of warning people that URLs would expire, but the fact that file names would disappear was buried in the details. I have had similar issues with Airtable scripts, with a similar workaround (providing the file name separately).
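
In case it helps anyone, the pattern looks roughly like this (TypeScript; the names are illustrative) - the extension comes from the filename returned alongside each attachment, never from the URL:

```ts
// Sketch of the workaround mentioned above: take the extension from the
// `filename` field of the attachment object rather than from the URL.
interface AirtableAttachment {
  id: string;
  url: string;      // short-lived v5.airtableusercontent.com link
  filename: string; // e.g. "Supreme_Apex-1120x740.jpg"
}

function describeAttachment(att: AirtableAttachment) {
  const dot = att.filename.lastIndexOf(".");
  const ext = dot >= 0 ? att.filename.slice(dot + 1) : "";
  return { url: att.url, saveAs: att.filename, ext };
}
```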

Right, it should be very clearly highlighted that using Airtable attachment URLs from the API in any CDN function is a horrible idea. With Make/Zapier you will likely go bankrupt; with Pipedream there are all the considerations you mention, plus ensuring proper caching behavior to avoid hitting API limits.

My understanding is that the OP wants to upload attachments to the WordPress server, which will then handle the delivery of images, and only the lack of a file extension (not the validity of the Airtable links) is causing an issue.
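
A rough sketch of that flow (TypeScript; the env vars and application-password auth are assumptions, not details from the thread): download the attachment while its URL is still valid, then pass the bytes plus the real filename to the WP REST media endpoint so the extension survives.

```ts
// Sketch: push an Airtable attachment into the WordPress media library.
// WP_URL, WP_USER and WP_APP_PASSWORD are placeholders for your own site.
async function uploadToWordPress(attachmentUrl: string, filename: string) {
  const auth = Buffer.from(
    `${process.env.WP_USER}:${process.env.WP_APP_PASSWORD}`
  ).toString("base64");

  // Download while the short-lived Airtable URL is still valid.
  const file = await fetch(attachmentUrl);
  const body = Buffer.from(await file.arrayBuffer());

  // Supply the filename (and therefore the extension) explicitly, instead of
  // expecting WordPress to read it from the URL.
  const res = await fetch(`${process.env.WP_URL}/wp-json/wp/v2/media`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": file.headers.get("content-type") ?? "application/octet-stream",
      "Content-Disposition": `attachment; filename="${filename}"`,
    },
    body,
  });
  return res.json(); // the created media item, including its new WordPress URL
}
```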

As for CDN use, obviously moving the data to a proper CDN, S3 or your own server is the right choice. I am actually working on a project right now where Airtable is the source of images, but I am using the following configuration:

  • An API endpoint on Vercel which fetches the image from the Airtable API URL
  • A Next.js <next/image> component which gets the URL from my Vercel API and creates an optimized version of the source image (in terms of source set, compression and format), cached on the Vercel CDN

It is all React/Next, so not much use in other cases. Setting up the img tag properly, with source sets matching today's device sizes and pixel ratios, is a lot of boilerplate and work, so Next/Image does the magic.
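
For anyone curious, the shape of that setup is roughly the following - a sketch with assumed file, table and field names, not the actual project code:

```ts
// pages/api/airtable-image.ts -- proxies the current attachment URL so the
// browser never sees the expiring v5.airtableusercontent.com link directly.
import type { NextApiRequest, NextApiResponse } from "next";

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { record } = req.query;
  const rec = await fetch(
    `https://api.airtable.com/v0/${process.env.AIRTABLE_BASE_ID}/Images/${record}`,
    { headers: { Authorization: `Bearer ${process.env.AIRTABLE_TOKEN}` } }
  ).then((r) => r.json());

  const att = rec.fields?.Image?.[0];
  if (!att) return res.status(404).end();

  const img = await fetch(att.url); // a fresh, unexpired URL on every request
  res.setHeader("Content-Type", img.headers.get("content-type") ?? "image/jpeg");
  res.setHeader("Cache-Control", "public, s-maxage=3600, stale-while-revalidate");
  res.send(Buffer.from(await img.arrayBuffer()));
}

// In a page component, <next/image> points at the proxy route and handles
// source sets, compression and format:
//
//   <Image src={`/api/airtable-image?record=${id}`} width={1120} height={740} alt="Product" />
```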

Cloudinary also has a similar React component in their SDK.

So buried that I wasn’t even aware of it until reading this thread. And as if on cue, I’m now getting broken images in a massive custom app that I built for my employer. Lovely way to start the weekend!

False alarm. I think that the extension was open just long enough for the former image links to expire. Reloading the page forced the images to refresh with the latest URLs.

I think that custom extensions can now request URLs that work in the context of the logged-in user.