
Airtable - Amazon S3 Bucket Integration

Solved
mshah72
6 - Interface Innovator

Is there any option available to integrate Airtable and an Amazon S3 bucket for transferring data back and forth? If yes, please guide me on third-party tools, as well as how I can build it myself.

1 Solution

Accepted Solutions
kuovonne
18 - Pluto

You can use Make.com to send attachments from Airtable to your S3 Bucket.

If you want to get files out of your S3 bucket, the files need to be set to public, and you can then use those public URLs to attach them to Airtable.
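To illustrate the second half of this, here's a rough sketch of attaching a public S3 URL to an Airtable attachment field via the REST API. The base ID, table name, field name, and token are all hypothetical placeholders — substitute your own — and this assumes the standard Airtable Web API behavior where attachment fields accept an array of objects with a publicly reachable `url`:

```javascript
// Build the PATCH payload Airtable expects for an attachment field:
// an array of { url } objects pointing at publicly reachable files.
function buildAttachmentPayload(recordId, fieldName, publicUrls) {
  return {
    records: [
      {
        id: recordId,
        fields: {
          [fieldName]: publicUrls.map((url) => ({ url })),
        },
      },
    ],
  };
}

// Hypothetical usage against the Airtable REST API (names are placeholders):
async function attachPublicFiles(baseId, tableName, recordId, fieldName, publicUrls, apiToken) {
  const res = await fetch(
    `https://api.airtable.com/v0/${baseId}/${encodeURIComponent(tableName)}`,
    {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(buildAttachmentPayload(recordId, fieldName, publicUrls)),
    }
  );
  if (!res.ok) throw new Error(`Airtable update failed: ${res.status}`);
  return res.json();
}
```

Airtable downloads the files from those URLs asynchronously after the update, so the URLs only need to stay public long enough for that fetch to happen.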


6 Replies

Yawnxyz
5 - Automation Enthusiast

It'd be so nice to swap out Airtable's S3 with our own, so the interface works out of the box and our own CDN serves as the backend for all the files we upload.

Can you please elaborate on that?

Make is an integration service provider. You can use it to connect Airtable with your S3 bucket.

You will need to create a Make account. They have a free tier that may work for your needs, and a range of monthly subscription levels if you need more features.

Hey there-

Thanks again for another amazing answer. I'm curious: is there a script we can use in Automations to send a file from Airtable to an S3 bucket?

For example, here's a piece of JavaScript code that can download a file. We could potentially sub in the expiring file URL from an Airtable attachment, and once we have the data blob, send that to S3...

const { airtableExpiringFileURL } = input.config()

// Fetch the file, then read the resulting blob back as a base64 data URL.
function downloadAsDataURL(url) {
  return new Promise((resolve, reject) => {
    fetch(url)
      .then(res => res.blob())
      .then(blob => {
        const reader = new FileReader()
        // Attach the handlers before starting the read
        reader.onloadend = () => resolve(reader.result)
        reader.onerror = err => reject(err)
        reader.readAsDataURL(blob)
      })
      .catch(err => reject(err))
  })
}

// simply use it like this
downloadAsDataURL(airtableExpiringFileURL[0])
  .then(res => {
    console.log(res)
  })
  .catch(err => {
    console.error(err)
  })


I keep getting errors though when I try the script unfortunately 😢
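My guess (an assumption, not something I've confirmed against the docs) is that `FileReader` isn't available in the automation scripting sandbox. One workaround that avoids `FileReader` entirely is to read the response as an `ArrayBuffer`, which `fetch()` supports directly, and base64-encode it with plain JavaScript — no `Buffer` or `btoa` needed either:

```javascript
const BASE64_CHARS =
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

// Pure-JS base64 encoder: walks the bytes three at a time and emits
// four base64 characters, padding the final group with "=".
function arrayBufferToBase64(buffer) {
  const bytes = new Uint8Array(buffer);
  let out = "";
  for (let i = 0; i < bytes.length; i += 3) {
    const b0 = bytes[i];
    const b1 = i + 1 < bytes.length ? bytes[i + 1] : 0;
    const b2 = i + 2 < bytes.length ? bytes[i + 2] : 0;
    out += BASE64_CHARS[b0 >> 2];
    out += BASE64_CHARS[((b0 & 3) << 4) | (b1 >> 4)];
    out += i + 1 < bytes.length ? BASE64_CHARS[((b1 & 15) << 2) | (b2 >> 6)] : "=";
    out += i + 2 < bytes.length ? BASE64_CHARS[b2 & 63] : "=";
  }
  return out;
}

// Same idea as downloadAsDataURL above, but built on arrayBuffer()
// instead of blob() + FileReader.
async function downloadAsDataURL(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Download failed: ${res.status}`);
  const contentType = res.headers.get("content-type") || "application/octet-stream";
  const buffer = await res.arrayBuffer();
  return `data:${contentType};base64,${arrayBufferToBase64(buffer)}`;
}
```

If that's the culprit, this version should at least get you as far as having the file contents in hand; sending them on to S3 is a separate step.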

Not sure if this helps you, but I created a small serverless API endpoint that takes the expiring URL and mirrors the record ID and file name in my Cloudflare R2 bucket. The endpoint then returns an array of permanent links, which I write back into a "permalink" column in Airtable. I haven't used S3, though, so my code probably won't help you; the Airtable-side code looks very much like yours, except I just send it to my own API endpoint as a POST fetch request.
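Since R2 speaks the S3 API, the same idea translates. Here's a rough sketch of what such a mirror handler could look like using the AWS SDK v3 S3 client — the bucket name, CDN domain, and handler shape are all hypothetical, and the record-ID-plus-filename key scheme is just one way to mirror what's described above:

```javascript
// Key the mirrored object by record ID + original filename.
function buildObjectKey(recordId, filename) {
  return `${recordId}/${encodeURIComponent(filename)}`;
}

// Permanent link to hand back for the "permalink" column.
function buildPermalink(cdnBaseUrl, key) {
  return `${cdnBaseUrl.replace(/\/$/, "")}/${key}`;
}

// Download the expiring Airtable URL and re-upload it to the bucket.
async function mirrorAttachment({ expiringUrl, recordId, filename }) {
  // Required lazily so the pure helpers above work without the SDK installed.
  const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

  const res = await fetch(expiringUrl); // fetch before the URL expires
  if (!res.ok) throw new Error(`Download failed: ${res.status}`);
  const body = new Uint8Array(await res.arrayBuffer());

  const key = buildObjectKey(recordId, filename);
  const s3 = new S3Client({}); // region/credentials come from the environment
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-airtable-mirror", // hypothetical bucket name
      Key: key,
      Body: body,
      ContentType: res.headers.get("content-type") || "application/octet-stream",
    })
  );
  return buildPermalink("https://cdn.example.com", key); // hypothetical CDN domain
}
```

An automation script would then POST the expiring URL, record ID, and filename to wherever this handler is deployed and write the returned permalink back to the record.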