I’m looking to backup all the data in my base and I saw you can do it by automation but I’m not sure how. I don’t see download files as an action, would it need to be done as a script? Ideally I would like this to download into SharePoint but if it downloads locally and I have to upload manually to SharePoint that is fine.

I’m not sure if you can automate the downloading of CSV files with a script, but you could automate the process of creating your own CSV files and then saving them to SharePoint by using Make’s advanced automations & integrations.

I give a brief demonstration on how to do this (using Google Drive instead of Sharepoint) on this Airtable podcast episode: Importing and Exporting CSV files with Make.

If you’ve never used Make before, I’ve assembled a bunch of Make training resources in this thread. For example, here is one of the ways that you could instantly trigger a Make automation from Airtable.

However, instead of setting up your own automations, it might be easier to turn to one of the backup solutions that already exist for Airtable.

Hope this helps!

- ScottWorld, Expert Airtable Consultant


Hmm, you can download views as CSVs and upload those to SharePoint if you want? So the idea would be to create one view per table that contains all of the fields and records.

Off the top of my head, the caveats would be that attachments and formulas wouldn’t export well, and I reckon the linked record fields might not be great either.

---

I don’t think Airtable supports an automated backup solution right now, I’m afraid. You’re worried that Airtable might disappear or something, I take it? Just curious what you’re trying to guard against, as the automated snapshots Airtable does have worked well for me in the past.


Hello,

You are correct, there is no direct "download file" action in Airtable Automations.
Automating data export often requires using a Script Action in an Automation to:

1. Retrieve the data (via the scripting API).
2. Format it into a file (like CSV or JSON).
3. Send it to an external service (like SharePoint) using an external API call (e.g., using fetch()).
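Step 2 can be sketched in plain JavaScript. The record-fetching calls (`base.getTable`, `selectRecordsAsync`) only run inside Airtable's scripting environment, so this shows just the CSV-formatting logic that a Scripting action would wrap around them; the field and row shapes are assumptions.

```javascript
// Escape a single cell per RFC 4180: quote it if it contains a comma,
// quote, or newline, and double any embedded quotes.
function escapeCell(value) {
  const s = value == null ? "" : String(value);
  return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
}

// Build a CSV string from a list of field names plus rows of plain
// objects keyed by field name. Inside a Scripting action, the rows
// would come from record.getCellValue(fieldName) for each record.
function toCsv(fieldNames, rows) {
  const header = fieldNames.map(escapeCell).join(",");
  const lines = rows.map(row =>
    fieldNames.map(name => escapeCell(row[name])).join(",")
  );
  return [header, ...lines].join("\n");
}
```

The quoting rules matter more than they look: a single unescaped comma in a notes field will silently shift every column after it.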

For a hands-off backup, you’ve got two main paths: lean on Airtable’s automatic base snapshots for a safety net, or roll your own export via a Scripting action that runs on a schedule. For many teams, snapshots cover the “oh no” moments while an export script handles off-platform storage.

If you want the SharePoint route, an Automation with a Scripting action can iterate each table, build a CSV string in memory, and POST it to SharePoint using fetch() with your Graph API token.
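For the upload step, a minimal sketch assuming a Microsoft Graph access token: small files (under roughly 4 MB) can use Graph's simple upload, a PUT to the drive item's `/content` endpoint. The site ID, folder path, and token here are hypothetical placeholders; building the request as a plain object keeps the network call separate and easy to inspect.

```javascript
// Build the fetch() URL and options for a Graph simple upload of a CSV
// string to a SharePoint site drive. Nothing here is Airtable-specific.
function buildUploadRequest(siteId, filePath, csvText, token) {
  return {
    url: `https://graph.microsoft.com/v1.0/sites/${siteId}/drive/root:/${filePath}:/content`,
    options: {
      method: "PUT",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "text/csv",
      },
      body: csvText,
    },
  };
}

// Inside the Scripting action you would then call:
//   const req = buildUploadRequest(siteId, "Backups/Tasks.csv", csv, token);
//   const resp = await fetch(req.url, req.options);
//   if (!resp.ok) throw new Error(`Upload failed: ${resp.status}`);
```

Larger files need Graph's resumable upload session instead, so keep per-table CSVs small if you can.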

A few gotchas to plan for: linked records should be flattened to IDs or names consistently, formulas and lookups export as their computed text, and attachment fields only give you time-limited URLs, so if files matter, fetch and re-upload them to SharePoint in the same run.
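The flattening step can be handled with one helper. The object shapes below mirror what the scripting API's `getCellValue()` typically returns for linked records (`{id, name}`) and attachments (`{id, url, filename}`), but treat those shapes as an assumption and check them against your own fields.

```javascript
// Flatten an Airtable cell value to a single CSV-friendly string.
// Arrays (linked records, attachments, multi-selects) are joined with
// "; "; objects fall back to name, then filename, then record ID.
function flattenCell(value) {
  if (value == null) return "";
  if (Array.isArray(value)) {
    return value
      .map(v =>
        v && typeof v === "object"
          ? v.name || v.filename || v.id || ""
          : String(v)
      )
      .join("; ");
  }
  if (typeof value === "object") return value.name || JSON.stringify(value);
  return String(value);
}
```

Using names for linked records makes the CSV readable; using IDs instead makes a later re-import less ambiguous. Pick one and apply it consistently across every table in the run.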

This setup ends up more reliable than trying to click-export “views as CSV,” and once it’s scheduled you’ll have a repeatable backup landing in SharePoint.

If you hit edge cases with very large tables or rate limits, split the run by table or timestamp and rotate files by date so restores stay simple. Hope this helps.
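The date-based rotation can be as simple as stamping each run's files into their own folder. The `Backups/` layout here is just an assumed convention:

```javascript
// Build a date-stamped SharePoint path for one table's CSV, e.g.
// "Backups/2024-06-01/My_Tasks.csv", so each run lands in its own
// folder and old backups are easy to list and prune by date.
function backupPath(tableName, date) {
  const stamp = date.toISOString().slice(0, 10); // YYYY-MM-DD
  const safe = tableName.replace(/[^\w-]+/g, "_"); // avoid path-hostile chars
  return `Backups/${stamp}/${safe}.csv`;
}
```

Restoring then just means grabbing the newest folder, and pruning means deleting folders older than whatever retention window you want.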