
How to programmatically import a CSV to Airtable, creating a new table in an existing base (the CSV is a file on a cloud VM)?

Topic Labels: API

I am sure this has been asked before; I have searched high and low and have found no solution for the following, even with Zapier.

I have a .csv file that I want to import directly into Airtable.

The .csv was created programmatically (in Node.js), and exists on a Google Cloud VM (Red Hat Linux).

There is nothing in the Airtable API that allows for direct insertion of a full CSV as a new tab in an existing base.

I want to name the tab in the existing base the same name as the CSV file, without the .csv extension.

If anyone has actually done this successfully, and smoothly, please respond. The only articles I can find on how to import a CSV into Airtable involve a human clicking in the browser.

23 Replies

To my knowledge, this part must be done manually. I’m not aware of any API-available option—including using tools like Zapier or Integromat—for creating tables. You can only add or modify data in existing tables. That aside, both of those listed services can parse CSV data, so once the table has been manually made, either service can be set up to import the data.

Thank you for the info.

Even if I have a pre-existing table created manually:

I’m not seeing that Zapier can do this (get all rows in my CSV to Airtable). Zapier looks to be a tool connecting two web app end points.

A CSV on my local file system is not presented to me as an integration end point to choose when I go through the Zapier wizard.

After I generate a CSV on the cloud VM (from a select from a DB), I’m not sure how Zapier can help or even work, or play a role. The data source is the CSV on a file system (cloud VM). Zapier has no access to the CSV.

I’m not sure about Integromat but will investigate.

I’m not seeing any solution that Zapier would provide in this case, but I see it mentioned quite often.

Admittedly I don’t have experience with cloud VM systems, but if it can be accessed online—and the “cloud” part of “cloud VM” leads me to believe that it can—my gut says that either Zapier or Integromat could retrieve the data and insert it into Airtable. I just don’t know the specifics, so I’ll have to defer to others to chime in on that front.

It can’t be accessed by Zapier until it exists through a web service. Nothing but roadblocks between your data and Airtable.

If the table already exists (with fields that match the file or, at least, fields that can be mapped to the file) and you are comfortable using the Airtable API, you could easily create a process/script on your VM that iterates through the CSV file and loads it into Airtable. This could be written in JS, Python, just about anything. The API has two record create methods: one for single records, the other for a group of up to 10 records per request (see the sketch after the next paragraph).

As you already have a process that produces the CSV file using Node, could you or your developer extend this and push the records into AT? Arguably, there is no need for the CSV file (unless it serves some other purpose) as the records could be POSTed straight from the Node app to AT.
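
To make the batch option concrete, here is a minimal sketch, assuming the table already exists. The base ID, table name, API key, and the field names (Name, Number, Role) are placeholders to swap for your own:

import csv
import requests

API_URL = 'https://api.airtable.com/v0/YOUR_BASE_ID/YOUR_TABLE_NAME'
HEADERS = {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json'
}

def chunks(rows, size=10):
    # The record-create endpoint accepts at most 10 records per request
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

with open('MOCK_DATA.csv') as f:
    rows = list(csv.reader(f))

for batch in chunks(rows):
    payload = {
        "records": [{"fields": {"Name": r[0], "Number": int(r[1]), "Role": r[2]}}
                    for r in batch]
    }
    resp = requests.post(API_URL, headers=HEADERS, json=payload)
    resp.raise_for_status()  # surface any 4xx/5xx immediately

The single-record endpoint is the same URL with a {"fields": {...}} body, as the script below shows.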


For those that are interested, here’s an example Python script that imports records one by one (takes only a few seconds for 100 records):

import requests
import csv

post_url = ''  # e.g. https://api.airtable.com/v0/YOUR_BASE_ID/YOUR_TABLE_NAME
post_headers = {
    'Authorization' : 'Bearer YOUR API KEY',
    'Content-Type': 'application/json'
}

f = open('MOCK_DATA.csv')
csv_f = csv.reader(f)

for row in csv_f:
    name = row[0]
    number = int(row[1])  # cast so Airtable receives a number, not a string
    role = row[2]

    data = {
        "fields": {
            "Field1": name,
            "Field2": number,
            "Field3": role
        }
    }

    post_airtable_request = requests.post(post_url, headers = post_headers, json = data)

Here’s the first few lines of the CSV file referenced:

Ross,92,Marketing Manager
Delmor,10,Help Desk Operator
Jazmin,79,Budget/Accounting Analyst III
Moore,50,Senior Developer
Winny,95,Quality Engineer
Goran,34,Environmental Tech
Smitty,60,Tax Accountant

The end result is the populated table in Airtable (screenshot in the original post).

I want to thank everyone for the responses and ideas, and look forward to an expanded API with additional features:

1. bases and sheets within bases created programmatically through the API
2. direct insertion of a CSV

I was able to work through all CRUD operations against the Airtable API using Node / axios (on an existing base and sheet created by clicking and typing).

I still have a few things to work out, like how to insert once a column changes type from text to checkbox (I posted a new question on this; see also the note below), but otherwise I have a good handle on the API at this point.
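
For reference, the checkbox case comes down to types as well: the API expects a JSON boolean for a checkbox field, so the text value from the data source has to be converted before posting. A minimal sketch; the field name "Done" and the accepted truthy spellings are my own assumptions:

def to_checkbox(value):
    # Airtable checkbox fields take JSON true/false, not the strings 'true'/'false'
    return value.strip().lower() in ('true', 'yes', '1', 'x')

data = {"fields": {"Done": to_checkbox(row[3])}}  # "Done" is a hypothetical checkbox field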


Thanks for your suggested solution for this programmatic import of CSV to Airtable. I used the code below, which reads the CSV fine; however, I am getting a 422 response from Airtable. I have troubleshot to the best of my ability but can’t fix the error. What am I missing?

Please note the ingredient name is a lookup ID; that’s why it’s in brackets.

import requests
import csv

post_url = '’'
post_headers = {
    'Authorization' : 'Bearer YOUR API KEY',
    'Content-Type': 'application/json'
}

f = open('import_data.csv')
csv_f = csv.reader(f)
for row in csv_f:
    inventory = row[0]
    usage = row[1]
    received = row[2]
    report_date = row[3]
    batch_id = row[4]
    lookup_id = row[6]
    name = row[7]
    data = {
        "fields": {
            "Name": name,
            "Ingredient": [lookup_id],
            "Inventory": inventory,
            "Usage": usage,
            "Receipts": received,
            "Report Date": report_date,
            "Batch Number": batch_id
        }
    }

post_airtable_request = requests.post(post_url, headers=post_headers, json=data)

I have been able to validate via Postman that the POST request works with these credentials, but I’m still having issues. Here are the first few lines of the CSV file I used:

949.7318,180.3,0,05/21/20,4421b1fa-9bb6-11ea-8f85-8c85904e2678,Agrado Plus,recg1ZpkYgUMXRgZf,Agrado Plus - 2020-05-21
4408,0,0,05/21/20,4421b1fa-9bb6-11ea-8f85-8c85904e2678,EGold Calcium Salt,recvCnq1AXSX7Y6cq,EGold Calcium Salt - 2020-05-21
14354,5441,0,05/21/20,4421b1fa-9bb6-11ea-8f85-8c85904e2678,Smartamine ML,rec36WGlYOrUdEDIu,Smartamine ML - 2020-05-21

Here’s what my local terminal shows when I run the code with the CSV (screenshot in the original post: a series of 422 responses).

I think a 422 error is an obscure indication you are trying to write data of a certain type into a field that cannot handle that type.

Recommend you downsize the import to one or two fields and slowly introduce the rest to figure out which one is triggering the errors; see the sketch below.
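
Something like this would automate that narrowing-down, reusing the names from the script above (a sketch; it creates one throwaway record per field, purely for debugging):

fields = {
    "Name": name,
    "Ingredient": [lookup_id],
    "Inventory": inventory,
    "Usage": usage,
    "Receipts": received,
    "Report Date": report_date,
    "Batch Number": batch_id
}
for key, value in fields.items():
    # Post each field on its own; the one that fails is the one the table rejects
    resp = requests.post(post_url, headers=post_headers, json={"fields": {key: value}})
    print(key, resp.status_code, 'OK' if resp.status_code == 200 else resp.text)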

I’m no API wizard (haven’t even used it yet), but just looking at the code, the styled apostrophe before the closing quote in your post_url string sticks out to me. Could that be part of the problem?


Good eye! That would cause a 422 I think.

It definitely would be the issue, but unfortunately that was just a typo on my part as I wrote the post 🙂

Will keep troubleshooting, thanks for the responses all!

Hi @Ryan_Wigley - I think the answer lies with the comment made by @Bill.French. The Python csv parser treats everything as a string, so you are trying to put the string '949.7318' into a number field, and it won’t go. However, if you do this:

inventory = float(row[0])

and convert it to a float before sending to AT, it should work. Convert the floats and integers; I think the rest will be fine as strings. (I have an integer conversion in my script above; see also the sketch below.)
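
Applied to the script above, that would look something like this (which columns are numeric is my reading of the sample rows):

inventory = float(row[0])   # '949.7318' -> 949.7318
usage = float(row[1])       # '180.3'    -> 180.3
received = float(row[2])    # '0'        -> 0.0
# report_date, batch_id, lookup_id and name can stay as strings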


@Ryan_Wigley - side point, and this might just be the formatting of the post rather than your script, but the final POST line is outdented, so it will only run once after the looping has occurred - you need to indent it into the loop to run on each iteration. Probably not an issue, as your console is showing multiple requests, but I thought I’d mention it.

@JonathanBowen @Bill.French

Thanks all, that did the trick. It was a matter of shaping the corresponding columns as floats / integers. I did not realize I could pass a partial POST request (i.e. not include all fields at once) so it was very helpful that you pointed it out Bill. Code works like a charm now!

I’m getting an error using the script above:

Traceback (most recent call last):
  File "", line 42, in <module>
    post_airtable_request = requests.post(post_url, headers = post_headers, json = data)
AttributeError: 'str' object has no attribute 'post'

import requests, json
import csv
import os, sys

post_url = ''
post_headers = {
    'Authorization' : 'Bearer keyxxxxxxxxxxx',
    'Content-Type': 'application/json'
}

branch = sys.argv[1]
filename = branch + '_stats.csv'
f = open(filename)
csv_f = csv.reader(f)
for row in csv_f:
    method = row[0]
    name = row[1]
    requests = row[2]
    failures = row[3]
    median_rt = row[4]
    avg_rt = row[5]
    min_rt = row[6]
    max_rt = row[7]
    avg_content_size = row[8]
    request_ps = row[9]
    data = {
        "fields": {
            "Name": name,
            "Requests": requests,
            "Failures": failures,
            "Median Response Time": median_rt,
            "Average Response Time": avg_rt,
            "Minimum Response Time": min_rt,
            "Maximum Response Time": max_rt,
            "Average Content Size": avg_content_size,
            "Requests Per Second": request_ps
        }
    }
    post_airtable_request = requests.post(post_url, headers = post_headers, json = data)

I’m somewhat fluent in Python, but I suspect the issue is that data needs to be serialized (which I believe is json.dumps()). Since dumps() performs some pretty-printing stuff, I think you also have to change the separators to make it behave more like json.stringify().

Try this…

post_airtable_request = requests.post(post_url, headers = post_headers,
    json = json.dumps(data, separators=(',', ':')))

I tried the snippets you shared, plus this one:

post_airtable_request = requests.post(post_url, headers = post_headers, json = json.stringify(data))
I’m still having the same error
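
Looking at the traceback rather than the payload: the script assigns `requests = row[2]` inside the loop, which rebinds the name `requests` from the imported module to a string. By the time `requests.post(...)` runs, Python is calling `.post` on that string, which is exactly the `AttributeError: 'str' object has no attribute 'post'` shown. No manual serialization should be needed (the requests library JSON-encodes the `json=` payload itself); renaming the variable ought to clear it. A minimal sketch:

for row in csv_f:
    method = row[0]
    name = row[1]
    request_count = row[2]  # renamed: `requests = row[2]` shadowed the requests module
    failures = row[3]
    # ... remaining columns unchanged ...
    data = {
        "fields": {
            "Name": name,
            "Requests": request_count,
            "Failures": failures
            # ... remaining fields unchanged ...
        }
    }
    post_airtable_request = requests.post(post_url, headers=post_headers, json=data)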

If you have a URL for a CSV file that you’d like to import into Airtable continuously, we’ve built an extension that handles this.