We need to transfer data between Excel and Airtable. We have 40K records but seem to be limited to 15K. We are using the Block.
If you can break the CSV into smaller chunks, you can do multiple 15K uploads to the target table. Sloppy, but I think it works.
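Sloppy or not, the splitting itself is easy to script. Here's a minimal Python sketch that breaks a CSV into chunks of at most 15,000 data rows, repeating the header row in each chunk so every chunk imports cleanly on its own (the 15,000 figure and the file-naming scheme are just assumptions based on this thread, not anything Airtable documents):

```python
import csv
from pathlib import Path

def split_csv(path, max_rows=15_000, out_dir="chunks"):
    """Split a CSV into files of at most max_rows data rows each,
    repeating the header row at the top of every chunk.
    Returns the number of chunk files written."""
    Path(out_dir).mkdir(exist_ok=True)
    parts = 0
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)  # assumes the first line is a header row
        chunk = []
        for row in reader:
            chunk.append(row)
            if len(chunk) == max_rows:
                _write_chunk(out_dir, path, parts, header, chunk)
                parts += 1
                chunk = []
        if chunk:  # leftover rows that didn't fill a full chunk
            _write_chunk(out_dir, path, parts, header, chunk)
            parts += 1
    return parts

def _write_chunk(out_dir, path, part, header, rows):
    out = Path(out_dir) / f"{Path(path).stem}_part{part:03d}.csv"
    with open(out, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)
```

With a 40K-record file and the default `max_rows`, this yields three chunks (15K + 15K + 10K), each of which can be run through the Block import separately.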
Thank you, that is what we do. Is Airtable working on fixing this bug?
Interesting. Using the standard, non-Block CSV import, I was able to import a CSV file with 98,500 records — nearly double my plan’s max of 50k. The real limitation there seemed to be file size rather than number of records: each record contained only a lat/long pair, but 98.5K of them brought the file to 2,090,728 bytes, just a smidge under 2 MB.
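If bytes rather than rows are the real ceiling, you can estimate how many records of a given shape will fit before trying an import. A quick back-of-envelope sketch (the 2 MB cap is just the figure observed above, not a documented limit):

```python
import os

def max_records_under_cap(sample_csv, cap_bytes=2 * 1024 * 1024):
    """Estimate how many records like those in sample_csv would fit
    under cap_bytes, using the sample's average bytes per data row.
    Assumes the first line of the sample is a header row."""
    size = os.path.getsize(sample_csv)
    with open(sample_csv, newline="", encoding="utf-8") as f:
        rows = sum(1 for _ in f) - 1  # subtract the header row
    bytes_per_row = size / rows  # header bytes amortized; rough estimate
    return int(cap_bytes / bytes_per_row)
```

For the lat/long file above (2,090,728 bytes for 98,500 rows, roughly 21 bytes per row), this kind of estimate lands right around the 98.5K that actually imported.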
The CSV Import Block, as you noted, has a hard limit of 15,000 lines — so I guess either 14,999 or 15,000 records, depending on whether the file includes headers.
I think it might be smart to create a formal feature request post asking for the Import Block to be enhanced to support, say, 50k records, to bring it in line with Pro plan limits.
Hey @Howie - I’m on a Pro plan and I’m throttled to a 2MB csv. I’m working on a project starting with a 2.05MB file. No dice. Can’t get it in at the initial base import nor from the csv block and I can’t paste it in cold. It’s been AWHILE since I’ve had to think in single bytes, but here we are. I could slice and dice the csv, but… ~ahem~
Hi Brad—you should be able to upload a 2.05 MB CSV file using the CSV importer block without problems. Please send an email to email@example.com about this and they can work with you to figure out what the issue is.
Thanks @Katherine_Duh - I tried it again and it went. Super!
It would be great if, in the future, the .csv block could capture the virgin .csv headers as the Base import does. In this case, I wound up naming 22 field headers by hand. The header mapping function works great and I love the potential for later file merging but this is only beneficial when the field headers have already been established.
Clearly this is a specific and weirdo case (I don’t always have 22+ fields to start from) but having a Boolean ‘is the first record a set of field headers’ would be ideal.
I’m having the same holdup. We’ve got multiple inventory files from multiple vendors with 22,000+ items each, and although they are generally less than (or just over) 2 MB in size, I can’t upload them to merge the information due to the 15k line limit.
We were headed full steam toward product info management meets inventory/sales data tables, but we’ve been stopped dead in our tracks by this inability to upload/update our data. Support@airtable.com has been unresponsive for about a week on another email, and now this support email is awaiting response as well. If we can open those limits on the CSV block to allow greater than 15k lines / 2 MB size (maybe shoot for twice that?), we’d be back on track with Airtable as a possible difference-maker in our business.
Has something changed in Airtable in the last few days? We upload CSV files created in Excel, and our normal limit has been about 12K records using the block import. Suddenly we are limited to a little over 5K.
I had recommended Airtable to WeWork, and it’s been a pain to use. I’m starting to look like an idiot for making this recommendation.
All we do is upload data for collaborative updating. After the update, we download the data to a CSV for import back into Excel. Airtable seemed barely capable of handling this even before today, and now it is simply unusable. Yet what we are doing is incredibly simple and basic.
What can we do to make airtable usable again?