Re: How many characters can Long Text field store?

Have asked Google, the Duck, even Bing, and can’t find an answer to this question: How much wood would a woodchuck… No, sorry, that’s a different question for a different time. My current question is: How much text can be stored in a Long Text field (of a single record)?

11 Replies
Zack_S
8 - Airtable Astronomer

100,000 characters

Thanks for the fast reply, Zack! I did glance at that article but didn’t see that last line.

What happens if (somehow) a user pastes more text into a field than it can hold?

An error occurs


Thanks. I didn’t want to test this for fear it would cause my base to catch on fire and self-destruct.

Thanks again for your quick help!

William

146,285.714 at an average 70% compression. :winking_face:

Has anything like this ever happened to you?

The closest I’ve come is runaway automations or cascading issues with schema changes. Attempting to change the value in a single cell has never made my base self-destruct.

Don’t know how to take your response, Bill. You’re the kind of guy who would actually know that it’s 146,285.714 at 70% compression. But you added that smiley face, so I think you were trying to pull my leg. :mechanical_leg:

On a non-ironic, non-sarcastic note: After getting the info from @Zack_S , I created an experimental base and tested this. Pasted about 53,000 characters into a long text field — no problems. Copied a different text block of 52,000 characters and tried to paste into the same field: The base froze. By “froze”, I mean I got a “Page not responding” error. I kept clicking the “Wait” button, but eventually it was clear I had to force exit the page, which I did.

The good news is that this 100,000 character limit seems to be per field, per record. So if it happens on occasion that the primary Notes field in a given record is getting “full”, I can move its text to a “NotesArchived” field and start from zero.

I’ve been doing this stuff for a long time. “Catch on fire” was a bit of drollery, although I have made more than a couple of laptops get fairly warm in the past. As for self-destruct, yes, some time ago (15 years?) I had a FileMaker database gag, crash, and become corrupted during a very large ingestion of data. And I kind of understood what I was dealing with there in terms of the hardware. I have no idea how capable Airtable’s servers are (presumably they’re very capable) but I don’t like to take risks for the heck of it.

My conversion process from FileMaker to Airtable seems to be taking a few years longer than I expected, but I’m continuing with it quite happily. These limits, though, are new to me. There is a practical limit to the number of users who can connect to a FileMaker database, but as for data limits (number of records, number of characters in a single text field of a single record) there are essentially none, so having to worry about these issues is new territory for me. Keeping me on my toes.

Ah yes, I have had other non-Airtable databases become corrupted. I have not had that experience with Airtable. At most I have had Airtable bases become temporarily unresponsive.

Ha Ha! Totally having a good laugh because you are spot on. Here’s why I know.

This project stores the polys of every state in the US. As you likely know, CA and TX are two of the biggest states in terms of size, but that’s not all: their polygon boundaries are extremely complex if you care about precision in mapping. As it turns out, the GeoJSON representations of those two states exceed 100k characters, and therefore the only way to store this data as part of the data record is to compress it a little. GeoJSON, oddly enough, compresses to about 70% of its original size, which makes it possible to store about 145k of data in a long text field. For JSON compression in this case, I used json.hpack, but I’m particularly interested in BSON for future JSON compression tasks.
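Roughly, the shape of it looks like the sketch below. This is a minimal illustration using the pako zlib library rather than json.hpack (which works differently; it repacks homogeneous JSON collections), and the function names and Node-style environment are my own placeholders, not anything from the actual project:

```javascript
// Minimal sketch: serialize, compress, and text-encode JSON so it fits
// in a long text field. Assumes a Node-style environment with pako installed.
const pako = require("pako");

function compressForLongText(obj) {
  const json = JSON.stringify(obj);
  const bytes = new TextEncoder().encode(json); // utf-8 bytes
  const deflated = pako.deflate(bytes); // Uint8Array of compressed bytes
  // Base64-encode so the bytes are safe to store as text. Note that base64
  // re-expands the data by about a third, so the net savings depend on how
  // well the particular JSON compresses.
  return Buffer.from(deflated).toString("base64");
}

function decompressFromLongText(stored) {
  const bytes = Buffer.from(stored, "base64");
  return JSON.parse(pako.inflate(bytes, { to: "string" }));
}
```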

In the example project I didn’t use this technique, but in the production equivalent of this example I did because storing these two states in code was just stupidly hackish. Airtable is all about hacks but I draw the line at stupidly hackish. :winking_face:

BTW, I’m not suggesting you leap into solutions with compression scripts and such, but, depending on the requirements and use cases, it’s not difficult to jam a little more into the long text field if it really makes sense and you’re desperate.

I don’t think this error is related to Airtable or the size of the text; rather, it’s likely a memory issue with the machine and the browser. My hunch is a script extension can easily read large blobs of text from a service and write large blobs into long fields. That was my experience anyway. In fact, for the US polys I read the entire 25MB file over the wire, chunkified it, compressed the segments, and stored them into long fields in just 30 seconds.
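The chunking part is nothing exotic. Here’s a rough sketch of the pattern in an Airtable script, with the URL, table, and field names invented for illustration (and the compression step left out for brevity):

```javascript
// Sketch of the fetch-and-chunk pattern in an Airtable script.
// Each chunk stays safely under the 100,000-character long text limit.
const CHUNK_SIZE = 90000;

const response = await fetch("https://example.com/us-states.geojson");
const text = await response.text();

// Split the raw text into pieces that each fit in one long text field.
const chunks = [];
for (let i = 0; i < text.length; i += CHUNK_SIZE) {
  chunks.push(text.slice(i, i + CHUNK_SIZE));
}

// Store each piece with a sequence number so it can be reassembled in order.
const table = base.getTable("GeoChunks");
for (let i = 0; i < chunks.length; i++) {
  await table.createRecordAsync({
    "Sequence": i,
    "Data": chunks[i],
  });
}
```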

This is correct.

Holy cow, that project of yours (with the maps using Vega-Lite extension) is cool and bloody impressive. I have no idea what I’d use Vega-Lite for right now and not sure I’m going to want to work that hard, but I intend to look at it a little more as soon as I can. Thanks for that link.

In the meantime, not going to worry about compressing my text data. 100,000 at 100% (or is it “0% compression”?) will do for now.

Why this matters to me: dealing with Airtable’s ceilings

Actually, let me explain why I started this thread. I didn’t say earlier. As I’ve said before, most of my clients are law firms doing “mass tort” litigation. Lots of clients, sometimes a fair amount of data too. I build case-management apps for them. Most of them are basically very customized project management databases.

Anyway, one of the things that my clients especially like about my apps is that I always provide a tool that lets them make notes whenever they talk to a client, take an action on a client’s behalf, update the client’s record, etc. If we could import into and export from Airtable’s native record comments, I might be happy to use those, but we can’t. So I have to do this as I’ve done it in FileMaker: a record in CLIENTS is linked to (possibly) many records in a NOTES table.

And here’s where it gets tricky. Now and then one of my client-users (a lawyer or paralegal) will be a really enthusiastic note writer, and clients might end up with scores, sometimes literally hundreds, of notes (usually small ones). If the case has, say, 10,000 clients in it (that’s big, but it’s happened), that adds up to a lot of linked records. And I start sweating the 50,000-record-per-base limit in my Pro account.

So I’m going to have them enter memos in a NewMemo field, and then an automation takes the text in NewMemo and puts it at the top of the AllMemos field (along with a date-time stamp and the user’s login name). But as you can see, I’m between the rock of the 50K record limit and the hard place of the 100,000-character limit on Long Text fields.
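For what it’s worth, the automation script I have in mind looks roughly like this. It assumes a “Run a script” action with the record ID and user name wired in as input variables; the table and field names are the ones from my description:

```javascript
// Sketch of the memo-rollup automation. Prepends the new memo, with a
// timestamp and the user's login name, to the top of AllMemos.
const { recordId, userName } = input.config();

const table = base.getTable("CLIENTS");
const record = await table.selectRecordAsync(recordId, {
  fields: ["NewMemo", "AllMemos"],
});

const newMemo = record.getCellValueAsString("NewMemo");
if (newMemo) {
  const stamp = new Date().toISOString();
  const entry = `${stamp} ${userName}:\n${newMemo}\n\n`;
  await table.updateRecordAsync(recordId, {
    "AllMemos": entry + record.getCellValueAsString("AllMemos"),
    "NewMemo": "", // clear the input field for the next memo
  });
}
```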

After some careful analysis of my existing databases in FileMaker, I’m confident that the character limit on long text fields is the easier one to deal with. If a particular record’s AllMemos field starts to get anywhere near 100,000 characters in length, I’ll just move two-thirds of it over to an AllMemosOlder field. But I’m not much worried about that any more. I’ve done a detailed analysis of the app that makes the heaviest use of notes, and in that app, less than 0.5% of the records have over 50,000 characters, and 90+% of the records have notes shorter than 5,000 characters.
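If I ever do need that rollover, it’s a small extension of the same script. Continuing with the variables from the sketch above (and assuming AllMemosOlder is added to the fields fetched there; the threshold and split point are just illustrative):

```javascript
// Sketch of the archive rollover: when AllMemos nears the 100,000-character
// limit, move the older two-thirds into AllMemosOlder. Since memos are
// prepended, the newest text is at the top of the field.
const LIMIT_THRESHOLD = 90000; // start archiving well before the hard limit

const memos = record.getCellValueAsString("AllMemos");
if (memos.length > LIMIT_THRESHOLD) {
  const keep = Math.floor(memos.length / 3); // keep the newest third
  await table.updateRecordAsync(recordId, {
    "AllMemos": memos.slice(0, keep),
    // AllMemosOlder would eventually need its own rollover, too.
    "AllMemosOlder":
      memos.slice(keep) + record.getCellValueAsString("AllMemosOlder"),
  });
}
```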