Re: Millions of Records Through (Not In) Airtable


This question caught my eye on OpenSide’s Slack channel this morning. It’s a common question for anyone doing anything serious with Airtable. I’ve written about this subject many times, including here and here.

Does anyone know if Airtable can support more than 100k records per base on enterprise?

I generally try to reframe this topic toward a more important question you should be asking:

Is there a way to optimize the data model to accommodate the equivalent of at least a year's worth of transactions?

High-volume record requirements are generally indicative of transactions: data flows that become less and less relevant as time passes. And there is at least one trick you can use to expand capacity for small transaction records: instead of storing individual transactions as records in a linked (relational) table, store them as JSON objects in a long text field on the parent record. This is, of course, a sanctioned approach, as evidenced by this Airtable feature.
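To make that concrete, here's a rough sketch of what appending a transaction to such a field might look like in an Airtable script block. The table name ("Clients"), field name ("Transactions"), and the transaction properties are all made up for illustration:

```javascript
// Hypothetical sketch: append a small transaction to a JSON payload stored in a
// long text field ("Transactions") on a client record, instead of creating a
// new record in a linked transactions table.
const clients = base.getTable("Clients");
const client = await input.recordAsync("Pick a client", clients);

if (client) {
    const raw = client.getCellValueAsString("Transactions");
    const transactions = raw ? JSON.parse(raw) : [];

    // Each transaction is a small object rather than a row in a linked table.
    transactions.push({
        date: new Date().toISOString(),
        sku: "A-1001",     // hypothetical values
        qty: 3,
        amount: 42.5
    });

    await clients.updateRecordAsync(client, {
        "Transactions": JSON.stringify(transactions)
    });
}
```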

To pull this off, a button on the client record fires a process (perhaps a script block) that unpacks the JSON object containing all of the transactions related to that client and renders them in a script block grid control. Alternatively, users can copy the transactions into a table reserved for that purpose, which makes viewing them more pleasant. It's not ideal, but it does provide some options.
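A minimal sketch of that reader script might look something like this, again assuming the hypothetical "Clients" table and "Transactions" long text field. When fired from a button field, the record prompt resolves automatically to the clicked record:

```javascript
// Hypothetical sketch for an Airtable scripting block triggered by a button field.
// Assumes a "Clients" table with a long text field "Transactions" holding a JSON array.
const clients = base.getTable("Clients");

// When run from a button field, this resolves to the clicked record without prompting.
const client = await input.recordAsync("Pick a client", clients);

if (client) {
    const raw = client.getCellValueAsString("Transactions");
    const transactions = raw ? JSON.parse(raw) : [];

    if (transactions.length === 0) {
        output.text("No transactions stored for this client.");
    } else {
        // Render the unpacked transactions in the scripting block's grid control.
        output.table(transactions);
    }
}
```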

No one can really assess whether clever tricks like this will overcome record-count limits without first understanding the operational requirements. The idea is to question the data model in the context of Airtable's harsh limitations and try to discover ways to accommodate the business requirements without making a bigger mess.

For example, imagine a "reflector" that continuously maintains a replica of all data in Firebase, paired with a process that quietly prunes Airtable in the background. This approach makes it possible for Airtable to capture up to 4 billion records without ever needing manual archiving or suffering performance issues from flying too close to the 100,000-record ceiling.
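A reflector like that could take many forms; here's one rough sketch in Node.js using the Firebase Admin SDK and the Airtable REST API. The base ID, table name, 90-day window, and filter formula are all assumptions, not a prescription:

```javascript
// Hypothetical reflector sketch: mirror aging Airtable records into Firestore,
// then prune them from Airtable so the base stays well under its record ceiling.
const admin = require("firebase-admin");
admin.initializeApp(); // relies on GOOGLE_APPLICATION_CREDENTIALS

const db = admin.firestore();
const AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Transactions";
const HEADERS = { Authorization: `Bearer ${process.env.AIRTABLE_API_KEY}` };

async function reflectAndPrune() {
    // Pull records created more than 90 days ago (formula is illustrative).
    const filter = encodeURIComponent("DATETIME_DIFF(NOW(), CREATED_TIME(), 'days') > 90");
    const res = await fetch(`${AIRTABLE_URL}?pageSize=100&filterByFormula=${filter}`, { headers: HEADERS });
    const { records } = await res.json();

    for (const rec of records) {
        // 1. Reflect: upsert the record into Firestore, keyed by its Airtable record ID.
        await db.collection("transactions").doc(rec.id).set(rec.fields, { merge: true });

        // 2. Prune: remove the record from Airtable now that the replica holds it.
        await fetch(`${AIRTABLE_URL}/${rec.id}`, { method: "DELETE", headers: HEADERS });
    }
}

reflectAndPrune().catch(console.error);
```

Run on a schedule (a cron job or a cloud function), something in this spirit keeps the operational window in Airtable while the full history accumulates in Firestore.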

Often, significant transaction flows through Airtable are operationally relevant for a while and then become baggage suitable only for aggregate analytics or transactional research. The reflector approach overcomes this relatively inexpensively, but it does require you to think carefully about the use-case requirements for that analytics or research. FireFoo, for example, costs $9/month and provides a rudimentary UI for browsing and querying very large data sets in Firebase. You can also instantly add a layer of reflection into Google BigQuery to get a full SQL engine under your Firebase datastore.

I have three clients who have employed this technique with really good results; one has 15M records in Firebase, all of which went through Airtable forms and incoming webhooks.

Two of them have designed really nice Interfaces as dashboards that include aggregate metrics from Firebase, updated hourly and pushed (in aggregate) back into specific tables. This provides a seamless, all-inclusive view of 100% of the data (all 15M records). Anyone who looks at their operation assumes Airtable has 15M records in it, but it's an illusion, of course.
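The hourly push-back can be quite small; here's a rough sketch of the shape it might take, using a Firestore server-side count and a single summary record in a hypothetical "Metrics" table (all IDs and field names are made up):

```javascript
// Hypothetical hourly job: compute an aggregate metric in Firestore and push it
// back into a small "Metrics" table in Airtable for an Interface dashboard to display.
const admin = require("firebase-admin");
admin.initializeApp();

const db = admin.firestore();
const METRICS_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Metrics/recYYYYYYYYYYYYYY";
const HEADERS = {
    Authorization: `Bearer ${process.env.AIRTABLE_API_KEY}`,
    "Content-Type": "application/json"
};

async function pushHourlyMetrics() {
    // Server-side aggregation: count reflected transactions without reading them all.
    const snapshot = await db.collection("transactions").count().get();
    const totalTransactions = snapshot.data().count;

    // Update a single summary record that the dashboard reads.
    await fetch(METRICS_URL, {
        method: "PATCH",
        headers: HEADERS,
        body: JSON.stringify({
            fields: {
                "Total Transactions": totalTransactions,
                "Last Updated": new Date().toISOString()
            }
        })
    });
}

pushHourlyMetrics().catch(console.error);
```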

2 Replies
Cycle_Cycle
6 - Interface Innovator

This is amazing @Bill.French. I’ve read it three times and I’m about to read it three more.

Your workarounds, as always, are genius.

Genius is a term that should be reserved for the 1% of the brightest 1% - I’m more like the working class of genius. LOL