
BIG base - has anyone encountered slow performance?

Topic Labels: Base design
Omer_Nesher
6 - Interface Innovator

Hey all,

I have a big base. 44,740 records.
This thing is slow!
Searching / scanning a barcode via mobile phone almost halts the app.
Desktop experience isn’t much better.

Smaller bases work alright.

Is it just me? Is there anything I can do, or is it a server issue?

4 Replies
Pete
7 - App Architect

Hi Omer – really sorry that you’re experiencing base slowness.

Improving Airtable’s speed and performance is and will continue to be a top priority.

In general, base performance will improve as a base is simplified, and these improvements should be realized automatically as changes are made. Here are a few suggestions that may help improve performance:

  • If you are near the record limit for this base, consider deleting any unused records.

  • Eliminate any non-crucial formula / linked record / rollup fields. Paring down the complexity of your linkages or formulas can greatly reduce latency.

  • If you’re making many calls to the API, consider reducing API usage or inserting a pause between calls; high write volume can affect base performance for large bases (see the throttling sketch after this list).
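
On the last point: Airtable accepts at most 10 records per write request and roughly 5 requests per second per base, so batching plus a short pause keeps you under the limit. Here is a minimal sketch in TypeScript using the REST API directly; the base id, table name, and token are placeholders.

```ts
// Throttled writes via the Airtable REST API (sketch; Node 18+ for global fetch).
// BASE_ID, TABLE, and the token are placeholders -- substitute your own.
const BASE_ID = "appXXXXXXXXXXXXXX";
const TABLE = "Inventory"; // hypothetical table name
const TOKEN = process.env.AIRTABLE_TOKEN!;

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

type Update = { id: string; fields: Record<string, unknown> };

// Airtable accepts at most 10 records per write request and ~5 requests/second
// per base, so batch the records and pause between batches.
async function patchInBatches(updates: Update[]): Promise<void> {
  for (let i = 0; i < updates.length; i += 10) {
    const res = await fetch(
      `https://api.airtable.com/v0/${BASE_ID}/${encodeURIComponent(TABLE)}`,
      {
        method: "PATCH",
        headers: {
          Authorization: `Bearer ${TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ records: updates.slice(i, i + 10) }),
      }
    );
    if (!res.ok) throw new Error(`Airtable write failed: ${res.status}`);
    await sleep(250); // stay comfortably under the rate limit
  }
}
```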

This is a sure sign your data model design could use some rethinking. Consider these points:

  1. It’s not a good idea to assume everything will scale; scale is not Airtable’s strong suit, and you knew (or should have known) this going in.

  2. Your app is mobile, which typically means less connectivity and bandwidth than a desktop app. Mobile apps require a little planning if you need to scan 50,000 records or more from the field. It’s very likely that Airtable’s perceived performance is not Airtable at all; it’s simply a mobile app trying to load 50,000 items across an LTE connection at best and into a [comparatively] tiny processor.

  3. You have a few outs; here’s one: consider a hash index into your Airtable data, using the barcode as the key and the Airtable record as the attribute. This would be easy to build with the Airtable API, producing an Elasticsearch-like index (see the sketch after this list).

  4. Imagine a little web app that loads the hash index, takes a scanned barcode, and performs a lookup into the index. A web app like this could easily handle up to one million items, and the lookup performance would be blisteringly fast, about 250ms on average.

  5. The web app would display the hit(s) and link directly to the Airtable mobile app, providing instant lookup of the desired record.

  6. Bob’s your uncle.
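
To make points 3 and 4 concrete, here’s a minimal sketch of the idea in TypeScript. It assumes a field literally named “Barcode” (hypothetical; adjust to your schema), placeholder base/table/token values, and Node 18+ for the built-in fetch. It’s an illustration of the approach, not a drop-in implementation.

```ts
// Barcode -> record hash index built from the Airtable REST API (sketch).
// Assumes a field named "Barcode"; Node 18+; BASE_ID/TABLE/TOKEN are placeholders.
const BASE_ID = "appXXXXXXXXXXXXXX";
const TABLE = "Inventory";
const TOKEN = process.env.AIRTABLE_TOKEN!;

type AirtableRecord = { id: string; fields: Record<string, unknown> };

// Build the index once (server-side or at app start), paging through the table
// with the list endpoint's `offset` cursor, 100 records per page.
async function buildBarcodeIndex(): Promise<Map<string, AirtableRecord>> {
  const index = new Map<string, AirtableRecord>();
  let offset: string | undefined;

  do {
    const url = new URL(`https://api.airtable.com/v0/${BASE_ID}/${encodeURIComponent(TABLE)}`);
    url.searchParams.set("pageSize", "100");
    if (offset) url.searchParams.set("offset", offset);

    const res = await fetch(url, { headers: { Authorization: `Bearer ${TOKEN}` } });
    if (!res.ok) throw new Error(`Airtable read failed: ${res.status}`);
    const page: { records: AirtableRecord[]; offset?: string } = await res.json();

    for (const rec of page.records) {
      const barcode = rec.fields["Barcode"];
      if (typeof barcode === "string") index.set(barcode, rec);
    }
    offset = page.offset; // undefined on the last page
  } while (offset);

  return index;
}

// The per-scan lookup is an in-memory hash probe -- no table scan on the phone.
function lookup(index: Map<string, AirtableRecord>, scannedBarcode: string) {
  return index.get(scannedBarcode); // undefined if the barcode isn't indexed
}
```

From there, point 5 is just a matter of rendering the hit and linking its record id back into Airtable.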

Hey Bill,

While you are indeed right in practice, I still argue that this situation shouldn’t happen.

  1. A 50,000-record DB is not a big DB.
  2. I’m a paying client, so if Airtable sets the limit at 50K, they should understand what SLA means and should have known that they can absolutely support this number of records. If the UX of 50K records isn’t up to par with a reasonable UX, then don’t offer it; offer something YOU KNOW you can support and maintain.
  3. I’m running this on a Samsung S20 connected to fiber, so I doubt it’s something to do with my environment.
  4. I’ll have a go at your advice. Thanks! I appreciate it very much.
  5. Who the heck is Bob? :winking_face:

It is to a soccer mom. :winking_face: Which is just as irrelevant as your statement.

Okay - so with the levity out of the way, let’s unpack this a bit more.

Jamming a 50,000-record table with 1,000 fields into the active memory of a cell phone might be considered “big”. It depends, right? It’s possible, but there are mitigating factors such as the caching architecture of the device, the available storage on the device, and, the biggest one, the speed of the device’s storage system (a back-of-envelope estimate is below).
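
For a rough sense of scale (assumed numbers, not measurements), width multiplies quickly:

```ts
// Back-of-envelope only; the per-cell cost is an assumption, not a measurement.
const records = 50_000;
const fieldsPerRecord = 1_000;
const bytesPerCell = 50; // assumed average for a field name plus a short value

const approxGiB = (records * fieldsPerRecord * bytesPerCell) / 1024 ** 3;
console.log(`${approxGiB.toFixed(1)} GiB`); // ≈ 2.3 GiB before any app overhead
```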

Indeed, it shouldn’t. But I suspect we can also agree that, invariably, vendors will bring products to market whose performance is inconsistent with customer expectations and industry standards. Airtable has a well-documented [public] history of performance issues dating back to 2016. I used that data to avoid squeezing my clients into a possible bind. And I have also helped a few clients recalibrate their expectations by showing them how to simulate a 10-million-record (Air)table.

Agree. They should. I’m not defending Airtable per se, but it’s my experience that any database with a [stated] relatively low ceiling is probably one worth running a few benchmarks against before building anything or relying on that statement. It only took me an hour to build the tests to validate that while 50,000 records were possible, it was certainly not a recommended practice (a rough sketch of such a test is below).
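
Not the original test, but a timing harness along these lines only takes minutes to write; it compares a full-table scan with a single server-side filtered lookup (TypeScript, placeholder ids and a hypothetical “Barcode” field):

```ts
// Rough benchmark sketch (not the original test): time a full-table scan and a
// single server-side filtered lookup. Node 18+ fetch; placeholders as before.
const BASE_ID = "appXXXXXXXXXXXXXX";
const TABLE = "Inventory";
const TOKEN = process.env.AIRTABLE_TOKEN!;
const headers = { Authorization: `Bearer ${TOKEN}` };

const listUrl = (params: Record<string, string>) => {
  const url = new URL(`https://api.airtable.com/v0/${BASE_ID}/${encodeURIComponent(TABLE)}`);
  for (const [key, value] of Object.entries(params)) url.searchParams.set(key, value);
  return url;
};

async function timeIt(label: string, work: () => Promise<unknown>): Promise<void> {
  const started = Date.now();
  await work();
  console.log(`${label}: ${Date.now() - started} ms`);
}

async function main() {
  // How long does it take just to read every record out of the base?
  await timeIt("full scan", async () => {
    let offset: string | undefined;
    do {
      const params: Record<string, string> = { pageSize: "100" };
      if (offset) params.offset = offset;
      const res = await fetch(listUrl(params), { headers });
      const body: { records: unknown[]; offset?: string } = await res.json();
      offset = body.offset;
    } while (offset);
  });

  // Roughly what one barcode scan costs if each scan queries Airtable directly.
  await timeIt("single filtered lookup", () =>
    fetch(listUrl({ filterByFormula: "{Barcode} = '0123456789'", maxRecords: "1" }), { headers })
      .then((res) => res.json())
  );
}

main().catch(console.error);
```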

Google advertises 5 million cells as the ceiling; Airtable advertises 50,000 records as its ceiling. You do see the missing variable in this comparison, right? Records say nothing about width: 50,000 records times 100 fields is already 5 million cells. I’m afraid of widths, not heights.

A Google Sheet with 5 million cells (populated or not) looks and feels a lot like an Airtable base with 50,000 records. What does that indicate? It tells me that the tipping point is not likely the underlying architecture; rather, it’s the limitations of the client compute stack typically employed.

My assessment is that neither 5 million cells nor 50,000 records will ever be performant given today’s commonly available consumer devices.

I’m not an expert in matters of the S20’s processor, memory, or 5G, LTE, and fiber. But I do know that mobile connectivity, even when it’s fiber-backed, is gated by massive infrastructure, including “small” cell towers that act as repeaters. Without enough nearby small towers, your 5G, fiber-backed experience can be abysmal. And even if you are getting fairly good connection indicators, you have other factors to consider, including the architecture of the device itself. Just saying “5G and fiber” doesn’t magically rule them out as the cause of the sluggish performance. I would not be so quick to dismiss connectivity and/or the compute stack as suspects; there are numerous reports of S20s struggling when low on storage, for example.

Summary

It’s my experience that performance problems rarely have a single cause. In your case, I think there are three, possibly four, causes:

  1. Available storage on the device is constraining data caching operations.
  2. 5G and/or the fiber infrastructure is not delivering optimal performance.
  3. App architecture.
  4. Airtable itself, 'nough said…