Smart way to work with Zipcode table that is over the limits

Topic Labels: Base design

Hello all,

I am a new user and working on a project to help people find what trees are native to their location. I am building a database to match plants to zipcodes, which are also mapped to growing zones, etc.

My question is about working smartly with my zipcode base, which is over the 50,000 record limit for the plan I am on. I realize that I do not need all of these records, but really just the ones that relate to users on my app. Is there a good way for me to create a “just what I need” base on-the-fly as users add their location information?

Other smart ideas are very very welcome. The answer can’t simply be to have an enormous database of zipcodes when it turns out I may only need 5 today, and 500 by the end of the year, can it?

Again, many thanks for your help.

Chris Robinson

8 Replies

Sure, you can create a zip code field in your users table that is a linked record field (which links to your zip codes table). Every time you enter a new zip code there that hasn’t been used before, it will automatically create that zip code as a new record in your zip code table.

That sounds like a really cool project, is there an alpha up somewhere that you could link me to?

How do users interact with your project, though? I’d love to provide suggestions, but without knowing exactly how the workflow works, all the stuff I’m thinking about feels kind of wrong, ha

For example, I was thinking that if the zipcode data doesn’t change much, you could technically convert it all into a huge JSON object, then run a script whenever a new user signs up that adds the relevant zipcode records into the database.
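A minimal sketch of that idea, assuming the full dataset lives outside Airtable as a plain JSON map keyed by zipcode (the field names here are illustrative, not Airtable's API):

```javascript
// Hypothetical: the full zip dataset kept as a JSON object outside the base.
// Only the entry a new user actually needs ever gets copied into Airtable.
const ZIP_DATA = {
  "97201": { city: "Portland", state: "OR", growingZone: "8b" },
  "73301": { city: "Austin",   state: "TX", growingZone: "8b" },
};

// Called when a user signs up: returns the fields for the one zip code
// record that should be created in the base, or null if the zip is unknown.
function zipRecordFor(zip) {
  const entry = ZIP_DATA[zip];
  if (!entry) return null;
  return { Zip: zip, ...entry };
}
```

The actual record creation would then be a single API call with the returned fields, so the base only ever holds zips that real users have entered.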

(I’m assuming each record is a zipcode that’s linked to plant records)

This would mean maybe a 15-30 second lag time though, which means it would only really work if we don’t expect users to change their location often (i.e. they add their location during account sign-up or something). This would definitely not work if it was some sort of scrollable map, for example.


Thanks for the comment. Help me understand this a bit better, please. If I am linking to the full zipcode table, then that is what I am doing now, which doesn’t obviate my need for a full table of zipcodes, right?

Full disclosure, I am also storing zipcode-related info in my zipcode table, including a base lat/long, growing zone, county, city, state, etc. I have a version of the table that breaks the zips into states, which removes one field, but it’s still big. I can trim the fat on that a bit, but it is still too big to fit comfortably under the 50,000-record limit on my Airtable plan.


The only way that my idea could work is if you started off with a completely empty zip code table, and then you would add in all the other stuff (latitude, longitude, growing zone, county, city, state, etc.) after the fact — AFTER the few zip codes that you actually needed were added as new records into the zip code table.
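A rough sketch of that after-the-fact enrichment, assuming an automation fires when a bare zip record is created. `fetchZipDetails` is a stand-in for whatever zip code API or data source you actually use, and the returned fields are placeholders:

```javascript
// Hypothetical stand-in for a real zip code API call; in practice this
// would fetch county, city, state, growing zone, etc. for the given zip.
async function fetchZipDetails(zip) {
  return { county: "Multnomah", city: "Portland", state: "OR" };
}

// Runs after a bare zip record appears: merges the looked-up details
// onto the record so the extra columns get filled in automatically.
async function enrichZipRecord(record) {
  const details = await fetchZipDetails(record.Zip);
  return { ...record, ...details };
}
```

In an Airtable automation, the merged object would feed an update-record step rather than being returned.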

And you could probably even automate adding in all of that extra stuff by tapping into some zip code APIs like the ones at — or whatever data source you’re pulling your zip code information from, as long as they have an API.

Otherwise, you’d have to figure out some Javascript workaround like what @Adam_TheTimeSavingCo was mentioning above, or you’d need to turn to a more robust enterprise-level database system like Claris FileMaker Pro, which was my previous development language for 30 years before I became an Airtable consultant. If you’re interested in testing the waters with Claris FileMaker Pro, feel free to reach out to me through my website and I’ll put you in touch with some great FileMaker developers.

I can also help you out with API integration of Airtable, if you decide to stick with Airtable:

Airtable consultant — ScottWorld

Why would there be such a long lag time?

Hmm… I was thinking:

  1. API call to create record
  2. Wait for “on record creation” trigger to execute
  3. Script runs and identifies / formats all records to be created
  4. Records created
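The steps above can be sketched as a polling wait: create the record, then check back until the automation has filled in a derived field. The field name `growingZone` and the timings are assumptions for illustration:

```javascript
// Rough sketch of why the lag adds up: after the initial API call,
// poll the record until the automation has populated a derived field
// (here, assumed to be growingZone), or give up after a timeout.
async function waitForEnrichment(getRecord, { intervalMs = 2000, timeoutMs = 30000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const rec = await getRecord();
    if (rec && rec.growingZone) return rec; // automation finished
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("enrichment did not finish in time");
}
```

With a 2-second poll interval plus automation trigger latency, a 15-30 second worst case is plausible.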

And 15-30 seconds for it to complete felt right to me

Reckon that might be too pessimistic?

I guess when you include the back and forth with the REST API calls, and waiting long enough to be sure that the automation ran, your time estimates are reasonable. I was thinking about the script execution time.

This is the right way to think about location science; however, Airtable has no facility to maintain “huge” anything. :winking_face:

Perhaps not very relevant here, but location data often calls for polys, like zip code boundaries. If you have the polys, you can determine whether a given lat/lng falls inside the geo-area without storing anything else; it’s just a geo-algorithm. This project has state polys embedded in a long text field, which is ideal if you can reduce the boundary resolution to 200 meters, a form of GeoJSON compression. That approach allowed me to store all fifty states and 3,200+ county boundaries inside Airtable (similar to this project). Note that the Texas and California boundaries are larger than 100k and thus unable to fit into a single record.
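The geo-algorithm in question is typically a ray-casting point-in-polygon test, which is standard computational geometry rather than anything Airtable-specific. A minimal sketch, assuming the polygon is an array of `[lng, lat]` pairs:

```javascript
// Classic ray-casting point-in-polygon test: cast a ray from the point
// and count edge crossings; an odd count means the point is inside.
// point is [x, y] (e.g. [lng, lat]); polygon is an array of [x, y] pairs.
function pointInPolygon([x, y], polygon) {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const [xi, yi] = polygon[i];
    const [xj, yj] = polygon[j];
    const crosses =
      (yi > y) !== (yj > y) &&
      x < ((xj - xi) * (y - yi)) / (yj - yi) + xi;
    if (crosses) inside = !inside;
  }
  return inside;
}
```

Given a boundary poly stored in a long text field, you could parse it and run this test against a user's lat/lng without storing any per-zip lookup data at all.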

This is what typically inspires me to find a way to compress the data. You must ask: is it possible to compress a few hundred records inside a single field?