And I’m guessing this is likely to reach 100 bases someday, right? Let’s round it up to 100 for safety’s sake.
And before going nuts on the best implementation approach, tell me: why does the data need to be in one table? Also, please reassure me that you’ve scoped the data set and its photos (and other hefty binary files) to be sure Airtable can handle the aggregate content size you anticipate.
Something, or someone, is going to expect this data in a unified table, but for what purpose exactly? Is it simply a publishing process? Search? Review and hand-off to other departments? Are you going to edit the unified data?
I’m pretty sure you can’t use the new sync feature to funnel 100 separate base/tables into a single base/table. How do you feel about creating 100 Zap recipes? Yeah - I didn’t think so - that’s a bad idea anyway.
You are in a box canyon and there are rain clouds forming upstream. You need to climb out, and I suspect the only route available - assuming all other constraints are rigid - is to unify this data set with a custom process.
Yes, it’s feasible, but it may not be practical. I’ll let you decide.
It is certainly possible to instrument every vendor table with a script action that watches for new (or changed) records and adds them automatically to the master base/table. But that requires developing a single action script and deploying a copy of it into each existing vendor base. This is not ideal, of course; 100 separate instances invite human edit failures, debugging headaches, per-instance testing, and so on.
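For what it’s worth, here’s a rough sketch of what that per-base action script might look like as an Airtable automation script (trigger: “When a record is created or updated”). The master base ID, table name, field mappings, and the `apiKey` input variable are all assumptions I’ve made for illustration; the push goes through Airtable’s standard REST API because an automation script can’t write directly into another base.

```javascript
// Airtable automation script, deployed separately in each vendor base.
// Trigger: "When a record is created or updated" on the vendor table.
// Everything below (IDs, names, fields) is a placeholder for this sketch.
const config = input.config(); // input variables mapped from the trigger record

const MASTER_BASE_ID = 'appMasterXXXXXXXX';                   // hypothetical master base ID
const MASTER_TABLE   = encodeURIComponent('Unified Vendors'); // hypothetical master table

// An automation script can't write directly into another base, so the
// push goes through Airtable's standard REST API.
const response = await fetch(
  `https://api.airtable.com/v0/${MASTER_BASE_ID}/${MASTER_TABLE}`,
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${config.apiKey}`, // key passed in as an input variable
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      records: [{
        fields: {
          'Source Record ID': config.recordId, // pointer back to the vendor record
          'Name': config.name,                 // example mapped fields
          'Status': config.status,
        },
      }],
    }),
  }
);

if (!response.ok) {
  throw new Error(`Push to master failed with status ${response.status}`);
}
```

Multiply that by 100 deployments and you can see where the maintenance pain comes from.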
An alternative approach:
- A single script process looks for updates based on last-modified record dates; the process runs hourly against all 100 vendor bases.
- As it finds updates, it syncs the data into the master base/table.
I am biased (for good reason), so I would build such a beast in Google Apps Script. The script blocks in Airtable might be able to handle this, but it would be long-running at times, and script automations have a pretty short tolerance for long-running processes.
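To make that concrete, here’s a minimal Apps Script sketch of the hourly poller, assuming (a) an API key stashed in Script Properties, (b) a `filterByFormula` test against `LAST_MODIFIED_TIME()`, and (c) a hard-coded list of vendor base IDs. Every base ID, table name, and field name is illustrative; a production version would also follow the REST API’s `offset` pagination and batch its writes.

```javascript
// Google Apps Script: hourly poller that sweeps every vendor base for
// recently modified records and pushes them into the master base.
// Base IDs, table names, and field names here are illustrative assumptions.
var API_KEY        = PropertiesService.getScriptProperties().getProperty('AIRTABLE_KEY');
var VENDOR_BASES   = ['appVendor001', 'appVendor002' /* ... ~100 base IDs */];
var VENDOR_TABLE   = 'Products';
var MASTER_BASE_ID = 'appMaster0000000';
var MASTER_TABLE   = 'Unified Products';

function syncVendors() {
  // Only fetch records touched since the last run (hourly cadence).
  var since   = new Date(Date.now() - 60 * 60 * 1000).toISOString();
  var formula = "IS_AFTER(LAST_MODIFIED_TIME(), '" + since + "')";

  VENDOR_BASES.forEach(function (baseId) {
    var url = 'https://api.airtable.com/v0/' + baseId + '/' +
      encodeURIComponent(VENDOR_TABLE) +
      '?filterByFormula=' + encodeURIComponent(formula);

    var resp = UrlFetchApp.fetch(url, {
      headers: { Authorization: 'Bearer ' + API_KEY },
      muteHttpExceptions: true,
    });
    // Note: a production version would also loop on the "offset" value
    // returned by the API to page through more than 100 records.
    var records = JSON.parse(resp.getContentText()).records || [];

    // Write each update into the master table. A hardened version would
    // upsert (PATCH keyed on Source Record ID) and batch 10 records per call.
    records.forEach(function (rec) {
      UrlFetchApp.fetch(
        'https://api.airtable.com/v0/' + MASTER_BASE_ID + '/' +
          encodeURIComponent(MASTER_TABLE),
        {
          method: 'post',
          contentType: 'application/json',
          headers: { Authorization: 'Bearer ' + API_KEY },
          payload: JSON.stringify({
            records: [{ fields: Object.assign({}, rec.fields, {
              'Source Base': baseId,        // provenance fields in the master
              'Source Record ID': rec.id,
            }) }],
          }),
        }
      );
    });
  });
}

// One-time setup: run this once to install the hourly trigger.
function installTrigger() {
  ScriptApp.newTrigger('syncVendors').timeBased().everyHours(1).create();
}
```

Run `installTrigger()` once and Apps Script will sweep all 100 bases every hour, with a far more generous execution-time ceiling than Airtable’s script automations allow.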