Airtable is already a great way to give folks in our business the ability to edit a wide variety of datasets. With Interface Designer, we expect to move even more of our data into Airtable.
But Airtable is just one piece of a large ecosystem of business systems. We need to connect it to a wide variety of custom internal and off-the-shelf systems. Event buses like Kafka make adding new connections easy: instead of wiring every system to every other system, you connect each system to the bus once.
It would be great to have a Kafka connector for Airtable so we could stream every update out to the systems that care about it. Other data stores, like Snowflake and Redshift, already have official connectors.
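For reference, a Kafka Connect source connector is usually configured with a small JSON document submitted to the Connect REST API. If Airtable shipped one, the configuration might look roughly like this; the connector class and every Airtable-specific property below are hypothetical, since no such connector exists today:

```json
{
  "name": "airtable-source",
  "config": {
    "connector.class": "AirtableSourceConnector",
    "tasks.max": "1",
    "airtable.api.key": "<secret>",
    "airtable.base.id": "appXXXXXXXXXXXXXX",
    "airtable.tables": "Orders,Customers",
    "topic.prefix": "airtable."
  }
}
```

The appeal of the Connect framework is that it would handle offsets, retries, and scaling for us, which is exactly the plumbing a custom integration forces you to rebuild.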
There are two ways to build this on the current platform: the Script app and the REST API. Both approaches work, but both require a lot of custom code to poll for changes and push them out. It would be great to have an officially supported option.
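To illustrate why the DIY route is painful: since the REST API has no change feed, a custom bridge has to poll, keep its own snapshot of each table, and diff snapshots to detect changes. Here is a minimal sketch of that diffing step; the event shape and record-id naming are my own assumptions, not an Airtable or Kafka format:

```javascript
// Compare two table snapshots (maps of record id -> fields) and emit
// change events. A polling loop would fetch the current snapshot from
// the Airtable REST API, diff it against the previous one, and forward
// the resulting events to Kafka.
function diffSnapshots(prev, curr) {
  const events = [];
  for (const [id, fields] of Object.entries(curr)) {
    if (!(id in prev)) {
      // Record exists now but not before: it was created.
      events.push({ type: "created", id, fields });
    } else if (JSON.stringify(prev[id]) !== JSON.stringify(fields)) {
      // Field values differ from the last snapshot: it was updated.
      events.push({ type: "updated", id, fields });
    }
  }
  for (const id of Object.keys(prev)) {
    // Record existed before but is gone now: it was deleted.
    if (!(id in curr)) events.push({ type: "deleted", id });
  }
  return events;
}
```

Every team that wants Airtable data on the bus ends up rewriting some version of this loop, along with rate limiting, error handling, and state storage.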
Scott, @James_Rosen is right to reject the SDK and REST APIs (the pathways used by no-code glue tools like Integromat and Zapier). They are inefficient: high-latency, complex, and costly to build and maintain.
Real-time integration technologies like Kafka make it possible to stream data between apps, negating the need for REST APIs altogether. In doing so, you speed up the interchange by orders of magnitude (roughly 50ms versus 2,000 to 5,000ms).
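To make the contrast concrete, here is a minimal sketch of the streaming side. It assumes the kafkajs client library, a local broker, and a made-up change-event shape; none of this is an official Airtable integration:

```javascript
// Build a Kafka message from a hypothetical Airtable change event.
// Keying by record id keeps all updates for one record in order
// within a single partition.
function toKafkaMessage(event) {
  return { key: event.recordId, value: JSON.stringify(event) };
}

// Publish a batch of change events. The kafkajs library, topic name,
// and broker address are illustrative assumptions.
async function publish(events) {
  const { Kafka } = require("kafkajs");
  const kafka = new Kafka({
    clientId: "airtable-bridge",
    brokers: ["localhost:9092"],
  });
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "airtable.records",
    messages: events.map(toKafkaMessage),
  });
  await producer.disconnect();
}
```

A consumer subscribed to that topic sees each change within milliseconds of the send, instead of waiting for the next polling interval of a REST-based integration.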
You also reduce the memory and compute burden on clients while expanding the types of devices that can contribute data. Think mangOH Red: tiny IoT boards chirping sensor data.
Kafka is similar in spirit to MQTT and PubNub, and even Firebase is fundamentally architected as a real-time database. Soon, all databases will have to work like this, because it’s the right way to move data.
I asked Airtable about real-time data integration in 2018. They had no plans then, and it’s probably still not on the radar now, but it should be.