Preamble: I’m still working on implementing some features in the Scripting/Custom blocks which I think would remove the need for most of the Airtable API calls the proxy server is making. But even with custom blocks, I think I will still need a “get records for base” call that needs authentication.
Setup: Node.js app (12.x), Airtable.js (0.8.1). API key stored in a .env file that’s loaded into process.env.
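For reference, that setup amounts to something like this (just a sketch — the variable name AIRTABLE_API_KEY and the use of the dotenv package are my own conventions):

```javascript
// Load .env into process.env, then hand the key to Airtable.js.
require('dotenv').config();
const Airtable = require('airtable');

Airtable.configure({ apiKey: process.env.AIRTABLE_API_KEY });
```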
My application relies on a setup like @Bill.French described here, where there is a central “directory” base with one record per Airtable base. My proxy server can return a list of these records, which in turn gives me the Base ID I need to make CRUD calls against each individual base. This works great and supports the cross-base operations I need.
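To illustrate the directory pattern, here’s a minimal sketch of pulling Base IDs out of the directory records (the field name “Base ID” and the record shape are assumptions based on how my directory is laid out):

```javascript
// Given directory records (shaped like Airtable's REST response),
// pull out the Base ID field so CRUD calls can be routed to each base.
function baseIdsFromDirectory(records) {
  return records.map((r) => r.fields['Base ID']);
}

// Hypothetical directory record:
const records = [
  { id: 'recAAA', fields: { Name: 'Projects', 'Base ID': 'appXXXXXXXXXXXXXX' } },
];

console.log(baseIdsFromDirectory(records)); // → [ 'appXXXXXXXXXXXXXX' ]
```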
The question: Until now, authentication hasn’t been much of a concern. Most of the execution has happened locally, so I haven’t needed to worry about a 3rd-party user accessing/updating my bases by hitting random endpoints if they know the URL. And as I mentioned in the preamble, this ideally won’t be much of a problem moving forward, as the server transitions to just accepting JSON data to modify and return to an Airtable scripting/custom block. That wouldn’t prevent a 3rd-party user from hitting the endpoint, but the endpoint would no longer modify an Airtable base they shouldn’t have permission for.
For situations where the API key is still needed server-side, is there a way to pass it in dynamically? Adding it to the body of a POST request is my first idea, but I’m not sure how secure that is: in a man-in-the-middle attack, someone could intercept the key. Does some kind of hash/encryption solve that issue? Something like SubtleCrypto.digest() looks promising, but I don’t think it’s supported in scripting blocks (though it should be supported in a custom block??).
Apologies if all that was a bit long-winded/confusing. I’m happy to clarify anything that doesn’t make sense. I’m curious to hear if anyone has run into this issue or has any solutions!