Hi Airtable Community,
I’m facing a challenge importing large CSV files (10,000+ records) into Airtable and would love some guidance. Specifically, I need to automatically detect and delete duplicate records whenever they are added or updated via a CSV import.
Currently, I’m using a script to handle duplicates, but due to Airtable's 30-second execution limit, it times out when processing high-volume data. I’d appreciate any suggestions on:
- Optimized duplicate detection and deletion methods that can handle large data volumes quickly and efficiently, potentially using batch processing.
- Alternative automations or built-in Airtable solutions that would automatically identify duplicates at the point of import, or shortly thereafter, to help with overall processing speed and accuracy.
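For context, here is a rough sketch of the batched approach I have in mind (not my exact script). It assumes a single key field, here called "Email" as a placeholder, identifies a duplicate; the one-pass Map lookup keeps detection fast, and the chunking matches Airtable's 50-record limit per `deleteRecordsAsync` call:

```javascript
// Sketch: detect duplicates by one key field in a single pass,
// then split the IDs to delete into batches of 50 (Airtable's
// per-call delete limit). "Email" is a placeholder field name.
function findDuplicateIds(records, keyField) {
    const seen = new Map();
    const dupes = [];
    for (const rec of records) {
        const key = rec.fields[keyField];
        if (seen.has(key)) {
            dupes.push(rec.id); // keep the first record, flag later copies
        } else {
            seen.set(key, rec.id);
        }
    }
    return dupes;
}

function chunk(ids, size = 50) {
    const batches = [];
    for (let i = 0; i < ids.length; i += size) {
        batches.push(ids.slice(i, i + size));
    }
    return batches;
}

// In an Airtable script, deletion would then loop over the batches:
//   for (const batch of chunk(dupeIds)) {
//       await table.deleteRecordsAsync(batch);
//   }
```

Even structured this way, the delete loop alone can exceed the 30-second limit at 10,000+ records, which is why I'm wondering whether the detection step belongs outside the scripting block entirely.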
If anyone has successfully managed similar high-volume imports or has insights on achieving faster processing, I’d be grateful for your advice!
Thanks in advance.

