

Script exceeded execution time limit of 30 seconds

Topic Labels: Automations
Solved
Melrose_del_Win
4 - Data Explorer

Hi All,

I’m trying to run this automation script that works perfectly on my test base of ~10 records but throws this error when I move it to my real base with ~1100 records.

Error: Script exceeded execution time limit of 30 seconds

The code below is based on this tutorial, tweaked slightly with recommendations I found in another similar question: Learn Airtable scripting #1: basics & removing duplicates with Giovanni Briggs - YouTube (from the Automate All the Things channel). Any help or ideas would be appreciated!

var table = base.getTable("Table 1");

var query = await table.selectRecordsAsync({
    fields: ["Name"]
});

console.log(query);

// For each record, search the whole table for another record with the same Name.
let duplicates = query.records.filter((record) => {
    return query.records.find((potentialDuplicate) => {
        return record.getCellValue("Name") === potentialDuplicate.getCellValue("Name") && record.id !== potentialDuplicate.id;
    });
});

console.log(duplicates);

let updates = duplicates.map(update => {
    return {
        "id": update.id,
        fields: {
            "Duplicate": true
        }
    };
});

console.log(updates);

while (updates.length > 0) {
    // removed await in front of the below Async to see if that would improve time
    table.updateRecordsAsync(updates.slice(0, 50));
    updates = updates.slice(50);
}


3 Replies
Bill_French
17 - Neptune

Not surprising - that snippet makes roughly (1100 - 1) squared comparisons (over a million) to find duplicates; for each row, it looks at the other 1099 rows - really bad design.

You need to use hash indices to do this fast.
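
A rough sketch of that hash-index idea, assuming the same "Table 1", "Name", and "Duplicate" fields from the original script - build a Map from each Name to its record IDs in a single pass, then flag every group that has more than one ID:

let table = base.getTable("Table 1");
let query = await table.selectRecordsAsync({ fields: ["Name"] });

// One pass over the records: group record IDs by Name.
let idsByName = new Map();
for (let record of query.records) {
    let name = record.getCellValue("Name");
    if (!idsByName.has(name)) idsByName.set(name, []);
    idsByName.get(name).push(record.id);
}

// Any Name that maps to more than one record ID is a duplicate group.
let updates = [];
for (let ids of idsByName.values()) {
    if (ids.length > 1) {
        for (let id of ids) {
            updates.push({ id: id, fields: { "Duplicate": true } });
        }
    }
}

// updateRecordsAsync accepts at most 50 records per call, so write in batches.
while (updates.length > 0) {
    await table.updateRecordsAsync(updates.slice(0, 50));
    updates = updates.slice(50);
}

This touches each record once instead of ~1100 times, so a table of that size should finish well under the 30-second limit.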

Thanks! I will try it out.

On a different note…

All asynchronous methods must be await-ed to work correctly. Without the await, the script can finish before updateRecordsAsync has completed, so some of the updates might never get saved.
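
For example, a sketch of the original batching loop with the await restored:

while (updates.length > 0) {
    // Wait for each batch of up to 50 updates to be written before sending the next.
    await table.updateRecordsAsync(updates.slice(0, 50));
    updates = updates.slice(50);
}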