Re: Script exceeded execution time limit of 30 seconds

tdang79
4 - Data Explorer

Hi All, 

I’m trying to run an automation script that worked fine on a base with about 500 records, but after I migrated some records and the base grew to ~3,500 records, it now throws this error.

Can someone help me fix this error?

Thank you in advance!

 

// Main table on which you want to run the lookup
let mainTable = base.getTable("Technology Returns");
let mainTableRecords = await mainTable.selectRecordsAsync();

// Lookup table
let lookupTable = base.getTable("Warehouse Returns");
let lookupRangeRecords = await lookupTable.selectRecordsAsync();

// "WebSPOC" is the field whose values we want to look up
for (let record of mainTableRecords.records) {
    let lookupValue = record.getCellValue("WebSPOC");

    // Search the lookup table for a record with a matching "WebSPOC" value
    for (let rangeRecord of lookupRangeRecords.records) {
        if (rangeRecord.getCellValue("WebSPOC") === lookupValue) {
            // "Receiving hardware serial number" is the value to return on a match
            let returnValue = rangeRecord.getCellValue("Receiving hardware serial number");

            // Write the matched value back to the main table, one record at a time
            mainTable.updateRecordAsync(record, {
                "Receiving hardware serial number": returnValue,
            });
        }
    }
}

 

5 Replies

Using the JavaScript map() method to build your records object, and the Airtable API call updateRecordsAsync, you can update 50 records at a time instead of the one-at-a-time updates your script is currently doing, which I suspect will give you speed gains.

If you could provide a couple of screenshots that show both the Technology Returns and Warehouse Returns tables, and also the WebSPOC and "Receiving hardware serial number" fields (which I assume is a link field between the Tech and Ware tables?), I can perhaps write an example script for you to try.

It looks like you have nested loops and are updating records one at a time. I recommend two changes (a rough sketch of both follows this list):

- Update records in bulk. Note that using map() might not work for you as @Karlstens suggests, since you have an “if” and do not update every record.

- Build a hash object of values so that you do not have nested loops. Bill French has written some very good articles on the forums about how to do this. There are various other ways to speed up nested loops, but not having the nested loops to begin with is the way to go.
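
To make both suggestions concrete, here is a minimal sketch of what they might look like together, using the table and field names mentioned in this thread. It is not from either reply above, and it assumes that "Receiving hardware serial number" in Technology Returns accepts the same plain value the original script writes (i.e. it is not a linked-record field) and that one Warehouse Returns record per WebSPOC value is enough.

// Sketch only: build a hash (Map) keyed by WebSPOC, then update in batches of 50.
let mainTable = base.getTable("Technology Returns");
let lookupTable = base.getTable("Warehouse Returns");

let mainQuery = await mainTable.selectRecordsAsync({fields: ["WebSPOC"]});
let lookupQuery = await lookupTable.selectRecordsAsync({fields: ["WebSPOC", "Receiving hardware serial number"]});

// One pass over the lookup table instead of one pass per main-table record.
let serialByWebSPOC = new Map();
for (let r of lookupQuery.records) {
    serialByWebSPOC.set(r.getCellValue("WebSPOC"), r.getCellValue("Receiving hardware serial number"));
}

// Only records with a match get queued for an update, which handles the "if" case.
let updates = [];
for (let record of mainQuery.records) {
    let key = record.getCellValue("WebSPOC");
    if (key === null) continue; // skip blank WebSPOC values
    let match = serialByWebSPOC.get(key);
    if (match !== undefined) {
        updates.push({id: record.id, fields: {"Receiving hardware serial number": match}});
    }
}

// updateRecordsAsync accepts at most 50 records per call, so send the updates in batches.
while (updates.length > 0) {
    await mainTable.updateRecordsAsync(updates.splice(0, 50));
}

With ~3,500 records this does one scan of each table plus at most a few dozen batched update calls, rather than millions of comparisons and an individual update call per matching record.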

Thank you

You are correct. I have two tables in the base: Warehouse Returns and Technology Returns. I want to compare both tables on the WebSPOC field and, where there is a match, have the "Receiving hardware serial number" from the Warehouse Returns table feed into Technology Returns. Here are the screenshots.

 

[Screenshots: Screenshot 2022-12-16 at 7.00.42 AM.png, Screenshot 2022-12-16 at 7.01.01 AM.png]

@Karlstens 

Will you be able to write an example script for me to try, based on my last response? Thank you in advance!

I explored batch table updates a month or two ago - and although my example below isn't tied to your schema or data, it hopefully shows the method clearly enough that you can understand the process and perhaps fit it to your own scenario.

This example script for the Scripting Extension ultimately updates every single record in a table, specifically setting the "Status" single-select field to "Todo".

[Screenshot: Karlstens_0-1672174215215.png]

Running this script will take time - a few minutes, depending on how many records there are to process - and results in the Status field being updated for each record. If the same script is modified and loaded into an Automation script, it will max out at processing about 5,000 records due to the 30-second time limit.

// Gather all of the table's records (a base can hold up to 50,000).
let table = base.getTable("Table 1");
let query = await table.selectRecordsAsync({fields: ["Name", "Status"], sorts: [{field: "Name", direction: "asc"}]});
let records = query.records;

// The update that each record is to receive
let myData = records.map(element => ({
    id: element.id,
    name: element.name,
    fields: {
        "Status": {name: "Todo"}
    }
}));

// Divide the update data into batches of 50 records (due to the API hard limit).
let myBatch = [], size = 50;
while (myData.length > 0) myBatch.push(myData.splice(0, size));

// Stamp the batches with the updated Status into the table (again, 50 records per updateRecordsAsync()).
for (let i = 0; i < myBatch.length; i++) {
    await table.updateRecordsAsync(myBatch[i]);
}


As a simple example this script works, but I'm always interested in making refinements and improvements - so I'm keen to hear any ideas regarding batch record updating.

I suggest trying it out yourself in a simple base/table before attempting to retrofit it to your project. As your project contains link fields, it will take some extra time and thought to figure out how to apply this process.
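
For example (a hypothetical sketch, not part of the reply above): if "Receiving hardware serial number" in Technology Returns turns out to be a linked-record field rather than a text field, each update would need to write an array of record links instead of a plain value. The hash would then store the matching Warehouse Returns record's id rather than its cell value, along these lines:

// Hypothetical: matchRecordId is the id of the matching Warehouse Returns record,
// and "Receiving hardware serial number" is assumed to be a "Link to another record" field.
updates.push({
    id: record.id,
    fields: {
        "Receiving hardware serial number": [{id: matchRecordId}]
    }
});

If it is a plain text field, the single-value write shown earlier in the thread is the right shape.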