Dec 15, 2022 10:21 PM
Hi All,
I’m trying to run an automation script that worked on a base with 500 records, but it now throws this error after I migrated some records and have a total of ~3,500.
Can someone help me fix this error?
Thank you in advance!
Dec 16, 2022 03:07 AM
Using the JavaScript map() method to build your record objects and the Airtable API call updateRecordsAsync, you can update 50 records at a time instead of the one-at-a-time approach your script currently takes, which I suspect will give you speed gains.
If you could provide a couple of screenshots that show both the Technology Returns and Warehouse Returns tables, and also the WebSPOC and "Receiving hardware serial number" fields (which I assume is a link field between Tech and Ware tables?) - I can perhaps write an example script for you to try.
Dec 16, 2022 05:30 AM
It looks like you have nested loops and are updating records one at a time. I recommend two changes:
- Update records in bulk. Note that using map() alone might not work for you as @Karlstens suggests, since you have an “if” and do not update every record - you would need to filter out the records that should be skipped.
- Build a hash object of values so that you do not have nested loops. Bill French has written some very good articles on the forums about how to do this. There are various other ways to speed up nested loops, but not having the nested loops to begin with is the way to go.
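To illustrate the hash-object idea in plain JavaScript: one pass over the first table builds a lookup keyed on the matching value, so the second table needs only a single pass instead of an inner loop. The field and value names below are illustrative stand-ins, not taken from the actual base:

```javascript
// Instead of: for each tech record { for each warehouse record { compare } }
// which is O(n * m), build a lookup once and probe it, which is O(n + m).
const warehouseReturns = [
  {webSPOC: "W-001", serial: "SN-AAA"},
  {webSPOC: "W-002", serial: "SN-BBB"},
];

// One pass builds the hash: WebSPOC value -> serial number.
const serialBySpoc = new Map(warehouseReturns.map(r => [r.webSPOC, r.serial]));

// Each lookup is now a constant-time probe - no inner loop required.
console.log(serialBySpoc.get("W-002")); // "SN-BBB"
console.log(serialBySpoc.has("W-404")); // false
```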
Dec 16, 2022 07:09 AM
Thank you
You are correct. I have two tables in the base: Warehouse Returns and Tech Returns. I want to compare the WebSPOC fields in both tables and, where they match, feed "Receiving hardware serial number" from the Warehouse Returns table into Technology Returns. Here are the screenshots.
Dec 27, 2022 08:15 AM
Would you be able to write an example script for me to try, based on my last response? Thank you in advance!
Dec 27, 2022 12:56 PM
I explored batch table updates a month or two ago, and although my example below isn't tied to your schema or data, it hopefully shows the method clearly enough that you can understand the process and adapt it to work in your own scenario.
In this example script for the Scripting Extension, every record in the table is updated - specifically, the "Status" single-select field is set to "Todo".
Running this script takes time - a few minutes, depending on how many records there are to process - after which the Status field has been updated for each record. If the same script is modified and loaded into an automation script, it will max out at around 5,000 records due to the 30-second execution time limit.
// Gather all records in the table (up to the 50,000-record base limit).
let table = base.getTable("Table 1");
let query = await table.selectRecordsAsync({
    fields: ["Name", "Status"],
    sorts: [{field: "Name", direction: "asc"}]
});
let records = query.records;

// Build the update that each record is to receive.
// updateRecordsAsync expects objects with just `id` and `fields`.
let myData = records.map(element => ({
    id: element.id,
    fields: {
        "Status": {name: "Todo"}
    }
}));

// Divide the updates into batches of 50 records (the API hard limit per call).
let myBatch = [], size = 50;
while (myData.length > 0) myBatch.push(myData.splice(0, size));

// Write each batch of 50 updates back to the table with updateRecordsAsync().
for (let i = 0; i < myBatch.length; i++) {
    await table.updateRecordsAsync(myBatch[i]);
}
As a simple example this script works, but I'm always interested in making refinements, efficiencies and improvements - so I'm keen to hear any ideas regarding batch record updating.
I suggest trying it out yourself in a simple base/table before attempting to retrofit it to your project. Since your project contains linked record fields, it will take extra time and thought to figure out how to apply this process.
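Putting the batching above together with the hash-lookup suggestion, a sketch for the WebSPOC scenario might look like the following. The table and field names are guesses from the screenshots, so adjust them to match the actual base; the Airtable-specific calls are shown as comments, and the matching/batching logic is plain JavaScript that can be checked on its own:

```javascript
// In the Scripting Extension you would first fetch both tables, roughly:
//   let wareTable = base.getTable("Warehouse Returns");
//   let techTable = base.getTable("Technology Returns");
//   let wareRecords = (await wareTable.selectRecordsAsync(
//       {fields: ["WebSPOC", "Receiving hardware serial number"]})).records;
//   let techRecords = (await techTable.selectRecordsAsync(
//       {fields: ["WebSPOC"]})).records;

// Build an {id, fields} update for every Tech record whose WebSPOC value
// appears in the lookup built from Warehouse Returns - no nested loops.
function buildUpdates(techRecords, serialBySpoc) {
  return techRecords
    .filter(r => serialBySpoc.has(r.webSPOC))
    .map(r => ({
      id: r.id,
      fields: {"Receiving hardware serial number": serialBySpoc.get(r.webSPOC)},
    }));
}

// Split the updates into batches of 50, the per-call API limit.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size));
  return out;
}

// Simplified stand-in data so the logic can be exercised outside Airtable:
const serialBySpoc = new Map([["W-001", "SN-AAA"], ["W-002", "SN-BBB"]]);
const techRecords = [
  {id: "recT1", webSPOC: "W-002"},
  {id: "recT2", webSPOC: "W-404"}, // no match, so no update is built
];

const batches = chunk(buildUpdates(techRecords, serialBySpoc), 50);
console.log(batches);

// Back in Airtable, you would finish with:
//   for (const batch of batches) await techTable.updateRecordsAsync(batch);
```

Only records with a matching WebSPOC produce an update, which addresses the "if" concern about map() raised earlier in the thread.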