May 03, 2023 10:44 PM
Good morning team,
I’m trying to run an automation script on a base with 82,000 records. It was working fine, but since May 3, 2023 it has been throwing an error; this is the first time that has happened on my base.
For a bit of background, my Airtable base is also connected to Softr.
This is the script I developed:
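(The script itself didn’t carry over into this post. Judging from the accepted answer below, it presumably scanned the whole table to find a single record; the sketch below is an assumed reconstruction, with the table and field names taken from the answer.)

const { record_id } = input.config();
const sourceTable = base.getTable("Bulk Upload");

// Assumed reconstruction: this loads all ~82,000 records, including every
// field's cell values, just to find the single triggering record.
const query = await sourceTable.selectRecordsAsync();
const record = query.records.find(r => r.id === record_id);

const attachments = record.getCellValue("Attachments");
// ...the attachments were then copied into "Bulk Upload Extract",
// much as in the accepted answer below.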
Can someone help me fix this error?
Thank you in advance!
May 04, 2023 07:40 AM
Hey @Francoloco23!
Try this out:
const { record_id } = input.config();

const sourceTable = base.getTable("Bulk Upload");
const destinationTable = base.getTable("Bulk Upload Extract");

const selectedField = sourceTable.getField("Attachments");
const nameField = destinationTable.getField("File name");
const attachmentField = destinationTable.getField("Attachments");

// Fetch only the triggering record instead of querying the whole table.
let record = await sourceTable.selectRecordAsync(record_id);
let recordAttachments = record.getCellValue(selectedField);

if (recordAttachments) {
    // Build one destination record per attachment.
    let newRecords = recordAttachments.map(attachment => ({
        fields: {
            [nameField.id]: attachment.filename,
            [attachmentField.id]: [{ url: attachment.url }]
        }
    }));

    // createRecordsAsync accepts at most 50 records per call, so create in batches.
    while (newRecords.length > 0) {
        await destinationTable.createRecordsAsync(newRecords.slice(0, 50));
        newRecords = newRecords.slice(50);
    }
}
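For this to run, the automation’s script step needs an input variable named record_id (configured in the left-hand panel of the script editor) that passes in the triggering record’s ID, which your original setup presumably already has.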
The reason your script was exceeding the maximum execution time is that it was doing an indiscriminate query for all 82,000 records in your table. The number of records isn't a problem in itself. The problem is that you weren't just querying every record, you were also pulling every field's cell values, which is a massive amount of data to fetch just for you to filter out a gargantuan 99.9% of it.
You can optimize your script's performance by using the selectRecordAsync method instead of the selectRecordsAsync method.
Since you already have the record ID of the record you're looking for, there's no need to grab every record when you can simply query the one you want directly.
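As an aside (this goes beyond the fix above): if a script ever does need to scan many records, the scripting API lets you pass a fields option to selectRecordsAsync so that only the cell values you actually need get loaded. A minimal sketch, reusing the table and field objects from the script above:

// Load only the "Attachments" cell values instead of every field in the
// table; record IDs and primary-field names are always included.
const query = await sourceTable.selectRecordsAsync({
    fields: [selectedField]
});

for (const rec of query.records) {
    const attachments = rec.getCellValue(selectedField);
    // ...process only the data you asked for...
}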