Jul 03, 2024 08:32 AM
// Set or clear the "Automations" checkbox on the given records,
// writing the updates in chunks to stay under the per-call limit
async function setAutomationFlags(table, records, flagValue) {
    const automationFlags = records.map(record => ({
        id: record.id,
        fields: { "Automations": flagValue }
    }));
    await updateRecordsInChunks(table, automationFlags);
}
const chunkArray = function (arrayToChunk, chunkSize = 50) {
    const arraysOfChunks = [];
    for (let i = 0; i < arrayToChunk.length; i += chunkSize) {
        arraysOfChunks.push(arrayToChunk.slice(i, i + chunkSize));
    }
    return arraysOfChunks;
};
const updateRecordsInChunks = async function (table, records) {
    const arrayOfChunks = chunkArray(records);
    for (const chunkOfRecords of arrayOfChunks) {
        await table.updateRecordsAsync(chunkOfRecords);
        // console.info(`🔁 Updated ${chunkOfRecords.length} record(s) in the '${table.name}' table`)
    }
};
await setAutomationFlags(someTable, someRecords, true)
await setAutomationFlags(someTable, someRecords, null)
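For context, these flags are meant to be used together with a view that filters on the "Automations" checkbox. A rough sketch of that read step is below; the view name and field list are placeholders, not the actual names in my base:

// Sketch: after setting the flags, read back just the flagged records from a filtered view
const flaggedView = someTable.getView("Flagged for automation");   // placeholder view name
const flaggedQuery = await flaggedView.selectRecordsAsync({ fields: ["Automations"] });
const flaggedRecords = flaggedQuery.records;
// ...do the actual work on flaggedRecords, then clear the flags with setAutomationFlags(..., null)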
Jul 03, 2024 09:47 AM
Thank you for sharing your workaround. It is interesting to see different solutions to common problems.
Are you using these scripts in an automation script, scripting extension, or both? If you are using this in scripting automation, does the checkbox trigger some other automation, or is the filtered view used in a later action within the same automation?
Although it isn't possible to filter records in the selectRecordsAsync() query, it is possible to filter the returned records from the query result using JavaScript. Can you explain why you chose to use a view instead of filtering records in JavaScript?
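For example, filtering in JavaScript after the query might look something like this (the table, field, and condition here are just placeholders to illustrate what I mean):

const query = await someTable.selectRecordsAsync({ fields: ["Status"] });
// Keep only the records that match the condition, with no checkbox field or view involved
const matching = query.records.filter(record => record.getCellValueAsString("Status") === "Ready");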
It looks like your system takes an array of record IDs as an input. Is there a reason why you do not specify those record IDs in your selectRecordsAsync() call? If you have fewer than 100 record IDs, you can get all the records in one call. If you have more than 100 record IDs, you can get them in batches of 100. (Although if I expect anything close to 100, I tend to just query the entire table and then filter down to the records I want, unless I am worried about other limitations.)
How do you make sure that different scripts running at the same time do not have conflicts in setting/clearing the checkbox field? Do you use a different checkbox field for each automation?
I notice that your script does not clear any checkboxes that may already be selected. Is this why you recommend filtering the records from the view again?
How fast does this system run compared to other methods? I would expect that filtering in JavaScript would be faster than updating records and then querying a view.
My gut feeling is that this system is more complex than I would want to maintain, but since you designed the system, maybe that is not an issue for you.
Jul 03, 2024 10:12 AM
@kuovonne so many questions 😀. The primary reason I wasn't using this option in the selectRecordsAsync() call is that I didn't realize it existed. I now see it in the documentation; it's just not shown in the example code. Part of why I posted this was to see whether there were other options. I did find that this method works well and is much faster than not pre-filtering, and there are multiple cases where we need more than 100 records, though rarely more than a few hundred. I'm going to play around with this and see which offers better performance. Perhaps I can make an async function that uses promises and slices the array of record IDs into batches of 100, combining the results into one array. Looks like I have a small side project for the weekend.
To your question about ensuring we get the records we're looking for: I map the records that are returned and filter them on the various conditions to make sure they're just the records I need.
The speed gain isn't so much from filtering the records via JavaScript; it's more about avoiding API timeouts when we have a table with 10k+ records.
Thanks for taking the time to review and giving me a different idea on how to approach this!
Jul 03, 2024 03:25 PM
I made the modifications to selectRecordsAsync() using recordIds as well as promises, and it looks like it's reducing the execution time by about 50%! Here are the helper functions I'm using now:
// Helper function from https://stackoverflow.com/questions/8495687/split-array-into-chunks
const chunkArray = (arrayToChunk, chunkSize = 50) => {
    const arraysOfChunks = [];
    for (let i = 0; i < arrayToChunk.length; i += chunkSize) {
        arraysOfChunks.push(arrayToChunk.slice(i, i + chunkSize));
    }
    return arraysOfChunks;
};

// Helper functions that take a table reference and an array of un-chunked records
// and create/update/delete the records in chunks
const createRecordsInChunks = async (table, records) => {
    const arrayOfChunks = chunkArray(records);
    const promises = arrayOfChunks.map(chunk => table.createRecordsAsync(chunk));
    await Promise.all(promises);
};

const updateRecordsInChunks = async (table, records) => {
    const arrayOfChunks = chunkArray(records);
    const promises = arrayOfChunks.map(chunk => table.updateRecordsAsync(chunk));
    await Promise.all(promises);
};

const deleteRecordsInChunks = async (table, records) => {
    const arrayOfChunks = chunkArray(records);
    const promises = arrayOfChunks.map(chunk => table.deleteRecordsAsync(chunk));
    await Promise.all(promises);
};

// Fetch records in chunks of up to 100 record IDs per selectRecordsAsync() call
const fetchRecords = async (table, recordIds, fields) => {
    const arrayOfChunks = chunkArray(recordIds, 100);
    const promises = arrayOfChunks.map(chunk =>
        table.selectRecordsAsync({
            fields,
            recordIds: chunk,
        })
    );
    const results = await Promise.all(promises);
    return results.flatMap(result => result.records);
};
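For completeness, here's roughly how I'd wire these together; the table name, record ID, and field values below are placeholders, not my actual schema:

// Hypothetical usage: fetch specific records by ID, then update them in chunks
const table = base.getTable("Tasks");              // placeholder table name
const recordIds = ["recXXXXXXXXXXXXXX"];           // record IDs gathered elsewhere (placeholder)
const records = await fetchRecords(table, recordIds, ["Status"]);
const updates = records.map(record => ({
    id: record.id,
    fields: { "Status": "Done" },                  // placeholder field and value
}));
await updateRecordsInChunks(table, updates);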