Help

Re: A loop in a filter in another loop

Peter_Borg
6 - Interface Innovator

I got a script to work, am quite pleased with myself, and thought I’d share it. I was also guided by some other posts here (in particular this one), though I didn’t fully understand it. I haven’t coded for many years, and never in JavaScript.

This script runs through a list of job bookings at various properties (for cleaning services, in this instance). For each booking, it is useful to know when the next booking for that property is. The script runs through all bookings in the scheduling window (limited by the view). For each job it finds all other jobs at that property, then runs through those jobs in order to find the very next one, if it exists. If so, it links the current booking to the next booking.

All this is needed because bookings come in in no particular order, there are often lots of cancellations and adjustments before the job gets done, and the actual cleaning is often scheduled long after the bookings come in and can be done any time between one job and the next.

The script takes a while to run (maybe because it is a dumb design?) but is worth it.

let table = base.getTable("Jobs");
let view = table.getView("2. Schedule Jobs");
let property = table.getField('Property String');
let start = table.getField('Scheduled Start');
let result = await view.selectRecordsAsync({
    sorts: [{field: "Scheduled Start", direction: "asc"}]
});

for (let record1 of result.records) {
    await table.updateRecordAsync(record1, {'Next Job': null});
    let match = record1.getCellValue(property);
    let current = record1.getCellValue(start);
    let records2 = result.records.filter(record => record.getCellValue(property) === match);
    for (let record2 of records2) {
        let next = record2.getCellValue(start);
        if (next > current) {
            await table.updateRecordAsync(record1, {'Next Job': [{id: record2.id}]});
            break;
        }
    }
//    output.text(match);
} 
output.text('Updates Completed');
5 Replies

This is great!

Two things come to mind that could [possibly] make it faster:

  1. Updating a single record at a time is probably not ideal.
  2. Instead of looping through records2 inside the main loop, consider creating a JSON hash index of all the records2 items and using an indirect methodology of updating record1.

#1 will almost certainly reduce the update time significantly, and #2 may make it simpler and a little faster still. Both are good learning exercises to advance your skill set. 😉
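A minimal sketch of what suggestion #2 could look like, using plain JavaScript objects as stand-ins for Airtable records (the field names and record shapes here are hypothetical, not the actual schema): index the jobs by property once, sorted by start date, so each job’s “next job” is simply its successor in the sorted list, and collect all the updates for a single batched call instead of re-filtering the whole record list for every job.

```javascript
// Hypothetical stand-ins for Airtable records; in a real script these
// would come from view.selectRecordsAsync() and getCellValue().
const jobs = [
  { id: "rec1", property: "A", start: "2023-01-01" },
  { id: "rec2", property: "A", start: "2023-01-05" },
  { id: "rec3", property: "B", start: "2023-01-02" },
];

// Build the index: property -> jobs at that property, sorted by start date.
const byProperty = {};
for (const job of jobs) {
  (byProperty[job.property] = byProperty[job.property] || []).push(job);
}
for (const list of Object.values(byProperty)) {
  list.sort((a, b) => a.start.localeCompare(b.start));
}

// Each job's "next job" is its successor in its property's sorted list.
// Collect update payloads rather than updating one record at a time.
const updates = [];
for (const list of Object.values(byProperty)) {
  list.forEach((job, i) => {
    const next = list[i + 1] || null;
    updates.push({
      id: job.id,
      fields: { "Next Job": next ? [{ id: next.id }] : null },
    });
  });
}
// `updates` can then be handed to a batched updateRecordsAsync() call.
```

This replaces the quadratic inner loop with one pass to build the index and one pass to read successors, and defers all writes to a batch at the end.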

Thanks @Bill.French still plenty to learn at this end. Appreciate the feedback.

Hi @Bill.French do you have a tutorial of doing this with JSON?

@Rose_Haft1,

Here’s an example of what it would look like to refactor @Peter_Borg’s initial update inside his loop so that the records are collected as JSON objects and then updated as a batch (rather than one at a time) outside of the loop:

const recordsToNullJobs = result.records.map(record => {
  return {
    id: record.id,
    fields: {
      "Next Job": null
    }
  }
})

const recordsWithJobsNulled = await batchAnd('Update', table, recordsToNullJobs)

/*
    Use this function to perform 'Update', 'Create', or 'Delete'
    async actions on batches of records that could potentially
    contain more than 50 records.

    ::PARAMETERS::
    action = string; one of 3 values:
           - 'Update' to call table.updateRecordsAsync()
           - 'Create' to call table.createRecordsAsync()
           - 'Delete' to call table.deleteRecordsAsync()

    table = Table; the table the action will be performed in

    records = Array; the records to perform the action on
            - Ensure the record objects inside the array are
            formatted properly for the action you wish to
            perform

    ::RETURNS::
    recordsActedOn = integer, array of recordId's, or null; 
                   - Update Success: integer; the number of records processed by the function
                   - Delete Success: integer; the number of records processed by the function
                   - Create Success: array; the id strings of records created by the function
                   - Failure: null;
*/
async function batchAnd(action, table, records) {
  let recordsActedOn;

  switch (action) {
      case 'Update':
          recordsActedOn = records.length;
          while (records.length > 0) {
              await table.updateRecordsAsync(records.slice(0, 50));
              records = records.slice(50);
          }
          break;
      
      case 'Create':
          recordsActedOn = [];
          while (records.length > 0) {
              let recordIds = await table.createRecordsAsync(records.slice(0, 50));
              recordsActedOn.push(...recordIds)
              records = records.slice(50);
          }
          break;

      case 'Delete':
          recordsActedOn = records.length;
          while (records.length > 0) {
              await table.deleteRecordsAsync(records.slice(0, 50));
              records = records.slice(50);
          }
          break;

      default:
          output.markdown(`**Please use either 'Update', 'Create', or 'Delete' as the "action" parameter for the "batchAnd()" function.**`);
          recordsActedOn = null;
  }
  return recordsActedOn;
}

result.records is coming from line 5 of his script, and this version replaces what is being done on line 10 of his script.

The Array.map() function transforms each record into a JSON object representing a record to be updated with null for “Next Job”. The return statement returns each new JSON object into the array called recordsToNullJobs, so this new array is a package of JSON objects representing records that will be updated in the batchAnd('Update'...) call, formatted the way Airtable’s API wants them for the Table.updateRecordsAsync() function.
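For anyone following along, here is a self-contained sketch of the 50-record chunking pattern that batchAnd() relies on, with a plain array standing in for Airtable records (the helper name chunkInto is made up for illustration; Airtable’s updateRecordsAsync() accepts at most 50 records per call, which is why the batching is needed):

```javascript
// Split an array into chunks of at most `size` elements, mimicking the
// slice(0, 50) / slice(50) loop inside batchAnd(). Each chunk would be
// passed to one table.updateRecordsAsync() call in a real script.
function chunkInto(records, size = 50) {
  const batches = [];
  while (records.length > 0) {
    batches.push(records.slice(0, size));
    records = records.slice(size); // rebinds the local, caller's array is untouched
  }
  return batches;
}

// 120 fake records -> three batches of 50, 50, and 20.
const fakeRecords = Array.from({ length: 120 }, (_, i) => ({ id: `rec${i}` }));
const batches = chunkInto(fakeRecords);
```

Since slice() returns new arrays, the caller’s original array is never mutated; only the function’s local reference is reassigned as the loop walks through the chunks.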

Rose_Haft1
8 - Airtable Astronomer

Thank you. Will be trying to finish up and then diving a bit more into this. Appreciate it!