
Re: Batch copy a field to another field in same table (10,000 records)

kirkbross
5 - Automation Enthusiast

I’m trying to simply copy one text field to another text field in a table with 10,000 records, in batches of 50 with the Scripting App.

What am I doing wrong in this block:

let table = base.getTable('TableName');
let query = await table.selectRecordsAsync();
let records = query.records;

const BATCH_SIZE = 50;

updateLotsOfRecords(records);

async function updateLotsOfRecords(records) {
  let i = 0;
  while (i < records.length) {
    const recordBatch = records.slice(i, i + BATCH_SIZE);
    for (let record of recordBatch) {
      let sourceValue = record.getCellValue('SourceField');

      await table.updateRecordAsync(record, {
        'DestinationField': sourceValue
      });
    }
    i += BATCH_SIZE;
  }
}
4 Replies

You have a loop nested within a loop and are calling updateRecordAsync (without an s), so you are not actually updating in batches — each record is still submitted one at a time.

I suggest reworking the code to remove the nested loop: map the records into an array of update objects first, then keep the while loop but submit each slice of 50 with updateRecordsAsync (with an s).
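A minimal sketch of that restructure (table and field names are assumed from the original snippet; the helpers are split out so the submission logic is easy to test):

```javascript
// Sketch only: 'SourceField' and 'DestinationField' are the names
// from the original snippet. updateRecordsAsync (plural) accepts up
// to 50 updates per call, so we batch accordingly.
const BATCH_SIZE = 50;

// Map every record into an {id, fields} update object up front —
// no per-record await, no nested update loop.
function buildUpdates(records) {
  return records.map(record => ({
    id: record.id,
    fields: {'DestinationField': record.getCellValue('SourceField')},
  }));
}

// Submit the prepared updates 50 at a time with the plural method.
async function updateInBatches(table, updates) {
  for (let i = 0; i < updates.length; i += BATCH_SIZE) {
    await table.updateRecordsAsync(updates.slice(i, i + BATCH_SIZE));
  }
}

// In the Scripting app you would wire it up like this:
// let table = base.getTable('TableName');
// let query = await table.selectRecordsAsync({fields: ['SourceField']});
// await updateInBatches(table, buildUpdates(query.records));
```

With 10,000 records this makes 200 API calls instead of 10,000, which is the whole point of the batching.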

I know you’re probably trying to automate a substantially more complicated process than this little snippet, but I find it kind of funny that this would be just a simple copy-paste job in the correct view for a user not spawned out of JavaScript. Thought it’s worth pointing out just in case you’re halfway through building a serverless Rube Goldberg machine lol.

Yes! I do love me a Rube Goldberg machine, but this is [eventually] to convert image URLs to base64 strings with an async conversion function hitting each record. However, I’m now realizing that the Scripting app can’t utilize a lot of things like FileReader() so I may have to do this via the API and just create a utility page on my Vue app.

I am still running into errors that are hard to debug. When you’re dealing with 10K+ records and 50K+ images of varying filetypes, some of which (like .heif) are hard to deal with, the run ends up failing somewhere along the way. Tedious.

This is just me spitballing, but assuming we’re still talking vanilla JavaScript (i.e., you’re using btoa()), maybe try the following approach:

use a formula field to pre-fetch the original attachment URLs, then load them into a buffer array using [].push(yourURLfield). From there, loop through the URLs with a for…of loop instead of your original approach:

for (let foo of bar.records) {
  // encode and return the result
}

As an added bonus, this would let you query for potentially problematic file formats/URLs before that critical portion of your runtime. As you probably know already, just being able to see whatever’s giving you trouble the way the interpreter sees it is often enough to force that ‘a-ha’ moment.
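Spelling that loop out a bit — hedged sketch only: `urls` stands in for your buffered URL array, and since FileReader() isn’t available, the byte-to-base64 step uses btoa() over a Uint8Array instead. Failures are collected rather than thrown, so one bad .heif doesn’t kill the whole run:

```javascript
// Convert raw bytes to a base64 string without FileReader, via btoa.
function bytesToBase64(bytes) {
  let binary = '';
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary);
}

// Fetch and encode each URL, collecting failures instead of dying
// mid-run, so problematic formats/URLs can be inspected afterwards.
async function encodeAll(urls) {
  const encoded = [];
  const failures = [];
  for (const url of urls) {
    try {
      const response = await fetch(url);
      if (!response.ok) throw new Error('HTTP ' + response.status);
      const buffer = await response.arrayBuffer();
      encoded.push({url, base64: bytesToBase64(new Uint8Array(buffer))});
    } catch (err) {
      failures.push({url, error: String(err)});
    }
  }
  return {encoded, failures};
}
```

Dumping `failures` at the end gives you exactly that “see what the interpreter sees” list of trouble files before committing to the full 50K-image run.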