
512MB memory limit in an automation script

Topic Labels: API
Francoloco23
5 - Automation Enthusiast

Hi,

I am trying to run the script shown in my screenshots, and I am getting the error “Your script exceeded the 512MB memory limit.”

What am I doing wrong and how can I fix this? Thanks!

 

3 Replies

How much data are you trying to fetch? If the data you need to fetch approaches the limit, can you fetch less data?
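As a minimal sketch of what "fetch less data" can look like (this assumes part of the memory is going to records loaded from a table, which the screenshots do not confirm, and the table and field names below are placeholders):

// Load only the fields the script actually reads, instead of every field.
// "Tasks" and "Amount" are placeholder names; substitute your own.
let table = base.getTable("Tasks");
let query = await table.selectRecordsAsync({
    fields: ["Amount"], // omitting unused fields keeps the query small
});

let total = 0;
for (let record of query.records) {
    total += record.getCellValue("Amount") ?? 0;
}
output.set("total", total);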

What is your experience level with coding and how did you get this script? How sure are you that this memory issue is the only issue?

There are a lot of wavy red underlines in your script, which indicate potential problems with the code. I recommend reworking the script until it no longer has so many of those warnings.

There are various techniques for managing memory usage in a script, but the current script is not written in a way that makes it easy to say which techniques apply.

If you have budget and just want a working script, I suggest hiring an experienced script writer. If you want to learn to code yourself, I suggest starting with simpler projects.

Hello @Francoloco23,
As @kuovonne suggests, improve your code so that it stays within the memory and time limits.

For now I have no specific suggestions, because this is not a bug in the platform; it is a limit the script has to work within.
This usually happens when you make multiple API requests in a single automation script block, because each request/response takes time, sometimes more than expected. So cross-check those requests first.
Then define a timeout for them. After that, you can divide those multiple requests into multiple scripts.
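
To illustrate the "divide into multiple scripts" part, here is a rough sketch of one script step that handles only part of the work and passes a compact summary forward; the URLs, variable names, and the row-count aggregation are all invented for illustration, and it assumes a later step reads the value through input.config():

// Script step 1 (sketch): fetch only the first batch of URLs.
let firstBatch = [
    "https://example.com/jan.csv", // placeholder URLs
    "https://example.com/feb.csv",
];

let partialRowCount = 0;
for (let url of firstBatch) {
    let response = await fetch(url);
    let text = await response.text();
    partialRowCount += text.split("\n").length - 1; // crude count, excluding header
}

// Pass a small summary forward instead of the raw CSV text.
output.set("partialRowCount", partialRowCount);

// A second script step would fetch the remaining URLs and start with:
//   let { partialRowCount } = input.config();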

sflorez
5 - Automation Enthusiast

Hey @Francoloco23,

The error that you are experiencing is due to automations having a memory limit, meaning there is a fixed amount of data the automation can "hold" per instance of a script step. 

In this case it seems that the automation you are attempting to run is loading CSV files via network requests, parsing them, and then aggregating the data to set some compiled output. The problem is that, depending on the size of the CSV files coming back from those requests and how many files you are requesting, you can easily blow past the memory constraint of a single script run.

My suggestion would be to determine the average size of the CSV files you will be pulling over the wire. That will let you more accurately batch the requests you need to make into multiple script runs, as @dilipborad suggested. Hope this points you in the right direction.
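
To make that concrete, here is a hedged sketch of one way to keep peak memory close to the size of a single file: parse each CSV as soon as it arrives, fold it into a small running aggregate, and let the raw text fall out of scope before the next request. The URL list, the column index, and the aggregation itself are placeholders, not the original script's logic.

// Sketch: aggregate incrementally instead of keeping every parsed CSV in memory.
let csvUrls = [
    "https://example.com/report-1.csv", // placeholder URLs
    "https://example.com/report-2.csv",
];

let totalRows = 0;
let totalAmount = 0;

for (let url of csvUrls) {
    let response = await fetch(url);
    let text = await response.text();

    // Deliberately naive CSV parsing, for illustration only.
    let rows = text.trim().split("\n").slice(1); // skip the header row
    for (let row of rows) {
        let columns = row.split(",");
        totalRows += 1;
        totalAmount += Number(columns[2]) || 0; // column index is a placeholder
    }
    // Only the running totals survive each iteration; `text` and `rows`
    // can be garbage-collected before the next fetch.
}

output.set("totalRows", totalRows);
output.set("totalAmount", totalAmount);

If even a single file approaches the limit, splitting the URL list across several script runs, as discussed above, is the remaining option.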

Sebastian at Stradia Partners