writeDump - causes memory issues / exceptions

I have a small AWS instance for testing that has only 1 GB RAM.
When I attempt to dump a specific (deeply nested) object I get the following error:
GC overhead limit exceeded

I have tried all the formats, setting expand to false, etc.

When I tried to use serializeJSON() instead, I get
Java heap space

The app works fine - it is only an issue when I am trying to dump the object.
So my question is:

Is there any possibility of something at a higher level (than the JVM throwing an error), like:

  • Can it be determined, in advance of the dump, whether or not it will cause a memory issue (a rough sketch of the kind of check I mean is below)?

and, if so, produce a “nicer” error, such as:

  • There is XXX HEAP / XXX STACK available to the JVM.
  • You have asked to dump something that is XX: it is too big.
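For illustration only (this is my own sketch, not an existing Lucee feature), the kind of up-front check I have in mind would be something along these lines, using java.lang.Runtime to see how much heap is left before attempting the dump:

<cfscript>
// Rough sketch: ask the JVM how much heap is currently available
runtime    = createObject("java", "java.lang.Runtime").getRuntime();
maxHeapMB  = runtime.maxMemory() / 1024 / 1024;                         // the -Xmx ceiling
usedHeapMB = (runtime.totalMemory() - runtime.freeMemory()) / 1024 / 1024;
freeHeapMB = maxHeapMB - usedHeapMB;
writeOutput("Roughly #numberFormat(freeHeapMB)# MB of heap free");
</cfscript>

The hard part, of course, is knowing in advance how big the dump itself would be.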

Additionally - is there a better way to create dumps that are not so resource-hungry?

Is there a better way for me to obtain a dump, other than writeDump() / dump()?

Thanks!

You could try to allocate more memory to the JVM and/or your VM.
By default, if you just blow through the installer, I believe the max heap is 512 MB.
The Java args are:
-Xms  sets the initial Java heap size
-Xmx  sets the maximum Java heap size
so -Xms800M -Xmx800M would yield 800 MB of memory allocated to Java.
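For example, assuming the Lucee installer's usual Tomcat setup, these args typically go in something like tomcat/bin/setenv.sh (the exact file and variable name depend on your install):

CATALINA_OPTS="$CATALINA_OPTS -Xms800M -Xmx800M"

Restart Lucee/Tomcat afterwards so the new heap settings take effect.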

As for DUMPING, I still use something I ganked from a blog post by @bennadel a long time ago, and it just works.

<cfoutput>
<!--- dump each of the common scopes, skipping any that are not defined --->
<cfloop
	list="application,session,variables,client,url,form,request,server,cgi"
	index="i">
	<cfif isDefined(i)>
		<cfdump var="#evaluate(i)#" label="#i#">
	</cfif>
</cfloop>

<table>
<tr>
	<th>Variable Name</th>
	<th>Value</th>
</tr>
<!--- loop over the Form structure and output all
      of the variable names and their associated
      values --->
<cfloop collection="#form#" item="VarName">

	<tr>
		<td>#VarName#</td>
		<td>#form[VarName]#</td>
	</tr>

</cfloop>
</table>
</cfoutput>

Hi Terry,

Increasing the size of the heap is not the answer.
On my local machine I have 32 GB of RAM and have loaded Lucee with 4 GB of heap, and it still bombs out.

How big is the object? What I sometimes do to get a copy of an object: I serialize it to JSON and write it into a .txt/.json file. It won’t magically write down the data types, but I use it as an alternative to cfdump to see objects/keys/values in a raw manner. You can also browse those files nicely in a browser, or re-import them into a CFML struct.
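A minimal sketch of what I mean (myBigObject and the file path are just placeholders):

<cfscript>
// write the object out as JSON instead of dumping it to the page
fileWrite(expandPath("./myBigObject.json"), serializeJSON(myBigObject));

// ...and later, read it back into a CFML struct if you need it again
data = deserializeJSON(fileRead(expandPath("./myBigObject.json")));
</cfscript>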


Details help.

In the Tomcat / Java world, resources matter. Memory wins out over processing power.
You could try appending -XX:+UseParallelGC to your Lucee Java start command; this, however, only works about a third of the time.

You could try to force your application to run, albeit ultra slow, with:
-XX:MaxGCPauseMillis=20000
Sets a target of at most 20000 milliseconds per garbage-collection pause

-XX:GCTimeRatio=96
Sets the ratio of time spent running the application vs. doing garbage collection; with 96 the goal is roughly 1 / (1 + 96), i.e. about 1 percent of total time on garbage cleanup

-XX:ParallelGCThreads=12 (the number of cores times the number of processors)
So if you have 20 cores and 4 processors, this would be set to 80

-XX:ConcGCThreads=3 (how many threads to use for the concurrent garbage-collection work)

-XX:InitiatingHeapOccupancyPercent=10
Default 45; this is the heap-occupancy percentage at which the garbage collector kicks in and starts processing. Let's do it sooner rather than later

-XX:G1NewSizePercent=10
Default 5; when the Java heap is initialized, slice out at least 10 percent of it for the young generation

-XX:G1MaxNewSizePercent=90
Default 60; the maximum percentage of the heap the young generation is allowed to take up

-XX:G1OldCSetRegionThresholdPercent=30
Default 10; an upper limit on how much of the old generation the GC will try to clean up during a mixed collection

-XX:G1ReservePercent=2
Default 10; what percentage of the heap the garbage collector should always keep in reserve

You may need to tweak the heck out of the time and pause values, but it will run, albeit slowly.
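For reference, here is a sketch of the above collected into one place, again assuming a Tomcat-style setenv.sh (adjust for however you actually start Lucee). Note that the G1* flags only apply when the G1 collector is in use (the JVM default since Java 9), so they are shown here without -XX:+UseParallelGC, and several of them are experimental options that must be unlocked first:

CATALINA_OPTS="$CATALINA_OPTS \
  -XX:+UnlockExperimentalVMOptions \
  -XX:MaxGCPauseMillis=20000 \
  -XX:GCTimeRatio=96 \
  -XX:ParallelGCThreads=12 \
  -XX:ConcGCThreads=3 \
  -XX:InitiatingHeapOccupancyPercent=10 \
  -XX:G1NewSizePercent=10 \
  -XX:G1MaxNewSizePercent=90 \
  -XX:G1OldCSetRegionThresholdPercent=30 \
  -XX:G1ReservePercent=2"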