

Posted (edited)

I have a monster of a script, about 35,000 lines. In one particular situation, memory starts increasing while a certain (complex) operation runs: it grows from roughly 150 MB to about 1,500 MB over the span of a few hours. As far as I'm aware, I'm not deliberately keeping the data I process in memory; after processing, I dump it to a file and move on to the next task. But there are so many moving parts that it's hard for me to pinpoint what's causing the memory growth.

So is there any way to check what variables in a program are growing, or where a potential leak is coming from?

Edited by lowbattery
Posted (edited)

There is no easy way to do exactly what you want, but you can quite easily find WHERE the leak is happening. Log the result of ProcessGetStats() at various strategic places (e.g. the beginning and end of a function, the beginning and end of a task — see the help file for details). In the example here, you can see that not closing the file handle makes the script grow rapidly in memory size. Instead of writing to the console, write to a log file with a timestamp and the location within the script, and you will find where the memory starts growing.

Example()

Func Example()
  Local $aMemory = ProcessGetStats(), $hFile
  ConsoleWrite("WorkingSetSize: " & $aMemory[0] & " / PeakWorkingSetSize: " & $aMemory[1] & @CRLF)
  For $i = 1 To 1000
    $hFile = FileOpen("Torus.jpg")
    ;FileClose($hFile) ; deliberately left commented out: each orphaned handle makes the working set grow
  Next
  $aMemory = ProcessGetStats()
  ConsoleWrite("WorkingSetSize: " & $aMemory[0] & " / PeakWorkingSetSize: " & $aMemory[1] & @CRLF)
EndFunc   ;==>Example

PS: see the Debug Management UDF in the help file for support with logging to a file.
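A minimal logging helper along those lines could look like this — a sketch only; the log-file name "memleak.log" and the line format are placeholders of my own choosing:

```autoit
; Sketch: call _LogMem("some location") at strategic points in the script.
; Writes a timestamped ProcessGetStats() snapshot to a log file.
Func _LogMem($sWhere)
    Local $aMem = ProcessGetStats() ; [0] = WorkingSetSize, [1] = PeakWorkingSetSize
    Local $hLog = FileOpen(@ScriptDir & "\memleak.log", 1) ; mode 1 = append
    FileWriteLine($hLog, @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN & ":" & @SEC & _
            " | " & $sWhere & " | WS: " & $aMem[0] & " | Peak: " & $aMem[1])
    FileClose($hLog) ; close the handle ourselves -- the very thing we are hunting for
EndFunc   ;==>_LogMem
```

Diffing consecutive lines in the log then shows between which two call sites the working set jumps.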

Edited by Nine
Posted (edited)

Look at the Details tab in Task Manager. If they are not visible already, you can add the GDI objects and Handles columns for the process. Both are notorious (at least for me) when it comes to memory leaks. If either counter keeps climbing, you have a first indication of where to look (GDI resources not cleaned up properly, or orphaned file or pipe handles).
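You can also read those same two counters from inside the script. A hedged sketch using plain DllCall: GetGuiResources and GetProcessHandleCount are documented Win32 APIs, but the wrapper name _ResourceCounts is made up for this example:

```autoit
; Sketch: query the current process's GDI-object and handle counts via Win32.
Func _ResourceCounts()
    Local $aProc = DllCall("kernel32.dll", "handle", "GetCurrentProcess")
    Local $hProc = $aProc[0] ; pseudo-handle for the current process
    Local $aGdi = DllCall("user32.dll", "dword", "GetGuiResources", "handle", $hProc, "dword", 0) ; 0 = GR_GDIOBJECTS
    Local $aHnd = DllCall("kernel32.dll", "bool", "GetProcessHandleCount", "handle", $hProc, "dword*", 0)
    ConsoleWrite("GDI objects: " & $aGdi[0] & "  Handles: " & $aHnd[2] & @CRLF) ; out-param is element [2]
EndFunc   ;==>_ResourceCounts
```

Logging these alongside ProcessGetStats() tells you whether the growth is raw memory, leaked handles, or leaked GDI resources.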

I also trace such leaks by first commenting out the time-consuming parts; if memory usage still goes up fast after that, I comment out further large sections to narrow the search down.

Edited by KaFu
Posted

Two points that I have stumbled over several times are: 1) recursive nested file searches (FileFindFirstFile, FileFindNextFile), where I forgot to close the search handle from Local $s = FileFindFirstFile(...) with FileClose($s) before leaving the loop once a directory level was done, and 2) reading the output of command-line tools, where I missed closing the stream:

$pid = Run("C:\temp\myprogram.exe", @TempDir, @SW_HIDE, $STDERR_MERGED)

...

StdioClose($pid)
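A sketch of the first pitfall and its fix — the path "C:\temp" is only a placeholder:

```autoit
; Sketch: every FileFindFirstFile() search handle must be released with FileClose(),
; otherwise each directory level of a recursive scan leaks one handle.
Local $hSearch = FileFindFirstFile("C:\temp\*.*")
If $hSearch <> -1 Then ; -1 means no matching files / error
    While 1
        Local $sFile = FileFindNextFile($hSearch)
        If @error Then ExitLoop ; no more entries at this level
        ; ... process $sFile, recurse into subdirectories if needed ...
    WEnd
    FileClose($hSearch) ; the easily forgotten part: release the search handle
EndIf
```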

