
Leaderboard

Popular Content

Showing content with the highest reputation on 09/08/2022 in all areas

  1. For sure, the GuiCtrlSetData is slowing your loop to a crawl! Here is one way to sum one of the CSV columns for each retailer, using a dictionary object to store and look up the sums. This took between 90 and 160 seconds on my humble laptop to process my test file with 5 million lines. You'll most likely need to adjust the columns for your data, but hopefully this gets you going in the right direction. If you want to do something different with your data, let us know and we'll see what we can come up with.

    #include <FileConstants.au3>
    #include <MsgBoxConstants.au3>
    #include <WinAPIFiles.au3>
    #include <Array.au3>

    Example()

    Func Example()
        Local $timer = TimerInit()
        ; Create a constant variable in Local scope of the filepath that will be read/written to.
        Local Const $sFilePath = "testFile.csv"

    ;~  Local $sStr = "", $j = 1
    ;~  For $j = 1 To 5
    ;~      For $i = 0 To 1000000
    ;~          $sStr &= "wholesaler" & $j & ",retailer" & $j & ",cat1,model1,saledate,transdate," & $j & @CRLF
    ;~      Next
    ;~  Next
    ;~  ; Create a temporary file to read data from.
    ;~  If Not FileWrite($sFilePath, $sStr) Then
    ;~      MsgBox($MB_SYSTEMMODAL, "", "An error occurred whilst writing the temporary file.")
    ;~      Return False
    ;~  EndIf
    ;~  ConsoleWrite(TimerDiff($timer) & @CRLF)
    ;~  $timer = TimerInit()

        ; Open the file for reading and store the handle to a variable.
        Local $hFileOpen = FileOpen($sFilePath, $FO_READ)
        If $hFileOpen = -1 Then
            MsgBox($MB_SYSTEMMODAL, "", "An error occurred when reading the file.")
            Return False
        EndIf

        ; Create dictionary object to hold the sums
        Local $oRetailerSums = ObjCreate("Scripting.Dictionary")

        Local $sFileLine, $aLineParts, $i = 0
        ; Loop through the file lines
        While 1
            ; Read the next line
            $sFileLine = FileReadLine($hFileOpen)
            If @error Then ExitLoop

            ; Process this line
            $aLineParts = StringSplit($sFileLine, ",")
            If @error Then ContinueLoop

            ; Add this value to the matching retailer sum
            $oRetailerSums.Item($aLineParts[2]) = $oRetailerSums.Item($aLineParts[2]) + $aLineParts[7]

            ; Display a line after every 100,000 lines to indicate progress
            $i += 1
            If Mod($i, 100000) = 0 Then
                ConsoleWrite("Line " & $i & " " & $sFileLine & @CRLF)
            EndIf
        WEnd
        ConsoleWrite(TimerDiff($timer) & @CRLF)

        ; Close the handle returned by FileOpen.
        FileClose($hFileOpen)

        ; Create array of retailer/value pairs for display
        Local $aRetailersFound = $oRetailerSums.Keys()
        Local $aRetailerSums[UBound($aRetailersFound)][2]
        For $i = 0 To UBound($aRetailersFound) - 1
            $aRetailerSums[$i][0] = $aRetailersFound[$i]
            $aRetailerSums[$i][1] = $oRetailerSums.Item($aRetailersFound[$i])
        Next
        _ArrayDisplay($aRetailerSums)
    EndFunc   ;==>Example
    2 points
  2. Well, there are a lot of deeper answers if you google it, but basically 32-bit applications have 2 GB of memory available to them, while 64-bit applications have far more (8 GB, if I remember correctly). So is it a pretty solution? No. It's a workaround that will stop working again should you ever allocate more than the 64-bit limit. But if your application never exceeds that limit and you don't wish to waste time optimizing when it isn't needed, then compiling as 64-bit is the solution.
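    As a minimal sketch of how this looks in practice (assuming you compile with AutoIt3Wrapper/SciTE, where a 64-bit build is requested by a single directive):

    ```autoit
    ; Request a 64-bit executable when compiling (AutoIt3Wrapper directive)
    #AutoIt3Wrapper_UseX64=y

    ; At run time, the @AutoItX64 macro reports which flavor is actually running
    ConsoleWrite("Running as 64-bit: " & @AutoItX64 & @CRLF)
    ```

    Note the directive only takes effect when the script is compiled; running the .au3 directly uses whichever interpreter (x86 or x64) launches it.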
    1 point
  3. Subz

    Version counter

    FileGetVersion(@ScriptFullPath)
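    A minimal usage sketch (assuming the compiled script carries version resource information; FileGetVersion returns "0.0.0.0" and sets @error on failure):

    ```autoit
    ; Read the version resource of the running script/executable and display it
    Local $sVersion = FileGetVersion(@ScriptFullPath)
    ConsoleWrite("Current version: " & $sVersion & @CRLF)
    ```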
    1 point
  4. Subz

    Version counter

    In SciTE, click Tools > Compile, select the Resource Update tab, then tick FileVersion Auto Increment.
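    The same setting can also be written as AutoIt3Wrapper directives at the top of the script (a sketch; the starting version string here is just an example):

    ```autoit
    ; Seed version number, bumped automatically on each compile
    #AutoIt3Wrapper_Res_Fileversion=1.0.0.0
    #AutoIt3Wrapper_Res_Fileversion_AutoIncrement=y
    ```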
    1 point
  5. This is already fixed in the latest available Beta, v21.316.1639.14. Not sure about this one, but I don't get this issue in the latest beta. Did you try that? Which version are you using?
    1 point
  6. I poked around the Wayback Machine and found the following:

    AutoIt v2.42
    AutoIt v2.51
    AutoIt v2.60
    AutoIt v2.61
    AutoIt v2.62
    AutoIt v2.63

    It took some remembering that AutoIt was hosted under a different domain in the old days. If there was another name prior to 2000, Wayback might be able to help again.
    1 point
  7. Hi @baconaise. Have you tried running your script as a 64-bit application?
    1 point
  8. You can find the limits for AutoIt e.g. here: Limits/Defaults. Excerpt: maximum number of elements for an array = 16,777,216. Your problem is apparently the default limit of 2,147,483,647 bytes of RAM per script. @kurtykurtyboy's suggestion to process the .csv file line by line is good, assuming that this is what you need. Another option would be to read the data into an SQLite database and process it with the respective SQL statements.
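    A minimal sketch of the SQLite route using AutoIt's bundled SQLite UDF (table and column names are examples only; the CSV import step is elided and would be batched INSERTs inside a transaction):

    ```autoit
    #include <SQLite.au3>
    #include <Array.au3>

    ; Start the SQLite engine and open (or create) a database file
    _SQLite_Startup()
    Local $hDB = _SQLite_Open("sales.db")

    ; Example schema matching the CSV columns discussed above
    _SQLite_Exec($hDB, "CREATE TABLE IF NOT EXISTS sales (wholesaler TEXT, retailer TEXT, value REAL);")

    ; ... import the CSV rows here (batched INSERTs wrapped in BEGIN/COMMIT) ...

    ; Let SQL do the per-retailer summing instead of looping in AutoIt
    Local $aResult, $iRows, $iCols
    _SQLite_GetTable2d($hDB, "SELECT retailer, SUM(value) FROM sales GROUP BY retailer;", _
            $aResult, $iRows, $iCols)
    _ArrayDisplay($aResult)

    _SQLite_Close($hDB)
    _SQLite_Shutdown()
    ```

    The upside of this approach is that the data no longer has to fit in the script's address space at all; SQLite pages it from disk as needed.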
    1 point
  9. Could you give more context on what you are trying to do with the data? Do you need all of the data at once, or can you process one line at a time? Here is a modified example from the help file that loops through each line of the file using FileReadLine.

    #include <FileConstants.au3>
    #include <MsgBoxConstants.au3>
    #include <WinAPIFiles.au3>

    Example()

    Func Example()
        Local $timer = TimerInit()
        ; Create a constant variable in Local scope of the filepath that will be read/written to.
        Local Const $sFilePath = "testFile.csv"

    ;~  Local $sStr = ""
    ;~  For $i = 0 To 5000000
    ;~      $sStr &= "00000000000000000000000000000000000000000000000" & @CRLF
    ;~  Next
    ;~  ; Create a temporary file to read data from.
    ;~  If Not FileWrite($sFilePath, $sStr) Then
    ;~      MsgBox($MB_SYSTEMMODAL, "", "An error occurred whilst writing the temporary file.")
    ;~      Return False
    ;~  EndIf
    ;~  ConsoleWrite(TimerDiff($timer) & @CRLF)
    ;~  $timer = TimerInit()

        ; Open the file for reading and store the handle to a variable.
        Local $hFileOpen = FileOpen($sFilePath, $FO_READ)
        If $hFileOpen = -1 Then
            MsgBox($MB_SYSTEMMODAL, "", "An error occurred when reading the file.")
            Return False
        EndIf

        ; Loop through the file lines
        Local $sFileLine
        While 1
            ; Read the next line
            $sFileLine = FileReadLine($hFileOpen)
            If @error Then ExitLoop
            ; Process this line
        WEnd
        ConsoleWrite(TimerDiff($timer) & @CRLF)

        ; Close the handle returned by FileOpen.
        FileClose($hFileOpen)
    EndFunc   ;==>Example
    1 point