Opened 15 years ago
Closed 15 years ago
#1411 closed Feature Request (Rejected)
FileReadLine enhancement to enable more efficient reading of very large files.
| Reported by: | Bowmore | Owned by: | |
|---|---|---|---|
| Milestone: | | Component: | AutoIt |
| Version: | | Severity: | None |
| Keywords: | FileReadLine Large Files | Cc: | |
Description
I would like to suggest a possible enhancement to the FileReadLine() function to improve the speed of processing very large text files (> 500 MB, millions of rows) which are too large for AutoIt to load into memory in one chunk using FileRead() or _FileReadToArray(). What I would like is an extra optional parameter, FileReadLine("filehandle/filename"[, line[, NumLines = 1]]), so that it would be possible to read a file in, for example, 50000-line chunks with each call to FileReadLine(), rather than having to call FileReadLine() once for every line in the file.
Example of how I envisage it would be used:
```autoit
$sFile = "C:\data\very large file.txt"
$hFile = FileOpen($sFile, 0)
$iStartLine = 1
$iNumLines = 50000
$sData = FileReadLine($hFile, $iStartLine, $iNumLines)
While Not @error
    $aData = StringSplit($sData, @CRLF, 1)
    ; ...
    ; Process $aData
    ; ...
    $iStartLine += $iNumLines
    $sData = FileReadLine($hFile, $iStartLine, $iNumLines)
WEnd
FileClose($hFile)
```
Attachments (0)
Change History (1)
comment:1 Changed 15 years ago by Valik
- Resolution set to Rejected
- Status changed from new to closed
You can easily do this yourself. Use FileRead() to read in fixed block sizes, then use StringSplit() to break the data into an array of lines. Always prepend the last line of the previous read to the next read, since that line was probably split in the middle; that way you won't miss any data.
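The suggested approach could be sketched as follows. This is a minimal illustration, not tested code; the file name and block size are assumptions, and the key point is carrying the possibly incomplete last line of each block over to the next read:

```autoit
$sFile = "C:\data\very large file.txt" ; assumed path for illustration
$iBlockSize = 1048576                  ; read roughly 1 MB per FileRead() call
$hFile = FileOpen($sFile, 0)
$sCarry = ""                           ; possibly incomplete last line of the previous block
While 1
    $sBlock = FileRead($hFile, $iBlockSize)
    If @error Then ExitLoop            ; @error is set at end of file
    $aLines = StringSplit($sCarry & $sBlock, @CRLF, 1)
    ; The final element may be a line cut off mid-way, so hold it back
    ; and prepend it to the next block instead of processing it now.
    $sCarry = $aLines[$aLines[0]]
    For $i = 1 To $aLines[0] - 1
        ; ...
        ; Process $aLines[$i]
        ; ...
    Next
WEnd
If $sCarry <> "" Then
    ; Process the final line left over in $sCarry
EndIf
FileClose($hFile)
```

With flag 1, StringSplit() treats @CRLF as a single delimiter string, so each array element is one full line; element 0 holds the count.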