Opened 16 years ago

Closed 16 years ago

#1411 closed Feature Request (Rejected)

FileReadLine enhancement to enable more efficient reading of very large files.

Reported by: Bowmore
Owned by:
Milestone:
Component: AutoIt
Version:
Severity: None
Keywords: FileReadLine Large Files
Cc:

Description

I would like to suggest a possible enhancement to the FileReadLine() function to improve the speed of processing very large text files (> 500 MB, millions of rows) that are too large for AutoIt to load into memory in one chunk using FileRead() or _FileReadToArray(). What I would like is an extra optional parameter, FileReadLine("filehandle/filename"[, line[, NumLines = 1]]), so that it would be possible to read a file in, for example, 50000-line chunks with each call to FileReadLine(), rather than having to call FileReadLine() for every line in the file.

Example of how I envisage it would be used:

$sFile = "C:\data\very large file.txt"
$hFile = FileOpen($sFile, 0) ; 0 = read mode
$iStartLine = 1
$iNumLines = 50000 ; proposed new parameter: number of lines to read per call
$sData = FileReadLine($hFile, $iStartLine, $iNumLines)
While Not @error
  $aData = StringSplit($sData, @CRLF, 1)
  ; ...
  ; Process $aData
  ; ...
  $iStartLine += $iNumLines
  $sData = FileReadLine($hFile, $iStartLine, $iNumLines)
Wend
FileClose($hFile)

Attachments (0)

Change History (1)

comment:1 by Valik, 16 years ago

Resolution: Rejected
Status: new → closed

You can easily do this yourself. Use FileRead() to read the file in fixed-size blocks, then use StringSplit() to break the data into an array of lines. Always prepend the last line of the previous read to the next read, since that line was probably split in the middle; that way you won't miss any data.
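A minimal sketch of the approach described above, assuming a text file read in roughly 1 MB chunks (the file path, chunk size and processing loop are placeholders, not part of the ticket): read a block with FileRead(), split it on @CRLF with StringSplit(), and carry the last, possibly partial, line into the next block.

$sFile = "C:\data\very large file.txt" ; placeholder path
$hFile = FileOpen($sFile, 0)           ; 0 = read mode
$iChunkSize = 1048576                  ; characters per FileRead() call (~1 MB)
$sCarry = ""                           ; possibly incomplete last line of the previous chunk
While 1
    $sChunk = FileRead($hFile, $iChunkSize)
    If @error Then ExitLoop            ; @error is set on end of file
    $aLines = StringSplit($sCarry & $sChunk, @CRLF, 1)
    $sCarry = $aLines[$aLines[0]]      ; keep the last element; it may have been cut mid-line
    For $i = 1 To $aLines[0] - 1
        ; Process $aLines[$i]
    Next
WEnd
If $sCarry <> "" Then
    ; Process the final line left in the carry buffer
EndIf
FileClose($hFile)

Carrying the last split element forward also handles the case where a chunk boundary falls between the @CR and @LF of a line ending, since the pair is reunited when the carry is prepended to the next chunk.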
