﻿id	summary	reporter	owner	description	type	status	milestone	component	version	severity	resolution	keywords	cc
1411	FileReadLine enhancement to enable more efficient reading of very large files.	Bowmore		"I would like to suggest a possible enhancement to the FileReadLine() function to improve the speed of processing very large text files (>500 MB, millions of rows) which are too large for AutoIt to load into memory in one chunk using FileRead() or _FileReadToArray(). What I would like is an extra optional parameter, FileReadLine(""filehandle/filename""[, line[, NumLines = 1]]), so that it would be possible to read a file in, for example, 50,000-line chunks with each call to FileReadLine(), rather than having to call FileReadLine() once for every line in the file.

Example of how I envisage it would be used:
{{{

$sFile = ""C:\data\very large file.txt""
$hFile = FileOpen($sFile, 0)
$iStartLine = 1
$iNumLines = 50000
$sData = FileReadLine($hFile, $iStartLine, $iNumLines)
While Not @error ; @error is set by the most recent FileReadLine() call
  $aData = StringSplit($sData, @CRLF, 1)
  ; ...
  ; Process $aData
  ; ...
  $iStartLine += $iNumLines
  $sData = FileReadLine($hFile, $iStartLine, $iNumLines)
WEnd
FileClose($hFile)
}}}"	Feature Request	closed		AutoIt		None	Rejected	FileReadLine Large Files	
