mark2004 Posted October 27, 2009

Hi all. I had heard so much about SQLite being fast, so I was considering converting my existing script from text files to a database. But when I wrote some simple scripts to compare speed, I was surprised to see that the SQLite functions were WAY slower than just writing to text files. The code below writes 2 rows to a table. The INSERT statement takes about 280 ms; if I increase it to 4 rows, that goes up to 385 ms. Using the good old _FileWriteFromArray() it takes about 3 ms. What gives??? Am I doing something wrong? The whole world seems to agree that SQLite is faster than text files. Is it something to do with the way AutoIt incorporates it, or is it an underlying SQLite issue? Thanks for any light you can shed on this mystery...

#include <sqlite.au3>
#include <sqlite.dll.au3>

$start = TimerInit()
Dim $timer[6]
_SQLite_Startup()
$timer[1] = TimerDiff($start)
_SQLite_Open("C:\Mark\Programs\Database\test.db")
$timer[2] = TimerDiff($start)
_SQLite_Exec(-1, "Insert into tblTest8 values ('1',2,3);" & _
        "Insert into tblTest8 values (Null,50,6);")
$timer[3] = TimerDiff($start)
_SQLite_Close()
$timer[4] = TimerDiff($start)
_SQLite_Shutdown()
$timer[5] = TimerDiff($start)
ConsoleWrite(@CRLF & "1- " & $timer[1] & @CRLF)
ConsoleWrite(@CRLF & "2- " & $timer[2] & @CRLF)
ConsoleWrite(@CRLF & "3- " & $timer[3] & @CRLF)
ConsoleWrite(@CRLF & "4- " & $timer[4] & @CRLF)
ConsoleWrite(@CRLF & "5- " & $timer[5] & @CRLF)
ame1011 Posted October 27, 2009

I'm not sure you understand the benefits of using a database. If your data is simple enough to store in text files, do so. In my case, my latest project currently has around 10 tables, some with as many as 12 columns. I can select any of the stored values using complex queries, and I can perform most operations within the SQL query itself. This is where a database outperforms text files.
mark2004 Posted October 27, 2009 (Author)

My project would have many tables with up to around 20 columns per table. I may have to load as many as 100 or so records per screen refresh, and I would write only 1 record at a time. I was just trying to get a handle on how much faster the data could be read/written. I can definitely see the advantage of complex queries; I have to search through many text files to do something similar, so no question there. But it looks like the actual read/write of a given record would definitely be faster with pipe-delimited text files. SQLite doesn't seem to have an advantage even for very large data insertions.
Zedna Posted October 27, 2009 (edited)

Use SQLite when you have many rows; then you will see its power. Also use transactions to optimize for speed. If you don't explicitly call BEGIN/COMMIT, SQLite internally issues a COMMIT after each command, which slows down the whole process.

_SQLite_Startup()
_SQLite_Open("C:\Mark\Programs\Database\test.db")
_SQLite_Exec(-1, "BEGIN;")
_SQLite_Exec(-1, "Insert into tblTest8 values ('1',2,3);" & _
        "Insert into tblTest8 values (Null,50,6);")
; ... many other insert/update/delete commands here ...
_SQLite_Exec(-1, "COMMIT;")
_SQLite_Close()
_SQLite_Shutdown()

EDIT: Also, as opposed to plain text files, you can quickly pick out a set of desired rows from a large number of records with SELECT ... FROM ... WHERE ...

Edited October 27, 2009 by Zedna
leos Posted October 27, 2009

Every insert operation is treated by SQLite as a transaction, and after each transaction SQLite flushes all data to the database file, so each insertion needs extra time to complete. Solution: group multiple insertions into a single transaction, something like this:

<pseudo code>
BEGIN TRANSACTION
Insert
Insert
...
COMMIT
</pseudo code>

This mechanism speeds up insertions a lot. Details about this technique are in the SQLite documentation: http://www.sqlite.org/
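The per-commit cost described above is not specific to AutoIt; it can be seen from any SQLite binding. Below is a minimal sketch using Python's stdlib sqlite3 module (table name `t` and the row counts are illustrative, not from the thread), inserting the same rows once in autocommit mode and once inside a single BEGIN/COMMIT transaction:

```python
import sqlite3, tempfile, os, time

def time_inserts(db_path, rows, batch):
    """Insert rows one statement at a time; optionally wrap them in one transaction."""
    con = sqlite3.connect(db_path)
    con.isolation_level = None          # autocommit mode; we issue BEGIN/COMMIT manually
    con.execute("CREATE TABLE t (a, b, c)")
    start = time.perf_counter()
    if batch:
        con.execute("BEGIN")
    for r in rows:
        con.execute("INSERT INTO t VALUES (?, ?, ?)", r)
    if batch:
        con.execute("COMMIT")
    elapsed = time.perf_counter() - start
    n = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    con.close()
    return elapsed, n

tmp = tempfile.mkdtemp()
rows = [(i, i * 2, i * 3) for i in range(500)]
# Without BEGIN/COMMIT every INSERT is its own transaction and gets its own flush.
t_auto, n_auto = time_inserts(os.path.join(tmp, "auto.db"), rows, batch=False)
# With one enclosing transaction there is a single flush at COMMIT.
t_one, n_one = time_inserts(os.path.join(tmp, "one.db"), rows, batch=True)
print(n_auto, n_one)  # 500 500
```

On a disk-backed database the batched variant is typically faster by an order of magnitude or more, for exactly the reason leos gives: one flush instead of one per statement.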
DaHack Posted February 28, 2013

Before anyone says anything, I know this is an old post, but I stumbled upon it and the answer was not given. AutoIt is slower if you use the SQLite UDFs. It is faster to create .sql files and then run a shell command with sqlite3. For instance, I parsed an output file from another program that had 7000 lines, wrote an .sql file, and then ran a shell command to load it into the SQLite database. This was 80% faster than using _SQLite_Exec(). In both tests I used BEGIN and COMMIT. If you are inputting a lot of data, create a text file and run a shell command:

RunWait(@ComSpec & " /c sqlite3 my_sqlite_db.db3 < temp_commit.sql", "", @SW_HIDE)

Having said that, I use the SQLite UDFs all the time for smaller transactions. I generally use them for storing all my data in my applications, but AutoIt is not known for its performance; it is known for its ease of use and massive support community.
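The batch-script idea above can be sketched without the sqlite3 CLI installed. Here is a rough Python analog (file names and the 100-row batch are made up for illustration): it builds one .sql file containing the whole batch wrapped in BEGIN/COMMIT, then applies it in one call with `executescript()`, which plays the role of `sqlite3 my_sqlite_db.db3 < temp_commit.sql`:

```python
import sqlite3, tempfile, os

# Build one SQL script holding the entire batch, as the post does for the CLI.
lines = ["BEGIN;"]
for i in range(100):
    lines.append(f"INSERT INTO tblTest8 VALUES ({i}, {i * 2}, {i * 3});")
lines.append("COMMIT;")

tmpdir = tempfile.mkdtemp()
db_path = os.path.join(tmpdir, "test.db")          # stand-in for my_sqlite_db.db3
sql_path = os.path.join(tmpdir, "temp_commit.sql") # stand-in for the post's file
with open(sql_path, "w") as f:
    f.write("\n".join(lines))

# The post pipes the file to the sqlite3 CLI; executescript() is the
# in-process equivalent, parsing and running the whole script in one call.
con = sqlite3.connect(db_path)
con.execute("CREATE TABLE tblTest8 (a, b, c)")
with open(sql_path) as f:
    con.executescript(f.read())
count = con.execute("SELECT COUNT(*) FROM tblTest8").fetchone()[0]
con.close()
print(count)  # 100
```

Either way, the win comes from handing SQLite one pre-built script per batch instead of crossing the scripting-language boundary once per statement.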
jchd Posted February 28, 2013

Since this dead thread has been necroed anyway: yes, the CLI (command-line interface) can prove faster in many instances, but not all. Another piece of advice is to postpone index creation until most or all of the bulk data has been inserted. Depending on the schema and column characteristics, the newer WAL journal mode can be beneficial (only with more recent sqlite3.dll versions). DB parameters (cache size and more obscure options) can also make a large difference. Loading a :memory: DB and then backing it up to disk is also a good way to speed things up vastly. So yes, invoking DLL functions from AutoIt makes it significantly slower, but there is no definite rule of thumb for optimal speed. On the other hand, bulk insertion is generally a one-time job run at DB creation, hence rarely a routine bottleneck in everyday use.
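The ":memory: DB then back up to disk" trick above can be sketched with Python's stdlib sqlite3, whose `Connection.backup()` wraps SQLite's online backup API (table name and row count here are illustrative only):

```python
import sqlite3, tempfile, os

# Build the database entirely in memory: no disk I/O during the bulk insert.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE t (a, b)")
mem.executemany("INSERT INTO t VALUES (?, ?)", [(i, i * i) for i in range(500)])
mem.commit()

# Then copy the finished database to disk in one sequential pass,
# instead of paying per-transaction flushes on a disk file.
db_path = os.path.join(tempfile.mkdtemp(), "bulk.db")
disk = sqlite3.connect(db_path)
mem.backup(disk)
disk.close()
mem.close()

# The on-disk copy is a complete, independent database.
check = sqlite3.connect(db_path)
n = check.execute("SELECT COUNT(*) FROM t").fetchone()[0]
check.close()
print(n)  # 500
```

As jchd notes, this mainly pays off for one-time bulk loads at DB creation; for routine single-row writes the ordinary on-disk connection is fine.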