sulcarter95 Posted yesterday at 09:05 AM

I'm working on an AutoIt script to automate file processing tasks, including renaming, moving, and extracting data from text files. However, I'm running into issues handling large batches efficiently: the script works for a small number of files, but performance slows down when processing hundreds at once. Are there any best practices for optimizing file handling in AutoIt? Should I use multithreading or another approach to improve performance? Any guidance or sample scripts would be greatly appreciated!
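Without seeing the script, a few things usually help before reaching for concurrency: enumerate the batch once, keep output handles open across the loop, read each file in a single call, and avoid per-file overhead inside the loop. Below is a minimal sketch of that shape; the folder names, file mask, and the "ID=" extraction pattern are made-up placeholders, not anything from the original post.

```autoit
#include <File.au3>
#include <FileConstants.au3>

; Placeholder folders and mask for illustration only.
Global Const $g_sSourceDir = "C:\Incoming"
Global Const $g_sDoneDir   = "C:\Processed"

DirCreate($g_sDoneDir) ; make sure the destination exists

; Enumerate the whole batch once, instead of scanning the folder repeatedly.
Global $g_aFiles = _FileListToArrayRec($g_sSourceDir, "*.txt", $FLTAR_FILES, $FLTAR_NORECUR, $FLTAR_NOSORT, $FLTAR_FULLPATH)
If @error Then Exit MsgBox(16, "Batch", "No files found or bad path.")

; Open the results file once, outside the loop, instead of once per input file.
Global $g_hOut = FileOpen($g_sDoneDir & "\extracted.csv", $FO_APPEND)

For $i = 1 To $g_aFiles[0]
    ; Read each file in a single call rather than line by line.
    Local $sText = FileRead($g_aFiles[$i])

    ; Hypothetical extraction: collect every "ID=<digits>" token.
    Local $aHits = StringRegExp($sText, "ID=(\d+)", 3)
    If Not @error Then
        For $sHit In $aHits
            FileWriteLine($g_hOut, $g_aFiles[$i] & "," & $sHit)
        Next
    EndIf

    ; Move the processed file; 9 = overwrite (1) + create destination dir (8).
    FileMove($g_aFiles[$i], $g_sDoneDir & "\", 9)
Next

FileClose($g_hOut)
```

If it still crawls after changes like these, the bottleneck is often disk I/O or something quadratic such as growing an array element by element inside the loop, rather than AutoIt's file functions themselves.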
RTFC Posted yesterday at 12:30 PM

Please explore "parallel/concurrent processing" UDFs in the Wiki.
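AutoIt has no native multithreading, so those UDFs generally coordinate several AutoIt processes. As a rough, hypothetical sketch of the underlying pattern (not taken from any particular UDF): the script relaunches itself a few times via /AutoIt3ExecuteScript, and each copy takes every Nth file so the slices never overlap. The folder, mask, and worker count below are placeholders.

```autoit
#include <File.au3>
#include <FileConstants.au3>

; Placeholder folder and worker count.
Global Const $g_sSourceDir = "C:\Incoming"
Global Const $g_iWorkers   = 4

If $CmdLine[0] >= 2 Then
    If $CmdLine[1] = "worker" Then
        _Worker(Number($CmdLine[2]))
        Exit
    EndIf
EndIf
_Master()

Func _Master()
    ; Launch N copies of this same script, each told which slice of the batch it owns.
    For $i = 0 To $g_iWorkers - 1
        Run('"' & @AutoItExe & '" /AutoIt3ExecuteScript "' & @ScriptFullPath & '" worker ' & $i)
    Next
EndFunc

Func _Worker($iSlice)
    Local $aFiles = _FileListToArrayRec($g_sSourceDir, "*.txt", $FLTAR_FILES, $FLTAR_NORECUR, $FLTAR_NOSORT, $FLTAR_FULLPATH)
    If @error Then Exit
    ; Interleaved slicing: worker 0 takes files 1, 5, 9, ...; worker 1 takes 2, 6, 10, ...
    For $i = 1 + $iSlice To $aFiles[0] Step $g_iWorkers
        _ProcessOneFile($aFiles[$i])
    Next
EndFunc

Func _ProcessOneFile($sPath)
    ; Placeholder for the actual rename/move/extract work.
    ConsoleWrite($sPath & @CRLF)
EndFunc
```

In practice each worker should write to its own output file (or let a Pool-style UDF handle the coordination), since several processes appending to one file will collide.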
Nine Posted yesterday at 01:29 PM

Maybe posting the code that shows the largest slowdown would help us suggest an adapted solution.
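To work out which part is worth posting, bracketing each stage with TimerInit()/TimerDiff() is a quick way to see where the time goes. A tiny, hypothetical example (the file path and regex are placeholders):

```autoit
Local $sSomeFile = "C:\Incoming\sample.txt" ; placeholder path

Local $hTimer = TimerInit()
Local $sText = FileRead($sSomeFile)                 ; stage 1: read
ConsoleWrite("Read:    " & Round(TimerDiff($hTimer), 1) & " ms" & @CRLF)

$hTimer = TimerInit()
Local $aHits = StringRegExp($sText, "ID=(\d+)", 3)  ; stage 2: extract
ConsoleWrite("Extract: " & Round(TimerDiff($hTimer), 1) & " ms" & @CRLF)
```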