I use AutoIt mainly in the context of (heavily numerical) scientific research, and frankly, I wouldn't touch Excel with a bargepole, for a number of reasons. Firstly, it imposes highly restrictive limits on the size and shape of data sets. Secondly, although internally it uses 15-digit precision (binary double precision) in calculations, once a workbook is saved, accuracy beyond four decimal places can be lost; on top of that, errors due to premature rounding are common and difficult to trace. Furthermore, the built-in statistics are weak, and the graphs are low-res, more suitable for a business PowerPoint presentation than for a scientific publication. The problem is that it tries (and claims) to be a jack of all trades, so it ends up doing most things poorly.

I grant you that worksheet editing is quick and intuitive, so it's perfect for accounting, taxes and the like. But when I make data edits I need to keep a record of what changed, so I keep separate AutoIt scripts that perform explicit data preprocessing for me, plus a personal worklog text file to keep track of what I did when, and which script produces what (type of) output.
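To give an idea of what I mean by explicit, traceable preprocessing, here's a minimal sketch; the file names, column layout and filter rule are placeholders I made up for the example, not my actual workflow:

```autoit
#include <File.au3>

; Placeholder file names for this sketch
Global Const $sRawFile   = "raw_run42.csv"    ; hypothetical raw data
Global Const $sCleanFile = "clean_run42.csv"  ; hypothetical cleaned output
Global Const $sWorklog   = "worklog.txt"      ; running record of edits

; Read the raw file; element [0] holds the line count
Local $aLines
_FileReadToArray($sRawFile, $aLines)
If @error Then
    MsgBox(16, @ScriptName, "Cannot read " & $sRawFile)
    Exit 1
EndIf

Local $hOut = FileOpen($sCleanFile, 2)  ; mode 2 = overwrite
Local $iKept = 0
For $i = 2 To $aLines[0]                ; skip the header row
    Local $aFields = StringSplit($aLines[$i], ",")
    ; Example rule only: drop rows with a missing value in column 3
    ; (nested Ifs because AutoIt does not short-circuit And)
    If $aFields[0] >= 3 Then
        If $aFields[3] <> "" Then
            FileWriteLine($hOut, $aLines[$i])
            $iKept += 1
        EndIf
    EndIf
Next
FileClose($hOut)

; Append an entry to the worklog so the edit can be traced later
; (FileWriteLine with a file name opens the file in append mode)
FileWriteLine($sWorklog, @YEAR & "-" & @MON & "-" & @MDAY & " " & @HOUR & ":" & @MIN _
        & "  " & @ScriptName & ": " & $sRawFile & " -> " & $sCleanFile _
        & "  (" & $iKept & " of " & ($aLines[0] - 1) & " data rows kept)")
```

The point isn't the particular filter rule; it's that the raw file is never touched, and every derived file is accounted for in the worklog.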
Of course I don't know what kind of modelling you do, or what your data sets actually look like. But if they can get large (>2GB), or if numerical accuracy or statistical significance is important, I wouldn't rely on an MS-Office tool to do science with. I'd say use:
- a fast matrix/mathematical environment for computing with large data sets (I use Eigen, which is free; AutoIt wrapper library here);
- a dedicated statistics package for EDA and testing (I use Minitab, which is not free but intuitive, and SAS, which is not at all intuitive; free version here);
- a graphics package that lets you control dpi for publishable figures (I mainly use gmt, also free, though it may not be suited to your purpose).
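Whichever external packages you end up with, you can drive them from AutoIt in the same way, so the worklog stays the single record of which command produced which figure or table. A rough sketch; the command line is a made-up placeholder, so substitute whatever your plotting or stats package actually expects:

```autoit
; Sketch only: run an external tool and record the call in the worklog.
; The command string below is a placeholder, not a real program or real options.
Global Const $sWorklog = "worklog.txt"
Local $sCmd = '"C:\tools\myplotter.exe" --dpi 600 clean_run42.csv fig_run42.png'

Local $iExit = RunWait($sCmd, @WorkingDir, @SW_HIDE)
FileWriteLine($sWorklog, @YEAR & "-" & @MON & "-" & @MDAY & "  " & @ScriptName _
        & ": ran [" & $sCmd & "], exit code " & $iExit)
```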