Palestinian Posted April 24, 2014
Hello everyone, I'm thinking about starting a new project that will be used to 'import' a huge amount of data (patient cases) from a certain website, to be used for searching purposes. Each case has a unique case ID appended to the URL, for example: http://raisunhcr.org/user/Member_Biodata.aspx?caseid=237602 I've been trying to contact the UNHCR team in the hope of getting the database from them, but their number is busy at the moment. What is my best approach for such a project? Which functions might do the job better than others? Is it recommended to store the data in Excel sheets and work with them, or is there a better/faster way to accomplish that task? I considered SQLite, but it sounded way too much like gibberish to me. At the moment there are 326,621 cases that I need to import, and they increase by roughly 300 a day. I will also need to check whether any data has changed in the files I already imported by comparing them with the website version (assistance gets added to cases every day).
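(For reference, the download step on its own is only a few lines of AutoIt. Below is a minimal sketch, not anything posted in this thread, of fetching a single case page; the case ID is just the example from the URL above, and it assumes the page loads without any login or session handling.)

; Minimal sketch: download one case page by its case ID.
; Assumes the page is reachable without a login; adjust if the site needs a session.
Local $iCaseId = 237602 ; example case ID from the URL above
Local $sUrl = "http://raisunhcr.org/user/Member_Biodata.aspx?caseid=" & $iCaseId

Local $dPage = InetRead($sUrl, 1) ; 1 = force a fresh download, skip the cache
If @error Or Not BinaryLen($dPage) Then
    ConsoleWrite("Download failed for case " & $iCaseId & @CRLF)
Else
    Local $sHtml = BinaryToString($dPage) ; raw bytes -> searchable text
    ConsoleWrite("Got " & StringLen($sHtml) & " characters for case " & $iCaseId & @CRLF)
EndIf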
JohnOne Posted April 24, 2014
I'd leave the database in whatever format it already is.
Palestinian Posted April 24, 2014
I just got off the phone with them; they don't mind the idea, but they won't give me the database for security reasons (I already have access to the information :/), so now I will be creating my own database using whichever format I see fit.
JohnOne Posted April 24, 2014
Can you not just query the database online?
Palestinian Posted April 24, 2014
I can, but a normal search takes some time depending on the server load, the internet connection, and the accuracy of the search terms. The reason I'm doing this is to have an offline search engine that only needs an internet connection once to compare the data, then works offline for the rest of the day. The data doesn't even need to be updated daily; updating it once a week works just as well.
JohnOne Posted April 24, 2014
I'd use SQL of some flavour then - SQLite, MySQL, etc. It will probably take you a week to update a third of a million records from individual web pages, whatever format you use.
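(A minimal sketch of what that could look like with the SQLite UDF that ships with AutoIt. The file name "cases.db" and the column names are placeholders, not anything from this thread - adjust them to whatever fields actually get scraped.)

#include <SQLite.au3>

_SQLite_Startup()                      ; load sqlite3.dll
Local $hDB = _SQLite_Open("cases.db")  ; creates the file if it doesn't exist

; placeholder schema - one row per case, keyed on the case ID
_SQLite_Exec($hDB, "CREATE TABLE IF NOT EXISTS cases (" & _
        "caseid INTEGER PRIMARY KEY," & _
        "name TEXT," & _
        "last_checked TEXT," & _
        "raw_html TEXT);")

_SQLite_Close($hDB)
_SQLite_Shutdown()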
Palestinian Posted April 24, 2014
I will try to read about SQLite again; hopefully I'll have a better mindset for it this time. I know it might take some time to fully update the records, but as I said, a weekly update works just fine as well. Thank you.
orbs Posted April 24, 2014
If searching is your goal, then possibly store the text of each webpage in a text file and have it indexed by the operating system. If you are dealing with a database of O(10^6) textual records, each of which fits in a single webpage, that shouldn't take much space.
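(A rough sketch of that idea - one plain-text file per case so the OS indexer, e.g. Windows Search, can handle lookups. The folder name is a placeholder, and the regex used to strip the markup is only a crude approximation.)

Local $sDir = @ScriptDir & "\cases"
DirCreate($sDir)

Local $iCaseId = 237602
Local $sHtml = BinaryToString(InetRead("http://raisunhcr.org/user/Member_Biodata.aspx?caseid=" & $iCaseId, 1))

; drop the markup so only the readable text gets indexed
Local $sText = StringRegExpReplace($sHtml, "<[^>]*>", " ")

Local $hFile = FileOpen($sDir & "\" & $iCaseId & ".txt", 2) ; 2 = overwrite existing file
FileWrite($hFile, $sText)
FileClose($hFile)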
Palestinian Posted April 24, 2014
Simply storing the text of each webpage won't work. A search doesn't return a unique page if the case size is bigger than 1 (case size is determined by the number of family members). The only way to get a unique result is to search for individual IDs and then use the export button, but that only exports the main information (name, ID number, DoB); it doesn't export the data that is actually needed (health assistance, case assistant, medical coverage, payments, etc.). The problem with the search function on the website is that it sends you to the PA's case page, not the actual case you are looking for; you then have to select the name of the patient from the family's dropdown menu, which in turn updates the information on the page.
jchd Posted April 24, 2014 (edited)
SQLite is most probably the best tool for your use case - much, much preferable to Excel. Can you dump the database, or at least capture rows or tables? Anyway, once you have a workable input format like a .CSV or something like HTML tables, it will be pretty easy to design a database schema to host your data efficiently. Details will be needed to give more precise advice/guidance, but in any case there is a solution waiting. Edit: to be fair, I'm known to be an almost unconditional SQLite fan (with robust reasons).
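(For the insert/refresh side, a small sketch that reuses the placeholder "cases" table from the earlier sketch. _SQLite_Escape() quotes the scraped strings safely, and INSERT OR REPLACE lets a weekly run overwrite rows that already exist instead of failing on duplicate case IDs. The function name and parameters are only illustrative.)

#include <SQLite.au3>

; Store or refresh one scraped case in the placeholder "cases" table.
Func _StoreCase($hDB, $iCaseId, $sName, $sHtml)
    Local $sSQL = "INSERT OR REPLACE INTO cases (caseid, name, last_checked, raw_html) VALUES (" & _
            $iCaseId & "," & _
            _SQLite_Escape($sName) & "," & _
            _SQLite_Escape(@YEAR & "-" & @MON & "-" & @MDAY) & "," & _
            _SQLite_Escape($sHtml) & ");"
    Return _SQLite_Exec($hDB, $sSQL)
EndFunc

It would be called as _StoreCase($hDB, 237602, "Some Name", $sHtml) after opening the database as shown above.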