I am required to log on to a secure web site to check client eligibility. After logging on, I enter a date range and an ID number. The resulting web page yields "not found," "not eligible," or a table from which I extract a name, address, DOB, date, and county. I repeat the process thousands of times, incrementing the ID number by 1 each time. I am interested in a Perl script that would automate this for me. The data could go into a CSV file, which I could then import into MS Access.
I currently use a macro to enter the URL into a browser and then scrape the data from the resulting web page. A sample URL would look like:
[url removed, login to view];PatientPCN=524994488&pn=usercontrols%2fAcuteCare%2fEligibilityVerificationResults&ProviderNo=008330501&EligibilityFrom=4%2f1%2f2009&PatientSSN=&PatientFName=&PatientDOB=&PatientLName=
with the only change being the number after "PCN=". I would need to enter a date range and a beginning ID number before the script runs. "4%2f1%2f2009" in the URL is the URL-encoded date 04/01/2009; I would want to be able to specify this date.
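To illustrate what I mean, the URL construction could look something like the sketch below (shown in Python only for illustration; a Perl version with CGI-style escaping would work the same way). The parameter names come from the sample URL above, but the base URL is a hypothetical placeholder, since the real address was removed from this posting.

```python
from urllib.parse import urlencode

# Hypothetical placeholder -- the real base URL was removed from the posting.
BASE_URL = "https://example.com/eligibility"

def build_url(pcn, from_date, provider_no="008330501"):
    """Build the eligibility-check URL for one patient PCN.

    from_date is a plain string like "4/1/2009"; urlencode escapes the
    slashes (as %2F), matching the encoded date in the sample URL.
    """
    params = {
        "PatientPCN": pcn,
        "pn": "usercontrols/AcuteCare/EligibilityVerificationResults",
        "ProviderNo": provider_no,
        "EligibilityFrom": from_date,
        "PatientSSN": "",
        "PatientFName": "",
        "PatientDOB": "",
        "PatientLName": "",
    }
    return BASE_URL + "?" + urlencode(params)

print(build_url(524994488, "4/1/2009"))
```

Only the `PatientPCN` value changes from one lookup to the next, so the script would call this once per ID number.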
I will be collecting a large amount of data and expect the process may need to run for several days.
I would prefer that the script look up ID numbers from an MS Access 2003 database and record the results there, rather than incrementing an ID number and writing to a CSV file, provided it does not add too much cost or complexity. However, I am on a tight budget and would settle for the latter.
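The simpler CSV version of the workflow could be sketched as follows (again in Python for illustration; a Perl equivalent would use something like WWW::Mechanize and HTML::TableExtract). Everything here is an assumption: the stand-in fetcher, the field extractor, and the exact wording it looks for on the result page would all need to match the real site.

```python
import csv
import re

def classify(page_text):
    """Return 'not found', 'not eligible', or 'eligible' for one result page."""
    lowered = page_text.lower()
    if "not found" in lowered:
        return "not found"
    if "not eligible" in lowered:
        return "not eligible"
    return "eligible"

def extract_fields(page_text):
    """Hypothetical extractor: pulls the five fields from labeled lines.

    The real page presents an HTML table, so a production script would
    use a proper HTML parser instead of this regex sketch.
    """
    fields = []
    for label in ("Name", "Address", "DOB", "Date", "County"):
        m = re.search(label + r":\s*(.+)", page_text)
        fields.append(m.group(1).strip() if m else "")
    return fields

def scrape_range(start_pcn, count, fetch, out_path):
    """Check `count` consecutive PCNs starting at start_pcn.

    `fetch` is any callable mapping a PCN to the result-page text (in
    real use, an authenticated HTTP GET against the eligibility site);
    rows for eligible patients are appended to the CSV at out_path.
    """
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["PCN", "Name", "Address", "DOB", "Date", "County"])
        for pcn in range(start_pcn, start_pcn + count):
            page = fetch(pcn)
            if classify(page) != "eligible":
                continue
            writer.writerow([pcn] + extract_fields(page))

# Demo with a stand-in fetcher (no network, fabricated sample data).
def fake_fetch(pcn):
    if pcn % 3 == 0:
        return "Patient not found"
    if pcn % 3 == 1:
        return "Patient is not eligible"
    return ("Name: Jane Doe\nAddress: 1 Main St\nDOB: 1/1/1970\n"
            "Date: 4/1/2009\nCounty: Clark")

scrape_range(524994488, 3, fake_fetch, "eligibility.csv")
```

The Access-backed variant would replace the `range(...)` loop with a query for pending ID numbers and the CSV writer with inserts back into the database (via ODBC), but the fetch-classify-extract structure stays the same.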