I need someone familiar with ASP and PHP to point me towards a solution (there is likely to be software out there; I just can't find it) that will let me scrape (i.e. run batch or automated queries against) records from a public database sitting on a website. One of several examples I could give: see [url removed, login to view]
This is a Spanish government website that lists all insurance agents in Spain, so that members of the public can check whether a particular agent is legitimate. I need to download the whole database for some analysis I am doing on the industry there.

If you leave all the fields blank and click the left-hand grey shaded button "Buscar", you will see there are about 99,600 records, returned in groups of 10 with hyperlinks to the full record. Obviously I could sit there for three years and download the lot one by one. But is there a way to automate this and dump all the records into an HTML or text file so they can be processed and pulled into Excel? Clean formatting is not important; I can tidy it up myself.

IMPORTANT NOTE: this particular website does NOT return an individual URL with each query. If it did, I could use any number of website downloaders to do this, but because individual URLs are not returned I haven't worked out a way around it. There are several other websites where I have the same problem: ASP or PHP databases that can only be queried through forms.
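For anyone assessing the job: the missing per-record URLs usually mean the search form submits via HTTP POST, and a script can replay that POST with a varying pagination parameter. Below is a minimal sketch of the idea in Python. The URL, form field names, and the pagination parameter are all invented placeholders; the real ones would have to be read out of the form's HTML (or from the browser's network inspector) for the actual site.

```python
# Hedged sketch: automating a POST-based search form whose results pages
# all share one URL. Every URL and field name here is a hypothetical
# placeholder, NOT taken from the real Spanish registry site.
import urllib.parse
import urllib.request

SEARCH_URL = "https://example.gob.es/registro/buscar"  # placeholder URL


def page_offsets(total_records: int, page_size: int = 10) -> list:
    """Return the start offset of every results page needed to cover the set."""
    return list(range(0, total_records, page_size))


def fetch_results_page(offset: int) -> str:
    """POST the (blank) search form with a pagination offset; return raw HTML."""
    form = {
        "nombre": "",           # hypothetical field: blank to match all records
        "provincia": "",        # hypothetical field
        "inicio": str(offset),  # hypothetical pagination parameter
    }
    data = urllib.parse.urlencode(form).encode("utf-8")
    req = urllib.request.Request(SEARCH_URL, data=data, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")


if __name__ == "__main__":
    # ~99,600 records in pages of 10 means roughly 9,960 POST requests;
    # append each raw page to one file for later cleanup in Excel.
    for off in page_offsets(99_600):
        page_html = fetch_results_page(off)
        with open("records.html", "a", encoding="utf-8") as out:
            out.write(page_html)
```

Off-the-shelf tools that can record and replay form POSTs like this exist as well, which is really what the project is asking for; the sketch just shows that the lack of per-record URLs is not a fundamental obstacle.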
What I am looking for is simply for someone TO REFER ME TO A PARTICULAR TOOL that will solve the problem. If the solution requires custom programming, I will set that up as a separate project.