I need some scripts, using the Google API or cURL, to capture Google search results and store them in a DB.
I perform advanced searches and then need to store as many of the results as possible (Title, URL, Timestamp, PageRank).
My first question: if there are more than 100,000 results, can we capture all of them? How many results can we retrieve before Google blocks us?
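For reference, Google's Custom Search JSON API only serves up to 10 results per request and about 100 results per query in total, so huge result sets would have to come from many narrower queries. A minimal sketch of fetching and storing one query's results might look like this (the `search_results` table layout and function names are my assumptions, not part of the spec):

```python
import sqlite3
import time
import urllib.parse

# Custom Search JSON API endpoint; api_key and cx come from the Google console.
API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def page_starts(wanted):
    """Start indices for paginated requests (10 per page),
    capped at the API's ~100-results-per-query limit."""
    return list(range(1, min(wanted, 100), 10))

def build_url(query, start, api_key, cx):
    """Request URL for one page of results."""
    params = urllib.parse.urlencode(
        {"key": api_key, "cx": cx, "q": query, "start": start}
    )
    return f"{API_ENDPOINT}?{params}"

def store_results(conn, campaign, items):
    """Insert (title, url) pairs with a capture timestamp;
    PageRank columns are filled in by a later pass."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS search_results (
               campaign TEXT, title TEXT, url TEXT,
               captured_at REAL, pagerank INTEGER,
               pagerank_checked_at REAL, last_visited_at REAL)"""
    )
    now = time.time()
    conn.executemany(
        "INSERT INTO search_results (campaign, title, url, captured_at) "
        "VALUES (?, ?, ?, ?)",
        [(campaign, t, u, now) for t, u in items],
    )
    conn.commit()
```

An actual fetch would `urllib.request.urlopen(build_url(...))`, parse the JSON, and pass each item's `title` and `link` fields to `store_results`.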
Once we have the stored results, I need to run another script to capture the PageRank for each URL. Again, if we have over 100,000 URLs, at what point does Google block the requests?
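One caveat: Google no longer exposes Toolbar PageRank publicly (it was retired in 2016), so the per-URL lookup would have to come from a third-party metric. Whatever the source, the batch job needs polite pacing and backoff to avoid being blocked; a sketch, with `fetch_rank` as a hypothetical caller-supplied lookup:

```python
import time

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Exponential backoff: 1s, 2s, 4s, ... capped at `cap` seconds."""
    return min(base * (2 ** attempt), cap)

def check_ranks(urls, fetch_rank, delay=1.0, max_attempts=5):
    """Look up a rank for every URL, spacing requests out and backing off
    on failure. `fetch_rank` is a hypothetical function supplied by the
    caller that returns an int or raises when a request fails/is blocked."""
    ranks = {}
    for url in urls:
        for attempt in range(max_attempts):
            try:
                ranks[url] = fetch_rank(url)
                break
            except Exception:
                time.sleep(backoff_delay(attempt))
        time.sleep(delay)  # polite per-request spacing
    return ranks
```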
I'm looking to work with someone who has done this before.
The ideal process would be like this:
1. I perform an advanced search, either through Google's page or a custom form we develop.
An advanced search would be something like this:
[url removed, login to view]
2. Each search we perform needs to be saved under a new campaign name; each search will use different search terms and advanced settings.
3. Once we have all the results in the DB, we'll need to get the PageRank of each URL and store a timestamp of when we last checked it.
4. A basic webpage where we can sort the results by PageRank, with each URL clickable and opening in a new page/tab. When a URL is clicked, we need to save a timestamp of when we last visited it, so we can keep track of which URLs we've already visited.
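The steps above map to a small database layer; here is one possible sketch using SQLite (table and column names are my assumptions, not a spec — the sorted query would feed the webpage in step 4):

```python
import sqlite3
import time

SCHEMA = """
CREATE TABLE IF NOT EXISTS campaigns (
    id INTEGER PRIMARY KEY,
    name TEXT UNIQUE,              -- step 2: each search saved under a campaign name
    search_terms TEXT,
    created_at REAL
);
CREATE TABLE IF NOT EXISTS results (
    id INTEGER PRIMARY KEY,
    campaign_id INTEGER REFERENCES campaigns(id),
    title TEXT,
    url TEXT,
    pagerank INTEGER,              -- step 3
    pagerank_checked_at REAL,      -- step 3: when PageRank was last checked
    last_visited_at REAL           -- step 4: when the URL was last clicked
);
"""

def init_db(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn

def results_by_pagerank(conn, campaign_id):
    """Step 4: rows for the webpage, sorted by PageRank (highest first)."""
    return conn.execute(
        "SELECT id, title, url, pagerank, last_visited_at FROM results "
        "WHERE campaign_id = ? ORDER BY pagerank DESC",
        (campaign_id,),
    ).fetchall()

def mark_visited(conn, result_id):
    """Step 4: record when a URL was clicked so visited links can be tracked."""
    conn.execute(
        "UPDATE results SET last_visited_at = ? WHERE id = ?",
        (time.time(), result_id),
    )
    conn.commit()
```

The webpage itself could be a thin layer (any web framework) that renders `results_by_pagerank` as a table of links and calls `mark_visited` when a link is followed.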
Who can help me with this? Please base your bid on a long-term partnership. We want to expand this project with more features, so a long-term relationship is required.