I need a script that will connect to a website and collect all internal (same-domain) links. It should then fetch each URL using multicurl and load each page into a string. Next, it needs to match each page's most popular keyword(s) against a category, preferring the category with the fewest words; the categories are in a database table that has already been created.
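The post mentions multicurl (PHP's curl_multi), but does not fix a language. As a rough sketch of the link-extraction and keyword-counting steps, here is a Python standard-library version; the function names and the stopword list are my own assumptions, and the actual concurrent fetching (multicurl or a thread pool) is left out:

```python
import re
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Minimal stopword list -- an assumption; a real script would use a fuller one.
STOPWORDS = {"the", "and", "a", "to", "of", "in", "is", "for"}

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def extract_internal_links(html, base_url):
    """Return absolute URLs that share the base URL's domain."""
    parser = LinkParser()
    parser.feed(html)
    base_host = urlparse(base_url).netloc
    links = set()
    for href in parser.hrefs:
        absolute = urljoin(base_url, href)  # resolve relative links
        if urlparse(absolute).netloc == base_host:
            links.add(absolute)
    return sorted(links)

def top_keywords(text, n=3):
    """Most frequent non-stopword tokens in a page's text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]
```

In use, each fetched page string would be passed through both functions; `extract_internal_links` feeds the crawl queue and `top_keywords` feeds the category matching.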
When this is all complete, I should see the data for every page found, displayed neatly on screen. I can then click a button to insert all of the information into the database, which has also already been created.
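The "category with the least amount of words" rule could be read several ways; one plausible reading, sketched below with a hypothetical helper name, is: among categories that contain any of the page's top keywords, pick the one whose name has the fewest words:

```python
def choose_category(keywords, categories):
    """Among category names containing any top keyword,
    prefer the one with the fewest words; None if no match."""
    matches = [c for c in categories
               if any(k in c.lower().split() for k in keywords)]
    if not matches:
        return None
    return min(matches, key=lambda c: len(c.split()))
```

For example, with top keyword `"widgets"` and categories `["Blue Widgets Deluxe", "Widgets"]`, this picks `"Widgets"`. The winning bidder should confirm the intended matching rule before implementing it.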
All configurable settings should be neatly organized in a single file called [url removed, login to view], so that I can easily make configuration changes.
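The settings filename was removed from the post, so as a hypothetical illustration only, the configuration file might collect values like these (all names and defaults are assumptions):

```python
# settings.py -- hypothetical filename; the post's actual name was removed.
START_URL = "http://example.com/"   # site to crawl (placeholder)
MAX_CONCURRENT_REQUESTS = 10        # parallel fetches via multicurl
REQUEST_TIMEOUT = 15                # seconds allowed per page
TOP_KEYWORD_COUNT = 3               # keywords extracted per page

DB_HOST = "localhost"               # database connection details
DB_NAME = "crawler"
DB_USER = "crawler_user"
DB_PASSWORD = "change-me"
```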
I will give the winning bidder access to an FTP site, the URL for the FTP data, and database access.