Greetings,
We require "search string filehost links extractor" software that functions and delivers results as described below.
It may be a Windows application or a script on a Linux platform (setup and customization should be clearly explained and documented).
Note that web sites keep changing, adopting newer ways of preventing automation, so the tool must deal with the dynamic web.
## Deliverables
1. For a given search string (or multiple search strings), the software/script should crawl the web and automatically find the source page URLs and the download links related to that search string.
Download links for other search strings found on a visited web page should not be considered.
The traversal depth should be customizable. Usually download links are found on the first or second page, but in rare cases the traversal goes up to five web-page links deep.
2. The list of links for a given search string is supplied through a Google Alerts email account (only links containing that particular search string should be taken for processing).
3. The software/script should take as input a custom list of web sites (which should be appendable), search them for the search string, and extract download links based on a filehost list (a custom list of filehosts, also appendable).
4. The software/script should work with web sites that require login and handle captcha images, so that it can navigate inside those sites and discover download links.
5. It should be possible to automate execution either through Windows scheduled tasks or an inbuilt task scheduler, based on options 1, 2, 3, or 4 above.
6. Emailing capability: at the end of each run, the results found should be emailed, organized by search string name.
Example search strings: software titles, scripts, book titles, movies, MP3 music, or videos.
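To make the core of requirements 1 and 3 concrete, here is a minimal sketch of the link-extraction step: given a fetched page's HTML, keep only anchor links whose host matches an appendable filehost list. The filehost names shown are placeholder examples, and a real tool would load them from a user-editable file; crawling, search-string matching, and depth control would be layered on top of this.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Placeholder filehost list (requirement 3); in the real tool this
# would be read from an appendable text file supplied by the client.
FILEHOSTS = {"rapidgator.net", "mega.nz", "mediafire.com"}

class LinkExtractor(HTMLParser):
    """Collects href targets whose host matches the filehost list."""

    def __init__(self, filehosts):
        super().__init__()
        self.filehosts = filehosts
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                host = urlparse(value).netloc.lower()
                # Match the filehost domain itself or any subdomain of it.
                if any(host == fh or host.endswith("." + fh)
                       for fh in self.filehosts):
                    self.links.append(value)

def extract_filehost_links(html, filehosts=FILEHOSTS):
    """Returns all filehost download links found in one page's HTML."""
    parser = LinkExtractor(filehosts)
    parser.feed(html)
    return parser.links
```

Because the filehost set is just data, appending a new host (requirement 3) is a one-line change to the list file rather than a code change.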
Additional information can be provided later if required.
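The per-run emailing step (requirement 6) can be sketched with the standard library alone. The SMTP host, port, and addresses below are placeholders, since the real deployment would use the client's own mail server and credentials:

```python
import smtplib
from email.message import EmailMessage

def build_results_email(search_string, links, sender, recipient):
    """Builds the end-of-run summary message for one search string."""
    msg = EmailMessage()
    msg["Subject"] = f"Filehost links for: {search_string}"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("\n".join(links) or "No links found this run.")
    return msg

def send_results(msg, host="localhost", port=25):
    # Placeholder SMTP settings; a real run would add TLS and login
    # against the client's mail server before sending.
    with smtplib.SMTP(host, port) as smtp:
        smtp.send_message(msg)
```

One message per search string keeps the results grouped by name, as the requirement asks, and the scheduler (requirement 5) simply invokes the script, which sends these messages at the end of each run.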
Best Regards,
Mahesh