Our host sucks & lost a bunch of data. I need to get it off Google's cache as fast as possible, before it goes away. Only take this project if you can start now.
An informative document is attached explaining how to find and save the articles.
Save all of the pages that are unique.
You don't need to search for obvious duplicate articles; just skip duplicates. If you leave out articles, though, that will result in a poor performance review, so don't do a shoddy job just to get done quickly. It will be obvious whether you have all the articles, because you must put the filename beside each article. If it is a duplicate, write "duplicate"; if you can't find it using the methods I outline in the guide, write "not found".
There are really only about 60 UNIQUE articles in this list; it just looks like more.
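If it helps, the bookkeeping I'm asking for can be sketched like this. This is a rough Python sketch, not the method from the attached guide: the `webcache.googleusercontent.com` cache-URL format and the `article_NNN.html` filename pattern are my assumptions here, not requirements.

```python
from urllib.parse import quote

def cache_url(article_url: str) -> str:
    """Build a Google cache lookup URL for an article (assumed format,
    not taken from the guide)."""
    return ("https://webcache.googleusercontent.com/search?q=cache:"
            + quote(article_url, safe=""))

def label_articles(urls):
    """Return (url, status) pairs: a saved filename for the first copy
    of each URL, 'duplicate' for repeats. A real run would also mark
    'not found' when the cached copy can't be retrieved."""
    seen = {}      # url -> filename already assigned
    results = []
    for u in urls:
        if u in seen:
            results.append((u, "duplicate"))
        else:
            name = f"article_{len(seen) + 1:03d}.html"  # hypothetical naming
            seen[u] = name
            results.append((u, name))
    return results
```

So each line of the deliverable pairs the article with either its saved filename, "duplicate", or "not found".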
$30, unless you tell me there is some real, hidden, time-consuming difficulty I didn't see.
$40 if you get it to me in 3 hours or less
Sorry, I can't wait 2 days; you must start immediately if you accept. I'm sure you understand, since our valuable website files are already disappearing from Google's cache. I'm trying to hire people to get them ASAP.