No jobs found
Sorry, we could not find the job you were looking for.
Find the latest jobs here:
Long-Term Work with 2 Helpers
Hello, I have a list of URLs; you need to visit each URL and save a screenshot of the web page using the "Easy Screenshot" Firefox add-on. About 540 URLs/screenshots in total. Paying $15. I need it done in 24 hours. Please bid $5/hour for 3 hours for me to consider your bid. Thanks.
LUX Shopify Design install
I have an idea and a design for an application. I need an experienced developer who can build this app within a short period of time. Please bid only if you have experience, as it is quite a complex app. More details will be provided privately!
(ongoing work) Writing articles on various topics
I write good-quality articles of at least 350 words, free of plagiarism and grammatical mistakes. I am proficient in both U.S. and British English. I assure you of high-quality content within your stipulated time. My rates are negotiable and very flexible. I have also helped many internet marketers generate a full-time income with my unique services. It would be my p...
Using the Google-provided Go code and the Gmail API, I want a system hosted on Heroku that collects Gmail messages from three separate accounts (credentials in a shared database), stashes the messages in another table in the shared DB, and "archives" them on the Gmail side. It will also need to check the database for any messages that need to be sent out, and will send tho...
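The collect-then-archive step described above can be sketched as follows. This is a minimal, hypothetical outline, not the poster's actual system: the `Store` interface and `Message` struct stand in for the shared database and for whatever the real Gmail client returns. In the real Gmail API, "archiving" corresponds to removing the `INBOX` label via `users.messages.modify`; here `collect` just returns the IDs that would be passed to that call.

```go
package main

import "fmt"

// Message is a minimal stand-in for a fetched Gmail message (hypothetical shape).
type Message struct {
	ID      string
	Account string // which of the three accounts it came from
	Raw     string
}

// Store abstracts the shared database the posting describes (hypothetical interface).
type Store interface {
	Save(m Message) error
}

// MemStore is an in-memory Store, used here only to make the sketch runnable.
type MemStore struct {
	Saved []Message
}

func (s *MemStore) Save(m Message) error {
	s.Saved = append(s.Saved, m)
	return nil
}

// collect stashes fetched messages in the store and returns the IDs that are
// now safe to archive on the Gmail side. A message that fails to save is
// skipped, so it stays in the inbox and is retried on the next poll.
func collect(store Store, fetched []Message) []string {
	var toArchive []string
	for _, m := range fetched {
		if err := store.Save(m); err != nil {
			continue
		}
		toArchive = append(toArchive, m.ID)
	}
	return toArchive
}

func main() {
	store := &MemStore{}
	fetched := []Message{
		{ID: "a1", Account: "acct1"},
		{ID: "b2", Account: "acct2"},
	}
	ids := collect(store, fetched)
	fmt.Println(ids)              // [a1 b2]
	fmt.Println(len(store.Saved)) // 2
}
```

Keeping "save to DB" and "archive on Gmail" as two separate steps, with archiving gated on a successful save, means a crash between them duplicates a message at worst rather than losing one.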
I need a simple logo in a short time, and I need your best work.
To review, edit, and publish an article with the attached references.
I need a web scraper written for the following URL: [login to view URL]. The "Load Board" box in the middle of the page will need to be clicked. Information on the MergedFile will need to be scraped. The number of pages will vary; all pages will need to be scraped. The output should be a pipe (|) delimited file with the following column mappings: origin_city --> data located...
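The output step of a job like this can be sketched with Go's standard `encoding/csv` package, which supports an arbitrary separator via its `Comma` field. The column names below (`origin_city`, `destination_city`) follow the mapping hinted at in the posting; the sample row values are invented placeholders, and the scraping itself is out of scope here.

```go
package main

import (
	"bytes"
	"encoding/csv"
	"fmt"
)

// writePipeDelimited renders a header plus rows as pipe-delimited text,
// reusing the standard csv writer with '|' as the field separator so that
// quoting of embedded separators is handled for free.
func writePipeDelimited(header []string, rows [][]string) (string, error) {
	var buf bytes.Buffer
	w := csv.NewWriter(&buf)
	w.Comma = '|'
	if err := w.Write(header); err != nil {
		return "", err
	}
	// WriteAll flushes the writer when it finishes.
	if err := w.WriteAll(rows); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	out, err := writePipeDelimited(
		[]string{"origin_city", "destination_city"},
		[][]string{{"Dallas", "Memphis"}}, // placeholder scraped row
	)
	if err != nil {
		panic(err)
	}
	fmt.Print(out)
	// origin_city|destination_city
	// Dallas|Memphis
}
```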