...point subdomains to horrible URLs with port numbers. I have a box I use to run a Plex server, but it has no Apache, nginx, LiteSpeed, or any other web server software, so the URLs are unmanageable, for example: [log in to view URL] (this is fake). I have a VPS that can be used with Virtualmin/Webmin to set up DNS for the domains and subdomains.
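What this posting describes is a reverse proxy. A minimal sketch, assuming nginx ends up installed on a host that can reach the Plex box; the subdomain plex.example.com is hypothetical, and 32400 is Plex's default port:

```nginx
# Hypothetical subdomain; a DNS A record for plex.example.com
# (set up via Virtualmin/Webmin on the VPS) points at this host.
server {
    listen 80;
    server_name plex.example.com;

    location / {
        # Forward the clean URL to the Plex port so visitors never see :32400.
        proxy_pass http://127.0.0.1:32400;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

If the VPS only serves DNS and Plex runs elsewhere, the proxy itself still has to run somewhere that can reach the Plex machine; in that case `proxy_pass` would target the Plex box's address instead of 127.0.0.1.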
[log in to view URL] [log in to view URL] Columns needed: school name, full address, city name (in a separate column), phone number, website URL, and an email address column for the 1st URL. I need the file today. These are easy sites with no page limits or blocking.
Get headless Chrome (with ChromeDriver, Selenium, Python) to automatically continue through iframe redirect URLs. Currently headless Chrome stops at a page if there is an iframe; this may be normal behavior. I want an option where it takes the first iframe src on a page (if one exists) and continues on to that URL.
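A sketch of the iframe-following loop the posting asks for. The iframe extraction is pure stdlib; `follow_iframes` only calls `driver.get` and reads `driver.page_source`, which any Selenium WebDriver provides, so the loop is shown without depending on Selenium being installed. The URLs and hop limit are illustrative assumptions:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class _IframeFinder(HTMLParser):
    """Remember the src of the first <iframe> tag encountered."""
    def __init__(self):
        super().__init__()
        self.src = None
    def handle_starttag(self, tag, attrs):
        if tag == "iframe" and self.src is None:
            self.src = dict(attrs).get("src")

def first_iframe_src(html):
    finder = _IframeFinder()
    finder.feed(html)
    return finder.src

def follow_iframes(driver, url, max_hops=5):
    """Load url, then keep descending into the first iframe's src
    (resolved against the current URL) until no iframe remains."""
    driver.get(url)
    for _ in range(max_hops):
        src = first_iframe_src(driver.page_source)
        if not src:
            break
        url = urljoin(url, src)
        driver.get(url)
    return url

# With real Selenium this would be something like:
#   driver = webdriver.Chrome(options=headless_options)
#   final_url = follow_iframes(driver, start_url)
```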
Hi, my website does not have SEO-friendly URLs; it was built in PHP by another developer who does not know how to do this. So I need someone to correct both the product and search pages to be SEO friendly. If you have more questions, let me know. To prove you read the description, include the code 1023. Here is a sample of the product page: http://indiceimoveis
I have a spreadsheet with 12,058 names on it and 96,464 [log in to view URL] URLs that take you to search pages. You will create a spreadsheet in which each URL is replaced with the number of search results that appear on that URL's page. For example, cell B2's URL is [log in to view URL]; the number of search results on that page is 1838...
You do not need to write any tags. The tags are already written for these webpages. I will send you a list of 400 URLs in Excel, and what I need from you is to send me back an Excel sheet with the title and meta tag for each of the URLs.
...automatically detect and remove URLs from the meta description (before it gets generated). There is one file ([log in to view URL]) that is responsible for generating the meta description. You would just need to create a PHP function that detects URLs such as this one: [log in to view URL] In the
Hello, I have a list of image URLs in an Excel file. I would like a macro file that I can reuse to download the images, then put the path the image was downloaded to in the next column, and in the next cell the path of the renamed image. Please contact me to discuss it.
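The posting asks for an Excel macro; as a language-neutral sketch of the same loop, here is the download-and-rename step in Python, using a CSV file as a stand-in for the Excel sheet. The `fetch` parameter is an assumption of this sketch (injected so the HTTP call can be replaced); the `image_NNNN` rename scheme is also a guess, since the posting does not say how the renaming should work:

```python
import csv
import os
from urllib.parse import urlparse
from urllib.request import urlopen

def download_images(csv_in, csv_out, out_dir, fetch=None):
    """For each image URL in column A: download it, record the path it
    was saved to, rename it, and record the renamed path, writing
    [url, downloaded_path, renamed_path] rows to csv_out."""
    fetch = fetch or (lambda url: urlopen(url).read())
    os.makedirs(out_dir, exist_ok=True)
    with open(csv_in, newline="") as f_in, open(csv_out, "w", newline="") as f_out:
        writer = csv.writer(f_out)
        for i, row in enumerate(csv.reader(f_in)):
            url = row[0]
            name = os.path.basename(urlparse(url).path) or f"image_{i}"
            path = os.path.join(out_dir, name)
            with open(path, "wb") as img:
                img.write(fetch(url))
            # Hypothetical rename scheme: sequential numbering, same extension.
            renamed = os.path.join(out_dir, f"image_{i:04d}{os.path.splitext(name)[1]}")
            os.replace(path, renamed)
            writer.writerow([url, path, renamed])
```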
I have a simple PHP crawler for a single URL; it crawls and saves records into a DB. We now need a freelancer with PHP crawling skills to update the source code to crawl multiple URLs.
We have a list of 4,000 companies (names, URLs) that contains several duplicates. Your task: find the duplicates, group them next to each other, and remove the non-duplicates.
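One way to sketch this grouping in Python: key each company by its bare domain (falling back to a squashed lowercase name when no URL is given), keep only keys that occur more than once, and sort so duplicates sit adjacent. The keying rule is an assumption; real data may need fuzzier matching:

```python
import re
from collections import Counter
from urllib.parse import urlparse

def dedupe_key(name, url):
    """Company key: the bare domain when a URL is given
    ('http://www.acme.com' -> 'acme.com'), else a squashed
    lowercase name ('Acme, Inc.' -> 'acmeinc')."""
    host = urlparse(url if "//" in url else "//" + url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return host or re.sub(r"[^a-z0-9]", "", name.lower())

def group_duplicates(companies):
    """Return only entries whose key occurs more than once, sorted so
    duplicates sit next to each other; unique entries are dropped,
    as the posting requests."""
    keyed = [(dedupe_key(n, u), (n, u)) for n, u in companies]
    counts = Counter(k for k, _ in keyed)
    dups = [(k, c) for k, c in keyed if counts[k] > 1]
    dups.sort(key=lambda kc: kc[0])
    return [c for _, c in dups]
```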
...remove URLs from the meta tag? I know this is a problem with the vBulletin engine, but I've already asked them about sorting this issue out and they said it was not something they would handle. By the way, the two PHP files responsible for the meta description tags are included, and they can be seen here: [log in to view URL]
I need a PHP crawler for multiple URLs. I need a PHP expert with good knowledge of nested loops and crawling URLs, and I need this on a LOW budget.
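The posting wants PHP; as a language-neutral sketch of the single-URL-to-multiple-URLs change, here is the loop in Python with SQLite standing in for the database. The schema and the injectable `fetch` are assumptions of this sketch:

```python
import sqlite3
from urllib.request import urlopen

def crawl(urls, db_path="crawl.db", fetch=None):
    """Fetch each URL in turn and save a (url, html) row per page;
    this is the multiple-URL version of a single-URL crawl-and-save."""
    fetch = fetch or (lambda u: urlopen(u).read().decode("utf-8", "replace"))
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT)")
    for url in urls:
        try:
            html = fetch(url)
        except OSError:
            continue  # skip unreachable URLs rather than abort the run
        conn.execute("INSERT OR REPLACE INTO pages VALUES (?, ?)", (url, html))
    conn.commit()
    conn.close()
```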
...rewrite, in the PHP files or in .htaccess, the search results URLs to get an SEO-friendly URL structure on [log in to view URL] Example: instead of (before click) [log in to view URL]=1&data_location_1[country]=189 and after click [log in to view URL], I would prefer nice, friendly URLs (both before and after users click on the link
... Under the box will be a button called "Get Title Tag Links". The box will allow up to 5,000 URLs to be entered. When someone enters URLs and clicks the button, the tool will go to each listed URL, grab its title tag, and make the title tag the link to the page. See the attachment. I would like this tool done
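A sketch of the title-grabbing core of such a tool, in Python. The injectable `fetch` and the plain HTML anchor output format are assumptions of this sketch (the posting's attachment, which presumably specifies the exact format, is not available):

```python
import re
from urllib.request import urlopen

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.I | re.S)

def title_of(html):
    """Return the page's <title> text, or None if there is no title tag."""
    m = TITLE_RE.search(html)
    return m.group(1).strip() if m else None

def title_links(urls, fetch=None):
    """For each URL, fetch the page and return an anchor whose text is
    the page's title tag (falling back to the URL itself)."""
    fetch = fetch or (lambda u: urlopen(u).read().decode("utf-8", "replace"))
    links = []
    for url in urls:
        title = title_of(fetch(url)) or url
        links.append(f'<a href="{url}">{title}</a>')
    return links
```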
Got a couple of postback URLs. We need to insert their parameters into a MySQL database when a hit comes in.
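The core of a postback handler is parsing the query string and inserting one row per parameter. A minimal sketch in Python, with SQLite standing in for MySQL and a hypothetical `hits` schema (the real parameter names depend on the tracking platform):

```python
import sqlite3
from urllib.parse import parse_qsl, urlparse

def record_postback(raw_url, conn):
    """Store each query-string parameter of a postback hit as its own
    (url, param, value) row."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS hits (url TEXT, param TEXT, value TEXT)")
    for key, value in parse_qsl(urlparse(raw_url).query):
        conn.execute("INSERT INTO hits VALUES (?, ?, ?)", (raw_url, key, value))
    conn.commit()
```

In production this would sit behind the web server's request handler, reading the query string from the incoming request instead of a raw URL.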
Looking for someone to rewrite the search results URLs to get an SEO-friendly URL structure. Example: instead of (before click) [log in to view URL]=1&data_location_1[country]=189 and after click [log in to view URL], I would prefer nice, friendly URLs (both before and after users click on the link.) matrimo.
Looking for someone to rewrite some URLs. Example: instead of (before click) [log in to view URL]=1&data_location_1[country]=189 and after click [log in to view URL], I would prefer nice, friendly URLs (both before and after users click on the link.) [log in to view URL] Another example: Instead
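The usual tool for this kind of rewrite (in the .htaccess variant) is Apache's mod_rewrite. A sketch under heavy assumptions: the `data_location_1[country]` parameter comes from the postings' example, but the script name (`index.php` here) is a guess since the real URLs are behind a login wall, and the pretty path `/members/country-189` is hypothetical:

```apache
# Hypothetical: serve /members/country-189 from the original
# query-string URL without exposing it to visitors.
RewriteEngine On
RewriteRule ^members/country-([0-9]+)/?$ index.php?page=1&data_location_1[country]=$1 [L,QSA]
```

Note that mod_rewrite can only pass the URL segment through; turning numeric IDs like 189 into readable slugs like "united-states" needs a lookup table in the PHP application itself.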
...updated and sent to search engines. The correct structure of the site is index/category/product. Redirect all other URLs to this structure, not product?=tags, product?=search, index/product, or anything like that. Only SEO-friendly URLs with the structure index/category/product; the rest redirected and canonicalized. The same in the sitemap: only the URL
10-12 hours: create a copy of the customer's website on your server, then debug and fix the PrestaShop functionality. 1) All pages working with SSL, and 2) friendly URLs working. Item 1 is the first priority and item 2 the second.
Hi, I have a PrestaShop instance at 1.5.5. The "friendly URLs" feature will not work, and I need all pages to redirect to https://. There is a valid cert in place. I need someone to sort these two issues out. Please show experience and credentials. I am happy to agree an hourly rate. I am an IT professional but just do not have
Urgent work: install and slightly customize 4 of these theme landing pages: [log in to view URL] URGENT WORK: must be done in 4 hours MAX.
Hi~ I am new to this platform. I want to scrape more than 2-3M URLs. I have already written a script that only runs on macOS and uses AppleScript; this is an essential condition. Currently a single machine can only run a single script. So this project is for you if you are an AppleScript expert or you have multiple macOS machines.
I will provide you a list of URLs, all from the same site, a popular classified-ads website from the USA. I need the phone numbers from those URLs extracted to Excel. This must be done with a script, no manual work please; the site has some security, so only experts are required. I have attached the list of URLs from which I want to extract the phone numbers.
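The extraction step of such a script can be sketched with a regex. This is a loose North-American pattern, an assumption based on the posting saying the site is from the USA; page fetching (and whatever the site's "security" requires) is a separate concern:

```python
import re

# Loose NANP-style phone pattern: (555) 123-4567, 555-123-4567, 555.123.4567.
PHONE_RE = re.compile(r"\(?\b\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b")

def extract_phones(html):
    """Return the unique phone-number-looking strings found in a page."""
    return sorted(set(m.group(0) for m in PHONE_RE.finditer(html)))
```

Each page's results would then be written out one row per listing, e.g. with the csv module, for import into Excel.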
... Overview: given a list of URLs in a txt file, the script will go to each URL and scrape certain information from those pages. (I will use ScrapeBox to get the list of URLs.) The script must have the following features: 1. Allow me to enter multiple footprints to scrape multiple items from the URL. 2. Enter the scraped data into a CSV file th...
It's a very simple task: 1. Visit the URL. 2. Copy the IP address and result process number to a Google spreadsheet. Payment is $0.001 per URL (10 seconds to fill), which is $0.10 per 100 rows and results. This is the payment our client makes and we cannot increase it.
Hi, I need an SEO gun to restore old pages/URLs from an old website to the new WordPress website. Using content found on the Wayback Machine, the content is to be copied across page by page. URLs are to be restored EXACTLY as the permalinks used to be. Tools such as Ahrefs and Search Console are to be used to identify 404 pages and restore lost links.
This is a small project to find fewer than 100 specified product pages with hidden URLs matching my description. I know the form of the URL, but we need to find the right range where these pages are, so you'll need software or programming skills to help. Thank you.
I will provide you with a list of websites I want you to scrape the business emails from. I then want you to create an Excel document with the business name in one column and the email address in the other. The total number of websites is around 6,000.
...contains a list of valid URLs, one URL per line) and detect whether each site is an e-commerce shop by trying to detect the existence of a shopping cart (this will require manually testing some addresses, so as to build up tests, using regexes or any other technique, that discover this as accurately as possible; you will be provided with a starting list of URLs). The output
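The cart-detection heuristic the posting describes can be sketched as a set of regex signals run against each page's HTML. These particular patterns are illustrative assumptions; the posting itself says the signal set should be refined by manually testing real addresses:

```python
import re

# Signals commonly found on e-commerce pages; a heuristic, not exhaustive.
CART_PATTERNS = [
    r"add[\s_-]*to[\s_-]*(cart|basket|bag)",
    r"shopping[\s_-]*(cart|basket|bag)",
    r"checkout",
    r'name=["\']?add-to-cart',     # e.g. WooCommerce-style form fields
]
CART_RE = re.compile("|".join(CART_PATTERNS), re.I)

def looks_like_shop(html):
    """True if the page contains any shopping-cart signal."""
    return bool(CART_RE.search(html))
```

Run over the URL list, the output would be one (url, is_shop) row per line, with misclassified pages feeding new patterns back into `CART_PATTERNS`.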
On the web application in question there are a few URIs that are slow when you perform a GET request on them and can take up to 50 seconds to load. The most critical one is the landing page; the rest will be disclosed once you bid on the project.
...collect, label, and organize data in a particular way for a given list of URLs. This project will NOT be paid hourly or at a flat rate. Instead it will be paid at a "piece rate", based on your productivity. This means if you are good you can make more money in an hour than someone who is slow. Pay will be based on how many data points you record within
...create an Excel spreadsheet listing 4,000 websites of restaurants and medium-sized businesses in the Boston and greater Boston area. I want the website, the email address, and the company name in separate columns. I am compiling all of these emails for a mail merge. The skills required are: data processing, data entry, Excel, web scraping.
I have over 100 WordPress sites (URLs) that need to be launched. This includes registration and content population for each site, theme setup, and basically monitoring each one over the initial launch period. I seek a qualified, affordable English speaker who can walk me through this for the first ten sites. This will be done over a month or two, and we'll go from there for the rest. Please bid you...
...DETAILS OF THE JOB: I will provide you a list of article titles, such as "Top 20 Tactical Belts In 2018" and "Top 20 Mens Ripped Jeans In 2018", and you are to go online and find the very best products for each article that are selling on well-known online sites (Amazon, Nordstrom, Macy's) and compile them into a list. It is CRUCIAL that you have a strong understand...
I want to be able to import products from Amazon into my Mercado Libre store, and the attributes of each item must be carried over. It must work via bulk uploads of at least 5,000 products. Price and stock must be monitored daily.
Hello, I need someone to write me a Python script which will read an Excel file (with URLs in it), try each URL one by one and, if the page loads, say 'its working'; if not, say 'not working', and save this output in a new Excel file next to the URL. Let me know if you need more clarification.
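A minimal sketch of this checker, using CSV files as a stand-in for Excel (a real version might use openpyxl for .xlsx). The injectable `fetch` and the 10-second timeout are assumptions of this sketch; the 'its working' / 'not working' strings are the posting's own:

```python
import csv
from urllib.error import URLError
from urllib.request import urlopen

def check_urls(csv_in, csv_out, fetch=None):
    """Read URLs from the first column of csv_in, try to load each page,
    and write 'its working' / 'not working' next to the URL in csv_out."""
    def default_fetch(url):
        with urlopen(url, timeout=10) as resp:
            return resp.status < 400
    fetch = fetch or default_fetch
    with open(csv_in, newline="") as f_in, open(csv_out, "w", newline="") as f_out:
        writer = csv.writer(f_out)
        for row in csv.reader(f_in):
            url = row[0]
            try:
                ok = fetch(url)
            except (URLError, OSError, ValueError):
                ok = False  # DNS failure, timeout, malformed URL, ...
            writer.writerow([url, "its working" if ok else "not working"])
```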
It's about the domain [log in to view URL]
Step 1: The user clicks on my domain: [log in to view URL] Step 2: The user is redirected to [log in to view URL] Step 3: The user is further redirected to [log in to view URL] Here I want the second URL to load properly, but it must not be visible to anyone. Hint: this script will be written in PHP through iframing, but the iframe tag must not be used in th...