...provide IT professionals with an automated job finding & application service. The tool will run on a Windows client and store the results of spidering multiple job sites (websites with jobs) in a database that allows offline searching, filtering & matching against a candidate's skill set. Jobs identified as good matches (job title/location/rate/skills)
...[log in to see the URL] [log in to see the URL] [log in to see the URL] * Hotels with data from: [log in to see the URL] [log in to see the URL] [log in to see the URL] (we can probably supply an XML feed) [log in to see the URL] [log in to see the URL] [log in to see...
...ERP and CRM functions, barcode generation, interfacing with accounting software, website security, customer data compiling, email server, etc. The coder should use PHP, XML, and MySQL to construct this site. This B2B site allows employees of corporate clients to order company-logo promotional items by accessing [[log in to see the URL] name]
...technology includes **PHP**, **ASP**, JSP, CGI, SOAP, API, **XML, SQL, MySQL** based on Linux and Windows OS. You will be a lead developer, using these and other tools and your experience to build web-based applications for public consumption. You should embrace **PHP**, **ASP**, JSP, CGI, SOAP, API, **XML, SQL, MySQL** and have some experience with them. We think
...information from a number of websites and aggregate the information. For example: obtaining newsworthy information by spidering a site and listing the headers so that on my site the user can click through to the original site. There needs to be some control over timing. Additionally, some sites will provide an RSS data feed or XML feed, which would need
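The header-listing step described above could be sketched as follows. This is a minimal illustration using Python's standard-library HTML parser; the sample HTML and the choice of h1/h2 as "headers" are assumptions, not part of the original posting.

```python
# Sketch: extract headline-style headers from a fetched page so they can be
# listed with click-through links back to the original site.
from html.parser import HTMLParser

class HeadlineParser(HTMLParser):
    """Collects the text of <h1>/<h2> elements from an HTML document."""
    def __init__(self):
        super().__init__()
        self._in_header = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2"):
            self._in_header = True
            self.headlines.append("")

    def handle_endtag(self, tag):
        if tag in ("h1", "h2"):
            self._in_header = False

    def handle_data(self, data):
        if self._in_header:
            self.headlines[-1] += data.strip()

# Sample page content standing in for a spidered site.
sample = "<html><body><h1>Top Story</h1><p>...</p><h2>Second Story</h2></body></html>"
parser = HeadlineParser()
parser.feed(sample)
print(parser.headlines)  # headlines to list, each linking back to the source
```

Each collected headline would then be rendered on the aggregating site as a link to the page it was scraped from.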
...will be built around a live XML feed (provided by a third party) of leagues, teams, players and scorers in the English and Scottish divisions. We need an additional freelance/contract programmer to join the team and help us spec out and then build the site using Java, MySQL, XML, and Apache. The site back-end will need to take the XML feed, extract and store
...of the popup must be adjustable - able to show an exit popup when leaving a specific site. - the number of URLs for the popup and exit popup must be adjustable. - remember the websites most visited by the user and put the top 5 (or more, i.e. adjustable) in a dropdown menu so the user can visit them quickly and simply by clicking on the website and
...our domain, [log in to see the URL] This will be a similar freelance website..... except geared towards highly skilled programmers who have experience cloning other websites. It should let the programmers keep a portfolio of cloned sites on the sister site, [log in to see the URL], which you will create as well. A rate-it script will also be found on clone...
...necessary regardless of country. Skill Set: 1. Web Design: Must be able to reproduce a high-end "look and feel" design and include existing content for ALL websites, similar to the look and feel of http://www.webmasterempire.com. A good example of what we are looking for can be found at: [log in to see the URL] (this is NOT our web
...configuration file. It could have a GUI or it could be a simple text file / XML file. In function 2, the source should be read as a browser does it, i.e. no 'new lines' / everything is read as one line. So basically what the program should do is get a list of all websites in the world (at least the ones that are interesting) and then check if they
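The "read the source as one line" requirement above amounts to collapsing insignificant whitespace the way a browser does before rendering. A minimal sketch (the sample markup is made up for illustration):

```python
# Sketch: collapse a page's source into "one line" by treating all runs of
# whitespace - including newlines - as a single space, so pattern checks are
# not broken by line wraps in the raw HTML.
import re

def normalize_source(html: str) -> str:
    """Collapse all whitespace runs (including newlines) to single spaces."""
    return re.sub(r"\s+", " ", html).strip()

page = "<div>\n  <span>hello\nworld</span>\n</div>"
print(normalize_source(page))  # <div> <span>hello world</span> </div>
```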
...design (basic design has been done) and create a centralized CMS from scratch. Our customers will be able to modify their websites, maintain their customer profile, view website statistics, etc. We will be able to add/modify/delete websites and customers. See below for a full feature list. The CMS must be able to interact with FreeBSD 5.1 or Red Hat Enterprise
We get data from XML feeds from several other websites and insert it into our own site's database. However, one of these feeds no longer works perfectly: some HTML tags are incomplete or broken off. We need someone to fix that. A fast and simple job.
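One pragmatic way to handle a feed whose HTML tags arrive incomplete, sketched below, is to strip markup from the affected text fields before inserting them into the database. This is only an illustration under assumed sample data, not the posting's actual feed format.

```python
# Sketch: clean a feed text field that may contain complete tags as well as
# a tag cut off mid-stream (e.g. "<a href=" with no closing ">").
import re

def strip_broken_html(text: str) -> str:
    """Remove complete tags, then any dangling '<...' fragment at the end."""
    text = re.sub(r"<[^>]*>", "", text)   # drop well-formed tags
    text = re.sub(r"<[^>]*$", "", text)   # drop a tag broken off mid-stream
    return text.strip()

print(strip_broken_html("A <b>good</b> offer <a href="))  # A good offer
```

Whether stripping or repairing the tags is the right fix depends on whether the markup needs to survive into the destination database.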
Forrest is an XML system based on Apache Cocoon and is described at the website: [log in to see the URL] I wish to have a suite of VB6 routines that enable me to automate the creation, content management, security and maintenance of websites based on Forrest. The process involved would be a sequence of commands in a 'script' file. This will
We need XML code to be written to have data from our website, [[log in to see the URL]], posted on 2 other websites. Our data is on SQL Server 2000 and our site is written in ASP. The required information from the 1st receiving site can be found at <[log in to see the URL]>. The second receiving site will use the same
This project is to scrape several websites and to provide an output file in xml format. I would ideally like the scraper to be written in Perl. ## Deliverables 1) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done. 2) Installation package that will install the software (in ready-to-run
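The scrape-then-emit-XML pipeline described above could produce its output file along these lines. The posting asks for Perl; this sketch uses Python purely for illustration, and the record fields (title, url, price) are placeholders for whatever the target sites actually expose.

```python
# Sketch: serialise scraped records into an XML output document using the
# standard library, one <item> element per scraped record.
import xml.etree.ElementTree as ET

def records_to_xml(records):
    """Build an <items> document from a list of field-name -> value dicts."""
    root = ET.Element("items")
    for rec in records:
        item = ET.SubElement(root, "item")
        for field, value in rec.items():
            ET.SubElement(item, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Placeholder records standing in for real scraper output.
scraped = [{"title": "Widget", "url": "http://example.com/w", "price": "9.99"}]
print(records_to_xml(scraped))
```

In the deliverable, the returned string would be written to the output file rather than printed.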
...It has two sections (affiliates & partners). In the affiliates section, we have websites to whom we give HTML results. Its HTML section works pretty well and searches can be performed within a few seconds. In the partner section, we want to give results to other search engines via an XML feed. All required admin scripts for the partner section are complete but the partner
...website data and graphically plotting statistics that show each operator's logged time and status on the phones throughout each day. 2. Running the front-end of multiple websites and actively changing the sites' listings based on the results of the polling process. 3. Providing a web-based interface for our operators to change their call status online
... scraping several websites for their taxonomies, downloading the taxonomies into a graphical tool (boxes and connections) that can easily change the hierarchical relationship between nodes (boxes), and then downloading the final taxonomy into an XML file. 2. scraping the same websites for their taxonomies, uploading the XML file of the final
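The final download step above - turning the edited node hierarchy into an XML file - could be sketched as a recursive serialisation. The category names here are made up for illustration; the real node data would come from the graphical tool.

```python
# Sketch: serialise a taxonomy (a tree of category nodes) into XML, one
# <node> element per box, with nesting expressing the hierarchy.
import xml.etree.ElementTree as ET

def taxonomy_to_xml(name, children):
    """children is a list of (name, children) pairs forming the hierarchy."""
    node = ET.Element("node", name=name)
    for child_name, grandchildren in children:
        node.append(taxonomy_to_xml(child_name, grandchildren))
    return node

# Placeholder hierarchy standing in for the tool's edited taxonomy.
tree = taxonomy_to_xml("Electronics", [("Phones", []),
                                       ("Laptops", [("Gaming", [])])])
print(ET.tostring(tree, encoding="unicode"))
```

Re-uploading (step 2) would be the inverse: parse the XML back into the same (name, children) structure.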
Looking for a BHO for IE that will monitor activity on certain pre-determined websites (clicks, or words typed into forms). When a user clicks a link or types a word into a form on a pre-determined website, you will call an XML endpoint on our server that will reply with HTML content to add to the next page at a certain pre-determined location. ##