We would like a program that will spider a website and build a sitemap.
The sitemap must be formatted in Google's new Sitemap XML format.
Details of the format are available at <[url removed, login to view]>.
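For reference, a minimal sitemap in the sitemaps.org protocol format (which Google's sitemap program accepts) looks like the sketch below. The URL and the `lastmod`, `changefreq`, and `priority` values are placeholders, and only `loc` is required per entry; note that older Google-specific sitemap namespaces differ from the one shown:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```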
We want a web interface where the user simply inputs the URL of the site they want to spider.
The program, running on a Linux server, will then spider the site, generate the Google XML sitemap file for it, and allow the user to save the file to their local computer.
The user will then be able to submit the file to Google's sitemap program with no further modification needed.
The program must be able to distinguish internal links from links to other domains, and list only the pages actually located on that domain.
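The requirements above can be sketched roughly as follows. This is a minimal illustration, not the deliverable: it uses only the Python standard library (in keeping with the no-third-party-components clause below), and the function names, the page limit, and the timeout are assumptions. A production version would also need robots.txt handling, redirect normalization, and the web front end.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse, urldefrag
from urllib.request import urlopen
from xml.sax.saxutils import escape


class LinkParser(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def is_internal(base_netloc, url):
    """True only if the absolute URL lives on the starting domain."""
    return urlparse(url).netloc == base_netloc


def crawl(start_url, max_pages=500):
    """Breadth-first spider that keeps only same-domain pages."""
    base = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url, _ = urldefrag(queue.pop(0))  # drop #fragments
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable pages
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if is_internal(base, absolute):  # external links are discarded
                queue.append(absolute)
    return sorted(seen)


def build_sitemap(urls):
    """Render the crawled URL list as a sitemaps.org-format XML document."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )
```

The web interface would call `crawl()` with the submitted URL and serve `build_sitemap()`'s output as a file download.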
1) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done.
2) Deliverables must be installed on our Linux server. We will provide login info.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the coder's Seller Legal Agreement).