Hello happy programmers!
We need a Linux/Perl script with a PHP front end. The purpose of this form is to search our whole website for internal links matching our specification. For example, we want to be able to search our site for links to '[url removed, login to view]' and have it find and display every page on our site that links to that URL.
The form should have the following:
1.) A 'Go' button.
2.) A text box to type in the link/phrase to be found.
3.) Another text box for a comma-delimited list of directories to exclude (backup directories, image directories, etc.).
Please note: We don't want to have to use SSH in order to run this find script! It needs to be web based.
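To make the requirement concrete, here is a minimal sketch of the server-side search the form would drive. It assumes GNU grep on Linux; the function name, the sample needle, and the directory names are hypothetical, not part of the spec — the PHP front end would pass the form's two text-box values into something like this.

```shell
# find_link NEEDLE EXCLUDES ROOT
# Prints every file under ROOT containing the literal string NEEDLE,
# skipping the comma-delimited EXCLUDES directories.
find_link() {
    needle=$1
    excludes=$2
    root=$3
    # Turn "backup,images" into repeated --exclude-dir flags.
    set --
    for dir in $(printf '%s\n' "$excludes" | tr ',' ' '); do
        [ -n "$dir" ] && set -- "$@" --exclude-dir="$dir"
    done
    # -r recurse, -l list matching file names only, -F literal (not regex)
    grep -rlF "$@" -- "$needle" "$root"
}
```

The PHP page would collect the two form fields, shell-escape them (e.g. with `escapeshellarg()`), invoke a search like this, and display the resulting file list — no SSH session required.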
Qualified bidders only. This is a relatively simple project, and we'd like it to be done in a timely manner.
Thanks for your bids in advance, everyone!
Apologies if our use of the word "crawl" has caused any confusion.
This script doesn't need to crawl our website the way Google does, following links from one page to the next. Instead, it can simply search the files located on our server directly, which should be faster.
In other words, this is really a find script; it just happens that we're using it to find links right now. Down the road we might use it to find other strings, such as keywords.
Please retract your bids if you are unable to complete this project in that manner. Thanks!