The purpose is to create a website crawler that identifies external dead links.
It should work on Windows and must not use much RAM (URLs should be stored in MS SQL, not in memory). External dead links should be checked against WHOIS to determine whether the domain is registered and when it expires.
It would be great if this app used multithreading, so that we can set how many threads to use.
We should also be able to choose the domain (the app should also scan subdomains of that domain), the number of threads, and the crawl frequency (pages per second), and receive at the end a report of all dead links, registered or not.
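To illustrate the multithreaded checking part of the requirement, here is a minimal sketch in Python, assuming the page URLs have already been collected (in the real app they would be read from MS SQL rather than held in memory). The thread count is configurable, and the liveness check is passed in as a function, so the HTTP request and the follow-up WHOIS lookup can be plugged in separately. The function names `is_external` and `find_dead_links` are illustrative, not part of any stated design.

```python
# Sketch only: demonstrates subdomain-aware external-link detection and a
# configurable thread pool for checking links concurrently.
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlparse

def is_external(url: str, base_domain: str) -> bool:
    """True if url points outside base_domain and all of its subdomains."""
    host = urlparse(url).hostname or ""
    return not (host == base_domain or host.endswith("." + base_domain))

def find_dead_links(urls, check, threads=8):
    """Run `check(url)` (returns True when the link is alive) across a
    configurable number of worker threads; return the dead URLs."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        alive = list(pool.map(check, urls))
    return [u for u, ok in zip(urls, alive) if not ok]
```

In a full implementation, `check` would issue an HTTP HEAD/GET with a timeout, rate-limited to the chosen pages-per-second; for external links that come back dead, a WHOIS query (e.g. via a WHOIS client library) would report the domain's registration status and expiry date for the final report.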
19 freelancers are bidding on average $152 for this job
Hi, I read your requirements and found that this project suits me well. I have already built such a scraper, and I am confident you will receive exactly the results you need. I look forward to your reply. Thank you.
I love web crawling, and this project interests me. Regarding my crawling and scraping experience, I have crawled sites ranging from Twitter to Google. I am a bit confused about some details of the project, so please initiate a chat to discuss further.