The purpose is to create a website crawler that identifies external dead links.
It should work on Windows and must not use much RAM (URLs should be stored in MS SQL, not in memory). External dead links should be checked against WHOIS to find out whether the domain is registered and, if so, when it expires.
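The WHOIS step described above can be sketched with the standard library alone: a WHOIS lookup is just a plain-text query over TCP port 43 (RFC 3912). The server name, field patterns, and function names below are assumptions for illustration; real registries vary in their reply formats, so the parser only covers common field names.

```python
# Hedged sketch of a WHOIS lookup (RFC 3912): send "domain\r\n" to port 43
# of a WHOIS server and scan the text reply for registration/expiry fields.
import re
import socket


def whois_query(domain, server="whois.iana.org", timeout=10):
    """Raw WHOIS query over TCP port 43; returns the server's text reply.
    (whois.iana.org is a starting point; per-TLD servers give fuller data.)"""
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")


def parse_expiry(whois_text):
    """Pull an expiry date out of a WHOIS reply, or None if the domain
    looks unregistered. Field names vary by registry; these patterns are
    an assumption covering common cases, not an exhaustive list."""
    if re.search(r"(?i)no match|not found|no entries found", whois_text):
        return None  # domain appears unregistered
    m = re.search(
        r"(?i)(?:registry expiry date|expiration date|expires):\s*(\S+)",
        whois_text,
    )
    return m.group(1) if m else None
```

For example, `parse_expiry(whois_query("example.com"))` would return the expiry timestamp if the registry reports one of the recognized fields.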
It would be great if this app used multithreading, so that we can set how many threads we want to use.
We should also be able to choose the domain (the app should also scan that domain's subdomains), the number of threads, and the crawl frequency (pages per second), and get a final report of all dead links, registered or not.
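The requirements above (worker threads, a pages-per-second throttle, URLs persisted in SQL rather than RAM, and a subdomain check) could be wired together roughly as in this sketch. It is a minimal illustration, not the implementation: `sqlite3` stands in for MS SQL Server (a production version would use an ODBC driver), and the status-fetching function is injected so the class itself stays network-free; all class and parameter names are assumptions.

```python
# Hedged sketch: multithreaded dead-link checker that keeps URLs in a SQL
# database instead of RAM. sqlite3 stands in for MS SQL Server here.
import sqlite3
import threading
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urlparse


def is_internal(url, domain):
    """True if url belongs to the target domain or one of its subdomains."""
    host = urlparse(url).hostname or ""
    return host == domain or host.endswith("." + domain)


class DeadLinkChecker:
    def __init__(self, db_path, fetch_status, threads=4, pages_per_second=2.0):
        # fetch_status(url) -> HTTP status code; injected so the crawler
        # logic can be exercised without real network access.
        self.db = sqlite3.connect(db_path, check_same_thread=False)
        self.db_lock = threading.Lock()  # one connection shared by all threads
        self.fetch_status = fetch_status
        self.threads = threads
        self.delay = 1.0 / pages_per_second  # crude pages-per-second throttle
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS links (url TEXT PRIMARY KEY, status INTEGER)"
        )

    def check(self, url):
        time.sleep(self.delay)  # respect the configured crawl rate
        status = self.fetch_status(url)
        with self.db_lock:
            self.db.execute(
                "INSERT OR REPLACE INTO links VALUES (?, ?)", (url, status)
            )
            self.db.commit()

    def run(self, urls):
        # Fan the URL list out over the configured number of worker threads.
        with ThreadPoolExecutor(max_workers=self.threads) as pool:
            list(pool.map(self.check, urls))

    def dead_links(self):
        """Report: every URL whose last recorded status was an error."""
        with self.db_lock:
            rows = self.db.execute(
                "SELECT url FROM links WHERE status >= 400"
            ).fetchall()
        return [r[0] for r in rows]
```

Injecting `fetch_status` also makes the throttle and the report easy to test with a fake fetcher before pointing the crawler at a live site.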
16 freelancers are bidding on average $153 for this job
I love web crawling and this project interests me. As for my crawling and scraping experience, I have crawled everything from Twitter to Google. I am a bit confused about some details of the project; please initiate a chat to discuss more.