nikunjagarwal321/Web-Crawler
Dex-Crawler

What is it?

Dex-Crawler is a simple web crawler that takes a set of initial URLs (the URL seed), crawls every page on the same domain, and reports the dead links and stale links found across the site.
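The description above can be sketched in a few lines. This is an assumed design, not the repository's actual implementation: a breadth-first crawl from the seed URL that stays within the seed's domain and records any link whose fetch fails as dead. The `fetch` callable is a hypothetical parameter standing in for whatever HTTP client the real tool uses.

```python
# Sketch of a Dex-Crawler-style dead-link finder (assumed design; the
# repository does not document its implementation).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, fetch):
    """Breadth-first crawl from `seed`; `fetch(url)` returns the page's
    HTML or raises an exception for a dead link. Returns the dead links."""
    domain = urlparse(seed).netloc
    seen, dead = {seed}, []
    queue = deque([seed])
    while queue:
        url = queue.popleft()
        try:
            html = fetch(url)
        except Exception:
            dead.append(url)           # fetch failed: record as dead
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute not in seen and urlparse(absolute).netloc == domain:
                seen.add(absolute)     # stay within the seed's domain
                queue.append(absolute)
    return dead
```

In the real tool, `fetch` would presumably wrap an HTTP request and raise (or return an error) on 404s and other failures; here it is injected so the crawl logic can be exercised against a fake site.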

How to run?

Use case:

While browsing the SDK Tools documentation at https://developer.amazon.com, developers often land on dead links: links pointing to repositories that may have been moved elsewhere. Dex-Crawler records all such links so they can be fixed or removed.
