To keep the documentation valuable, it's important that it contains no dead links.
It would be interesting to have a tool or web scraper able to detect them.
I'm aware of ahrefs, whose free tier does the job, but this might be a good opportunity to evaluate what else is available. (Maybe a GitHub Action that we could integrate into some automated workflows?)
I'm thinking this shouldn't really be that hard to do. Worst case, we could self-host it on some server (mine, for example) and every night or so scrape the site for all `<a>` tags and check each target's availability. If a site is unreachable, open an issue. (See the sketch below.)
Something free and already complete would of course be better.
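For reference, here's a minimal sketch of what that nightly check could look like, assuming a plain-HTML docs site and the third-party `requests` and `beautifulsoup4` packages; the `DOCS_URL` and timeout are placeholders, and the issue-filing step is left to a cron job or CI wrapper:

```python
import sys
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

# Placeholder: the documentation page to crawl.
DOCS_URL = "https://example.com/docs/"
TIMEOUT = 10  # seconds per request


def collect_links(page_url: str) -> set[str]:
    """Fetch a page and return the absolute URLs of all <a href> targets."""
    response = requests.get(page_url, timeout=TIMEOUT)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return {
        urljoin(page_url, a["href"])
        for a in soup.find_all("a", href=True)
        # Skip in-page anchors and mailto: links.
        if not a["href"].startswith(("#", "mailto:"))
    }


def check_link(url: str) -> bool:
    """Return True if the URL answers with a non-error status code."""
    try:
        # HEAD is cheaper; some servers reject it, so fall back to GET.
        response = requests.head(url, timeout=TIMEOUT, allow_redirects=True)
        if response.status_code >= 400:
            response = requests.get(url, timeout=TIMEOUT)
        return response.status_code < 400
    except requests.RequestException:
        return False


if __name__ == "__main__":
    dead = sorted(url for url in collect_links(DOCS_URL) if not check_link(url))
    for url in dead:
        print(f"DEAD: {url}")
    # A non-zero exit lets the scheduler flag the failure,
    # e.g. by opening an issue via the GitHub API.
    sys.exit(1 if dead else 0)
```

This only checks links on a single page; a real crawler would also follow internal links recursively, but the core check stays the same.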