deadlinkchecker bot

What is deadlinkchecker?

Dead Link Checker is a web crawler tool operated by DLC Websites that scans websites to identify broken or "dead" links. It functions as a specialized auditing crawler designed to help webmasters maintain their site's link integrity. The tool systematically navigates through web pages, checking each hyperlink to verify whether it leads to an accessible destination or returns an error.

When crawling websites, Dead Link Checker identifies itself with one of these user-agent strings:

  • www.deadlinkchecker.com Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36
  • Mozilla/5.0 (compatible; Dead Link Checker; http://www.dead-link-checker.com/)
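If you want to confirm these visits in your own server logs, a minimal Python sketch like the one below can match on those signatures. The log path access.log is a placeholder for wherever your web server writes its access log:

# Minimal sketch: flag requests from Dead Link Checker in a server access log.
# "access.log" is a hypothetical path; substitute your server's actual log file.
SIGNATURES = ("www.deadlinkchecker.com", "Dead Link Checker")

with open("access.log", encoding="utf-8") as log:
    for line in log:
        if any(sig in line for sig in SIGNATURES):
            print(line.rstrip())  # request made by the Dead Link Checker crawler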

The crawler operates by taking a provided URL and scanning it for links to other web pages. It then checks each discovered link to confirm its functionality. For links that exist and belong to the same website as the original URL, the crawler continues scanning those pages for additional links, creating a recursive checking process. The depth of this search depends on the selected scan type—quick scans only check links on the provided page, while full scans can go up to ten levels deep.
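As an illustration only, not the tool's actual implementation, the following Python sketch shows what such a depth-limited, same-site recursion looks like. The requests and BeautifulSoup libraries, the max_depth parameter, and the timeout value are assumptions made for the example:

# Simplified sketch of depth-limited link checking (not Dead Link Checker's own code).
# max_depth=1 roughly matches a quick scan (only links on the start page are checked);
# max_depth=10 roughly matches a full scan.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def check_links(start_url, max_depth=10):
    site = urlparse(start_url).netloc
    seen, broken = set(), []

    def crawl(url, depth):
        if url in seen or depth > max_depth:
            return
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            broken.append(url)  # unreachable link
            return
        if resp.status_code >= 400:
            broken.append(url)  # link returned an error status
            return
        # Only pages on the original site are scanned for further links.
        if urlparse(url).netloc != site:
            return
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if link.startswith(("http://", "https://")):
                crawl(link, depth + 1)

    crawl(start_url, 0)
    return broken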

Dead Link Checker respects robots.txt directives, including both Disallow and Crawl-delay instructions, making it a well-behaved crawler that follows web crawling etiquette.

Why is deadlinkchecker crawling my site?

Dead Link Checker may be crawling your website for several common reasons:

  1. A website owner, webmaster, or SEO professional has initiated a scan of your site using the Dead Link Checker service to identify broken links.

  2. If you've previously used the service yourself and set up automated scanning, it could be performing one of your scheduled checks.

  3. Someone analyzing your site's SEO health might be using the tool to evaluate your site's link integrity.

  4. A third-party service might be using Dead Link Checker to assess websites for their clients.

The crawler typically focuses on finding hyperlinks throughout your site and testing their accessibility. Its crawling frequency depends on how it's being used—it could be a one-time scan or part of regular scheduled maintenance checks. In most cases, these scans are intentionally initiated by someone with a legitimate interest in your site's functionality.

What is the purpose of deadlinkchecker?

Dead Link Checker serves several valuable purposes for website owners and SEO professionals:

  1. Website Maintenance: It helps identify broken links that could damage user experience and site credibility.

  2. SEO Optimization: Broken links can negatively impact search engine rankings. Search engines like Google consider link integrity when evaluating site quality.

  3. User Experience Enhancement: By identifying and fixing dead links, website owners can ensure visitors don't encounter frustrating 404 errors.

  4. Technical Auditing: It provides a systematic way to monitor site health, especially for large websites with numerous internal and external links.

The service offers free manual checking for single and multiple sites, as well as subscription packages for automated link checking. This makes it particularly useful for agencies and webmasters who need to monitor multiple sites regularly.

How do I block deadlinkchecker?

Dead Link Checker respects robots.txt directives, making this the simplest way to control its access to your site. To block it completely, add the following to your robots.txt file:

User-agent: www.deadlinkchecker.com
Disallow: /

You can also use the Crawl-delay directive to limit its crawling rate:

User-agent: www.deadlinkchecker.com
Disallow: /private-directory/
Crawl-delay: 1

The Crawl-delay value represents seconds between requests, helping reduce server load during scans.

If you want to block specific directories while allowing the crawler to check other parts of your site, you can specify particular paths:

User-agent: www.deadlinkchecker.com
Disallow: /admin/
Disallow: /private/
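To confirm that rules like these behave as intended, you can test them against the crawler's user-agent string with Python's standard-library robots.txt parser. This is a minimal sketch; the example.com URLs are hypothetical, and the rules string simply mirrors the examples above:

# Minimal sketch: verify robots.txt rules against the crawler's user-agent string.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: www.deadlinkchecker.com
Disallow: /admin/
Disallow: /private/
Crawl-delay: 1
"""

ua = ("www.deadlinkchecker.com Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
      "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36")

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch(ua, "https://example.com/admin/settings"))  # False: blocked
print(parser.can_fetch(ua, "https://example.com/blog/post"))       # True: still crawlable
print(parser.crawl_delay(ua))                                      # 1 second between requests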

For more aggressive blocking, you can implement IP-based restrictions, as Dead Link Checker operates from specific IP addresses. However, this approach is generally unnecessary because the crawler respects robots.txt rules.
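If you do want a server-side block, one option is to filter on the documented user-agent strings rather than on IP addresses, which are not listed here. Below is a minimal sketch of that approach as a WSGI middleware, assuming a Python-served site; the signature strings come from the user agents shown earlier:

# Minimal sketch: refuse Dead Link Checker requests at the application level
# by matching the documented User-Agent strings (not IP addresses).
BLOCKED_SIGNATURES = ("www.deadlinkchecker.com", "Dead Link Checker")

def block_deadlinkchecker(app):
    """WSGI middleware that returns 403 Forbidden for matching user agents."""
    def wrapper(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if any(sig in ua for sig in BLOCKED_SIGNATURES):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return wrapper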

Keep in mind that blocking Dead Link Checker means you won't benefit from its ability to identify broken links on your site, which could impact your site's user experience and SEO performance over time.

Operated by: DLC Websites
Type: SEO crawler
AI model training: Not used to train AI or LLMs
Acts on behalf of user: Yes, behavior is triggered by a real user action
Obeys directives: Yes, obeys robots.txt rules
User Agent: www.deadlinkchecker.com Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36