AhrefsSiteAudit
What is AhrefsSiteAudit?
AhrefsSiteAudit is a specialized web crawler developed and operated by Ahrefs, a company that provides SEO and marketing intelligence tools. It functions as a diagnostic crawler designed to analyze websites for technical SEO issues and on-page optimization opportunities.
The crawler identifies itself through two distinct user-agent strings:
- Desktop:
Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)
- Mobile:
Mozilla/5.0 (Linux; Android 13) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.5359.128 Mobile Safari/537.36 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)
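If you want to confirm these visits in your own server logs, matching on the AhrefsSiteAudit token in the user-agent field is enough to catch both variants. Below is a minimal Python sketch that assumes a combined-format access log at access.log; the path and parsing are placeholders you would adapt to your own server setup.

import re

# Path to your access log (placeholder; adjust to your server configuration).
LOG_PATH = "access.log"

# Both the desktop and mobile user agents contain an "AhrefsSiteAudit/x.y" token,
# so a single pattern covers them.
UA_PATTERN = re.compile(r"AhrefsSiteAudit/[\d.]+")

def is_site_audit_hit(log_line: str) -> bool:
    """Return True if the log line was produced by AhrefsSiteAudit."""
    return bool(UA_PATTERN.search(log_line))

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    hits = [line for line in log if is_site_audit_hit(line)]

print(f"AhrefsSiteAudit requests found: {len(hits)}")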
Unlike general-purpose crawlers, AhrefsSiteAudit simulates both search engine behavior and actual user interactions. It renders JavaScript, loads assets, and evaluates page performance to provide a comprehensive analysis of how websites function under real-world conditions. This crawler specifically powers Ahrefs' Site Audit tool, which categorizes issues by severity and offers actionable recommendations for improvement.
AhrefsSiteAudit is recognized as a "verified good bot" by Cloudflare, a major web security and performance company, indicating its legitimate and well-behaved crawling practices.
Why is AhrefsSiteAudit crawling my site?
If you're seeing AhrefsSiteAudit in your logs, it means someone is using Ahrefs' Site Audit tool to analyze your website. This could be:
- You or someone on your team who has set up an audit through an Ahrefs account
- An SEO consultant or agency you've hired to evaluate your site
- A competitor researching your site's structure and performance
The crawler is typically looking for technical issues such as broken links, duplicate content, slow-loading pages, improper metadata implementation, and other SEO-related anomalies. The frequency of visits depends on how often the person running the audit schedules it—this could be one-time, weekly, monthly, or custom intervals.
AhrefsSiteAudit's crawling is authorized when it is initiated by the site owner or someone working on their behalf, but it may be unsolicited when a third party runs the audit. Either way, the crawler respects standard web crawling protocols such as robots.txt by default.
What is the purpose of AhrefsSiteAudit?
AhrefsSiteAudit's primary purpose is to power Ahrefs' Site Audit tool, which helps website owners identify and fix technical and on-page SEO issues. The crawler examines websites for problems such as:
- Broken links and 404 errors
- Slow performance and page loading issues
- Security misconfigurations
- Duplicate content
- Improper metadata implementation
- Mobile responsiveness issues
- Crawlability and indexation problems
The data collected is used to generate detailed reports that categorize issues by severity and provide specific recommendations for improvement. This helps site owners optimize their websites for better search engine visibility, user experience, and overall performance.
For website owners, this crawling provides significant value by identifying issues that might otherwise go unnoticed but could negatively impact search rankings and user experience.
How do I block AhrefsSiteAudit?
AhrefsSiteAudit respects robots.txt directives by default. To block it from your entire site, add these lines to your robots.txt file:
User-agent: AhrefsSiteAudit
Disallow: /
To block it from specific sections only:
User-agent: AhrefsSiteAudit
Disallow: /private-section/
Disallow: /development/
Allow: /
To control crawl rate rather than blocking completely, you can use the Crawl-delay directive:
User-agent: AhrefsSiteAudit
Crawl-delay: 10
This asks the crawler to wait 10 seconds between consecutive requests, reducing load on your server.
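Before deploying any of these rules, you can sanity-check them locally. The sketch below uses Python's standard urllib.robotparser to confirm which paths would be blocked for AhrefsSiteAudit and what crawl delay it would read; the rules are copied from the snippets above and the example.com URLs are placeholders, so substitute your own robots.txt content.

from urllib import robotparser

# Placeholder rules taken from the examples above; paste your own here,
# or call parser.set_url("https://example.com/robots.txt") and parser.read()
# to test the live file instead.
RULES = """\
User-agent: AhrefsSiteAudit
Disallow: /private-section/
Disallow: /development/
Crawl-delay: 10
"""

parser = robotparser.RobotFileParser()
parser.parse(RULES.splitlines())

UA = "AhrefsSiteAudit"
for url in ("https://example.com/", "https://example.com/private-section/report"):
    verdict = "allowed" if parser.can_fetch(UA, url) else "blocked"
    print(f"{url} -> {verdict} for {UA}")

# crawl_delay() returns the Crawl-delay value declared for this group, or None.
print("Crawl-delay:", parser.crawl_delay(UA))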
Some hosted platforms, such as Shopify, block AhrefsSiteAudit in their default robots.txt, so you may need to modify your robots.txt to allow it if you want to run the Site Audit tool on your own site.
It's worth noting that blocking AhrefsSiteAudit will prevent anyone (including yourself) from using Ahrefs' Site Audit tool to analyze your site. This means you might miss out on valuable insights about technical issues affecting your site's performance and search visibility. If you're concerned about server load, consider using the Crawl-delay directive rather than blocking the bot entirely.
If you're a verified site owner in Ahrefs, you can also adjust crawling speeds within the Site Audit tool settings to balance between quick results and server load.
Operated by: Ahrefs
Type: SEO crawler
Documentation: http://ahrefs.com/robot/site-audit
Acts on behalf of user: Yes
Obeys directives: Yes
User Agent: Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)