Count total pages, detect robots.txt status, and analyze your site structure in one go.
Upload .xml or .txt sitemap
A sitemap.xml file is the roadmap for search engine crawlers. If your sitemap is bloated, missing important pages, or incorrectly referenced in robots.txt, Google may struggle to index your site properly.
Our Sitemap Auditor doesn't just count URLs. It performs a deep dive into sitemap indexes, verifies your discovery files, and helps you identify which parts of your site are being prioritized for search.
Identify when your pages were last updated to ensure Google is seeing your freshest content first.
We automatically follow sitemap index files to find nested URLs that other simple counters miss.
Automatically checks if your domain correctly points crawlers to your sitemap via robots.txt.
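The robots.txt check above boils down to scanning the file for `Sitemap:` directives. A minimal sketch of that parsing step (the `find_sitemaps` helper name is illustrative, not part of the tool):

```python
def find_sitemaps(robots_txt: str) -> list[str]:
    """Extract sitemap URLs declared via 'Sitemap:' directives in robots.txt.

    The directive name is case-insensitive, and a site may declare
    more than one sitemap, so every match is collected.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            sitemaps.append(value.strip())
    return sitemaps
```

If this returns an empty list for your robots.txt, crawlers have no sitemap pointer there and must discover your sitemap some other way.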
Sitemaps should not exceed 50,000 URLs per file.
Max file size is 50MB (uncompressed).
All URLs must be entity-escaped for XML (for example, & must appear as &amp;amp;).
Ensure all URLs use the same protocol (e.g., https) as the sitemap itself.
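The limits above are straightforward to check programmatically. A hedged sketch, assuming a standard sitemaps.org-namespaced file (the `check_limits` name and the returned message strings are illustrative):

```python
import xml.etree.ElementTree as ET

MAX_URLS = 50_000                 # per-file URL cap from the sitemap protocol
MAX_BYTES = 50 * 1024 * 1024      # 50MB uncompressed size cap
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_limits(xml_text: str) -> list[str]:
    """Return a list of limit violations found in a single sitemap file."""
    problems = []
    if len(xml_text.encode("utf-8")) > MAX_BYTES:
        problems.append("file exceeds 50MB uncompressed")
    urls = [el.text.strip() for el in ET.fromstring(xml_text).iter(f"{NS}loc")]
    if len(urls) > MAX_URLS:
        problems.append("more than 50,000 URLs")
    # All URLs in one sitemap should share a single protocol (scheme).
    schemes = {u.split("://", 1)[0] for u in urls if "://" in u}
    if len(schemes) > 1:
        problems.append("mixed protocols")
    return problems
```

Note that `ET.fromstring` will itself raise a `ParseError` on improperly escaped URLs, which covers the XML-escaping rule as a side effect.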
A sitemap index is a file that lists other sitemap files. Large sites use them to organize thousands of URLs into manageable chunks.
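Counting URLs across a sitemap index means recursing into each child sitemap it lists. A minimal sketch, assuming the standard sitemaps.org namespace; the `fetch` callable (mapping a sitemap URL to its XML text) and the `count_urls` name are assumptions for illustration:

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_urls(xml_text: str, fetch) -> int:
    """Count page URLs, recursing through <sitemapindex> files.

    A <sitemapindex> root lists child sitemaps; a <urlset> root
    lists the actual page URLs. Both use <loc> elements.
    """
    root = ET.fromstring(xml_text)
    if root.tag == f"{NS}sitemapindex":
        # Each <loc> here points at a nested sitemap: fetch and recurse.
        return sum(count_urls(fetch(loc.text.strip()), fetch)
                   for loc in root.iter(f"{NS}loc"))
    return sum(1 for _ in root.iter(f"{NS}loc"))
```

Injecting `fetch` keeps the traversal testable without network access; in practice it would be an HTTP GET for each nested sitemap URL.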
Manual counting is error-prone, especially with nested sitemaps or large files. Our tool handles the recursion and large datasets instantly.
No. We process the sitemap in real-time. We do not store your URL lists or site structure on our servers.
Yes! You can upload an .xml or .txt sitemap file directly from your computer for a quick count and audit.