It automatically tracks new and removed URLs in a site's sitemap.xml and notifies you when something changes. This is useful for:

- SEO (detecting new or deleted pages)
- Competitive analysis (seeing when competitors launch new features or content)
- Site reliability (alerting if a big chunk of pages disappears)
Unlike generic crawlers, it looks only at sitemap data, which means less overhead and more precise tracking.
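For anyone curious, the core idea (parse the sitemap, diff it against the previous snapshot) can be sketched roughly like this. This is not the actual implementation, just a minimal illustration; the function names and example URLs are made up:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per sitemaps.org
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Extract the set of <loc> URLs from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

def diff_sitemaps(old_urls, new_urls):
    """Compare two URL snapshots; returns (added, removed)."""
    return new_urls - old_urls, old_urls - new_urls

# Example: yesterday's snapshot vs. today's fetch (hypothetical pages)
old = parse_sitemap("""<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
</urlset>""")
new = parse_sitemap("""<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/changelog</loc></url>
</urlset>""")

added, removed = diff_sitemaps(old, new)
# added   -> {"https://example.com/changelog"}
# removed -> {"https://example.com/pricing"}
```

In practice you'd persist the previous snapshot (a file or database) and fire a notification whenever either set is non-empty.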
I built it because I kept manually checking competitor sitemaps for updates, which was painful. Would love feedback from the community — what use cases do you see for this?