
SEO Spider
- SKU: SCREAMINGF
The Screaming Frog SEO Spider is a website crawler that allows you to crawl websites’ URLs and fetch key onsite elements to analyse onsite SEO.
The SEO Spider is a powerful and flexible site crawler, able to crawl both small and very large websites efficiently, while allowing you to analyse the results in real time. It gathers key onsite data to allow SEOs to make informed decisions.
Key Features:
- Find Broken Links: Crawl a website instantly and find broken links (404s) and server errors. Bulk export the errors and source URLs to fix, or send to a developer.
- Audit Redirects: Find temporary and permanent redirects, identify redirect chains and loops, or upload a list of URLs to audit in a site migration.
- Analyse Page Titles & Meta Data: Analyse page titles and meta descriptions during a crawl and identify those that are too long, short, missing, or duplicated across your site.
- Discover Duplicate Content: Discover exact duplicate URLs with an MD5 hash check, find partially duplicated elements such as page titles, descriptions or headings, and identify low-content pages.
- Extract Data with XPath: Collect any data from the HTML of a web page using CSS Path, XPath or regex. This might include social meta tags, additional headings, prices, SKUs or more!
- Review Robots & Directives: View URLs blocked by robots.txt, meta robots or X-Robots-Tag directives such as ‘noindex’ or ‘nofollow’, as well as canonicals and rel=“next” and rel=“prev”.
- Generate XML Sitemaps: Quickly create XML Sitemaps and Image XML Sitemaps, with advanced configuration over URLs to include, last modified, priority and change frequency.
- Integrate with Google Analytics: Connect to the Google Analytics API and fetch user data such as sessions, bounce rate, conversions, goals, transactions and revenue for landing pages against the crawl.
- Crawl JavaScript Websites: Render web pages using the integrated Chromium WRS to crawl dynamic, JavaScript rich websites and frameworks, such as Angular, React and Vue.js.
- Visualise Site Architecture: Evaluate internal linking and URL structure using interactive crawl and directory force-directed diagrams and tree graph site visualisations.
- Schedule Audits: Schedule crawls to run at chosen intervals and auto export crawl data to any location, including Google Sheets. Or automate entirely via command line.
- Compare Crawls & Staging: Track progress of SEO issues and opportunities and see what’s changed between crawls. Compare staging against production environments using advanced URL Mapping.
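As an illustration of the exact-duplicate check described in the features above, here is a minimal Python sketch of how MD5 hashing can group identical pages. The `find_exact_duplicates` helper and sample URLs are hypothetical and not part of the SEO Spider itself; this only shows the underlying idea.

```python
import hashlib


def md5_of_page(body: str) -> str:
    """Return the MD5 hex digest of a page's raw HTML body."""
    return hashlib.md5(body.encode("utf-8")).hexdigest()


def find_exact_duplicates(pages: dict) -> dict:
    """Group URLs whose bodies hash identically, i.e. exact duplicates.

    `pages` maps URL -> HTML body; the result maps digest -> list of
    URLs, keeping only groups with more than one URL.
    """
    groups = {}
    for url, body in pages.items():
        groups.setdefault(md5_of_page(body), []).append(url)
    return {digest: urls for digest, urls in groups.items() if len(urls) > 1}


# Hypothetical crawl results: two pages share the same body
pages = {
    "https://example.com/a": "<html>same content</html>",
    "https://example.com/b": "<html>same content</html>",
    "https://example.com/c": "<html>different content</html>",
}
dupes = find_exact_duplicates(pages)
```

Because MD5 hashes the raw bytes, even a one-character difference produces a different digest, which is why exact-duplicate detection is cheap while near-duplicate detection needs separate element-level checks.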
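To illustrate the kind of custom extraction the XPath feature performs, here is a minimal Python sketch using the standard library's limited XPath support. The sample markup, attribute names and values are hypothetical; the SEO Spider itself supports full CSS Path, XPath and regex against rendered pages.

```python
import xml.etree.ElementTree as ET

# Hypothetical, well-formed product page snippet
page = """<html><head>
<meta property="og:title" content="Acme Widget"/>
<meta property="product:price" content="19.99"/>
</head><body><h2>Specifications</h2></body></html>"""

root = ET.fromstring(page)

# ElementTree supports a limited XPath subset, enough for attribute matches
og_title = root.find(".//meta[@property='og:title']").get("content")
price = root.find(".//meta[@property='product:price']").get("content")
heading = root.find(".//h2").text
```

The same pattern scales to social meta tags, prices or SKUs: one expression per element, evaluated against every crawled URL, with the matches exported alongside the rest of the crawl data.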