1. Preparing for Your Screaming Frog Audit
Before diving into a full site audit with Screaming Frog in 2025, it’s essential to properly configure the tool for accurate and efficient results. This preparation step ensures you’re collecting meaningful data while saving time and resources. Here’s how you can get started:
Set the Right User-Agent String
The user-agent string tells websites what type of bot or browser is accessing them. By default, Screaming Frog uses its own user-agent, but you can customize it to mimic Googlebot or another crawler.
Why It Matters:
Some websites deliver different content based on the user-agent. Using the correct one helps you see what search engines actually crawl.
How to Set It:
- Open Screaming Frog.
- Go to Configuration > User-Agent.
- Select a predefined agent like Googlebot, or create a custom one.
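If you want to verify user-agent behavior outside the tool, a short script can fetch the same page under two user-agents and compare the responses. This is a minimal sketch: the URL is a placeholder, and the Screaming Frog user-agent string shown is illustrative rather than version-exact.

```python
# Fetch one page under two user-agents and compare responses; a noticeable
# difference in status or body size suggests user-agent-dependent content.
import requests

URL = "https://example.com/"  # placeholder: use a page from your own site

USER_AGENTS = {
    "Screaming Frog": "Screaming Frog SEO Spider/20.0",  # illustrative version
    "Googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name:14} status={resp.status_code} bytes={len(resp.content)}")
```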
Adjust Crawl Limits
If you’re working with large websites, it’s smart to manage crawl limits so you don’t overload your system or exceed your license restrictions.
Settings to Consider:
Setting | Description | Where to Find It |
---|---|---|
Crawl Depth | Limits how deep Screaming Frog will crawl links from the homepage | Configuration > Spider > Limits tab |
Max URLs | Caps the number of URLs crawled during a session | Configuration > Spider > Limits tab |
Crawl Speed | Controls how many URLs are crawled per second | Configuration > Speed |
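These limits interact: the URL cap divided by the crawl speed bounds how long a session can run. A quick back-of-the-envelope check, with placeholder numbers:

```python
# Rough worst-case crawl duration from the limits above (placeholder values).
max_urls = 500_000    # Configuration > Spider > Limits > Max URLs
crawl_speed = 5.0     # Configuration > Speed, in URLs per second

hours = max_urls / crawl_speed / 3600
print(f"Worst-case crawl time: ~{hours:.1f} hours")  # ~27.8 hours
```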
Integrate Google Analytics and Search Console
Screaming Frog can connect directly with your GA and GSC accounts to provide deeper insights into traffic and indexing status.
Benefits of Integration:
- See which pages receive actual traffic.
- Identify indexed vs. non-indexed pages.
- Correlate technical issues with performance data.
How to Connect:
- Go to Configuration > API Access > Google Analytics / Google Search Console.
- Click “Connect to New Account” and sign in with your Google credentials.
- Select the relevant GA property or GSC profile.
- Choose metrics or dimensions you want to import (e.g., sessions, bounce rate).
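For context, the click data this integration imports can also be pulled straight from the Search Console API. Below is a rough sketch using google-api-python-client; it assumes you already have OAuth credentials set up, and the property URL is a placeholder.

```python
# Sketch: query Search Console for per-page clicks, similar to what the
# Screaming Frog GSC integration imports. Credentials setup is out of scope.
from googleapiclient.discovery import build

SITE_URL = "https://example.com/"  # placeholder: your verified GSC property

def fetch_top_pages(credentials, start="2025-01-01", end="2025-01-31"):
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": start,
        "endDate": end,
        "dimensions": ["page"],
        "rowLimit": 100,
    }
    resp = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    # Each row carries clicks, impressions, CTR, and position per page
    return {row["keys"][0]: row["clicks"] for row in resp.get("rows", [])}
```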
Create a Custom Configuration Profile (Optional but Recommended)
If you often run audits, save time by creating a reusable configuration profile tailored to your needs.
Steps:
- After setting up all configurations, go to File > Configuration > Save As…
- Name your profile and save it for future use.
- You can load this anytime via File > Configuration > Load…
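A saved profile also pays off when you automate recurring crawls. As a sketch, you can launch a headless crawl that loads the saved configuration through Screaming Frog’s command-line interface (wrapped in Python here); the binary name and flag spellings follow the CLI documentation, so verify them against your installed version.

```python
# Sketch: run a headless Screaming Frog crawl with a saved config profile and
# export the Internal tab. Paths are placeholders; flags may vary by version.
import subprocess

subprocess.run([
    "screamingfrogseospider",      # Linux binary name; differs on Windows/macOS
    "--crawl", "https://example.com",
    "--headless",
    "--config", "/path/to/audit-profile.seospiderconfig",
    "--output-folder", "/path/to/exports",
    "--export-tabs", "Internal:All",
], check=True)
```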
A well-prepared Screaming Frog setup is key to running an effective audit. With these configurations in place, you’re ready to start crawling your website with confidence and precision in 2025.
2. Crawling Your Website Effectively
Once you’ve installed and set up Screaming Frog, it’s time to start crawling your website. A proper crawl is the foundation of a successful site audit. Here’s how to do it right in 2025, using best practices tailored for modern websites.
Setting Up Your Crawl
Before hitting that “Start” button, make sure you configure Screaming Frog correctly. Go to Configuration > Spider and review what the crawler will and won’t follow. Consider enabling or disabling options based on your needs:
Option | Recommended Setting | Description |
---|---|---|
Crawl All Subdomains | Off (unless needed) | Only enable if your site uses multiple subdomains intentionally. |
Crawl Outside of Start Folder | Off | Keeps the crawl focused on the core domain/folder. |
Follow Internal “nofollow” | On | Crawls links marked as nofollow for full visibility. |
Render JavaScript | Depends on your site | If your site relies heavily on JS, enable this under Rendering settings. |
Managing Crawl Depth
Crawl depth refers to how many clicks away a page is from your homepage. Deep pages may be harder for users and search engines to reach. You can manage this within Screaming Frog by setting limits under Configuration > Spider > Limits. For most audits, a maximum crawl depth of 5–7 is ideal.
Why Crawl Depth Matters:
- User Experience: Important content should be easily accessible.
- Crawl Budget: Search engines prioritize pages closer to the homepage.
- Internal Linking Insights: Helps identify orphaned or hard-to-find pages.
Handling JavaScript Rendering
If your site uses frameworks like React, Angular, or Vue.js, you’ll want to enable JavaScript rendering so Screaming Frog can see content loaded dynamically. Go to Configuration > Spider > Rendering, and choose “JavaScript”. You can also set a delay (e.g., 2–5 seconds) to allow content to fully load before crawling each page.
Tips for JS-heavy Sites:
- Use “Crawl Analysis” after the crawl to check rendered vs non-rendered content differences.
- Compare raw HTML with rendered HTML in the Internal tab for important pages (a scripted version of this check follows this list).
- Avoid unnecessary scripts that slow down or block crawlers.
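If you’d like to script the raw-versus-rendered comparison from the second tip, here is a minimal sketch using requests plus Playwright (pip install playwright, then playwright install chromium). The URL is a placeholder, and the byte-count comparison is only a rough signal.

```python
# Compare raw HTML with rendered HTML for one page; a large size gap suggests
# content that only appears after JavaScript execution.
import requests
from playwright.sync_api import sync_playwright

URL = "https://example.com/"  # placeholder

raw_html = requests.get(URL, timeout=10).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")  # wait for JS-loaded content
    rendered_html = page.content()
    browser.close()

print(f"raw: {len(raw_html)} bytes, rendered: {len(rendered_html)} bytes")
```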
Avoiding Unnecessary Pages
You don’t want your audit cluttered with pages that aren’t valuable for SEO—like admin URLs, staging environments, or filter/sort pages from eCommerce platforms. Use these tools within Screaming Frog:
- Exclude Rules: Under Configuration > Exclude, enter patterns (like /wp-admin/ or ?sort=price) you don’t want crawled—a quick way to test these patterns follows the table below.
- Robots.txt Simulation: Test how Googlebot would interact with your robots.txt file via Configuration > Robots.txt Settings.
- User-Agent Filtering: Customize headers or user-agents if certain pages serve different content based on crawler identity.
Examples of Pages You Might Exclude:
URL Pattern | Description |
---|---|
/cart/ or /checkout/ | E-commerce checkout flow — not useful for SEO analysis. |
/search?q=* | Dynamically generated search result pages — often thin content. |
?ref=* | Affiliate/referral parameter URLs — duplicate content risk. |
/tag/* or /author/* (on blogs) | Might lead to low-value indexable pages if not managed properly. |
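Since Screaming Frog’s exclude rules are regular expressions, it’s worth sanity-checking your patterns against a sample of URLs before you crawl. A minimal sketch, with patterns mirroring the examples above:

```python
# Test exclude patterns (regex) against sample URLs before running the crawl.
import re

EXCLUDE_PATTERNS = [
    r"/wp-admin/",
    r"/cart/|/checkout/",
    r"/search\?q=",
    r"[?&]ref=",
    r"/tag/|/author/",
]

def is_excluded(url: str) -> bool:
    return any(re.search(p, url) for p in EXCLUDE_PATTERNS)

for url in ["https://example.com/blog/post",
            "https://example.com/cart/",
            "https://example.com/page?ref=aff123"]:
    print(url, "-> excluded" if is_excluded(url) else "-> crawled")
```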
A clean and focused crawl ensures that the rest of your audit is accurate and actionable. Taking the time to configure these settings correctly saves hours down the road and helps uncover real SEO opportunities without noise from irrelevant data.
3. Analyzing Technical SEO Elements
Once you’ve crawled your website using Screaming Frog, it’s time to dig into the technical SEO elements that can impact your site’s performance and search engine visibility. In this section, we’ll walk through how to identify and analyze issues such as broken links, redirect chains, canonical tags, status codes, and site architecture.
Broken Links (404 Errors)
Broken links can hurt both user experience and SEO. Screaming Frog makes it easy to find them:
- Go to the “Response Codes” tab
- Filter by “Client Error (4xx)”
- You’ll see all URLs returning a 404 error
You can export this list and prioritize fixing or redirecting these URLs.
Redirect Chains & Loops
Too many redirects in a chain can slow down page load speed and confuse search engines. Here’s how to check for them:
- Select the “Reports” menu
- Choose “Redirect Chains”
This report highlights URLs that go through multiple hops before reaching their final destination. Ideally, each redirect should be direct and limited to one step.
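If you want to re-verify a fix outside the tool, a short script can walk the hops for a given URL—requests records every intermediate response in resp.history. A sketch, with a placeholder URL:

```python
# Follow a URL's redirects and list every hop; more than one intermediate
# response means a chain that should be collapsed to a single redirect.
import requests

def redirect_hops(url: str) -> list[str]:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds each intermediate 3xx response, in order
    return [r.url for r in resp.history] + [resp.url]

chain = redirect_hops("http://example.com/old-page")  # placeholder
if len(chain) > 2:  # origin plus final is one redirect; longer is a chain
    print("Chain detected:", " -> ".join(chain))
```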
Canonical Tags
Canonical tags help prevent duplicate content issues by telling search engines which version of a page is the preferred one. To review canonical tags in Screaming Frog:
- Navigate to the “Canonicals” tab
- You’ll see columns for both declared and detected canonicals
If the canonical URL doesn’t match the actual URL or points to another domain without reason, it might need adjustment.
Status Codes Overview
Status codes tell you how your pages respond to requests. Below is a quick reference table for common HTTP status codes you might encounter during an audit:
Status Code | Description | SEO Impact |
---|---|---|
200 | OK – Page loads correctly | No issue |
301 | Permanently redirected | Avoid chains; update internal links if possible |
302/307 | Temporary redirect | Might confuse search engines; consider using 301 if permanent |
404 | Page not found | Create redirects or restore deleted content if necessary |
500+ | Server errors | Critical – needs immediate attention from your dev team |
Site Architecture & Internal Linking
A well-structured site helps both users and search engines navigate your content easily. Use Screaming Frog’s “Site Structure” visualizations to understand how your pages are connected. Key things to look for include:
- Crawl Depth: How many clicks it takes to reach a page from the homepage – aim for under 4 levels deep.
- Orphan Pages: Pages with no internal links pointing to them – these are hard for users and bots to find.
- Internal Link Distribution: Make sure important pages are getting enough link equity internally.
Crawl Depth Example:
Crawl Depth Level | Total Pages Found |
---|---|
Level 1 (Homepage) | 1 |
Level 2 (Main categories) | 10 |
Level 3 (Sub-categories) | 30 |
Level 4+ | 50+ |
If you notice too many important pages buried deep within the structure, consider reworking your internal linking strategy to bring them closer to the surface.
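If you want to recompute depth yourself, or cross-check orphan candidates, you can run a breadth-first search over the link graph from the All Inlinks bulk export. A sketch, assuming “Source” and “Destination” column names (these can vary by version):

```python
# Recompute crawl depth via BFS from the homepage over exported link data;
# URLs never reached from the homepage are orphan candidates.
import csv
from collections import deque

HOME = "https://example.com/"  # placeholder

edges = {}
with open("all_inlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        edges.setdefault(row["Source"], set()).add(row["Destination"])

depth = {HOME: 0}
queue = deque([HOME])
while queue:
    url = queue.popleft()
    for nxt in edges.get(url, ()):
        if nxt not in depth:
            depth[nxt] = depth[url] + 1
            queue.append(nxt)

all_urls = set(edges) | {d for ds in edges.values() for d in ds}
orphans = all_urls - set(depth)
print(f"max depth: {max(depth.values())}, orphan candidates: {len(orphans)}")
```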
Troubleshooting Tips:
- If crawl depth is too high, simplify navigation or add breadcrumb links.
- If internal linking is uneven, use contextual links within content to support deeper pages.
- If canonical tags are misused, double-check CMS settings or plugins generating them.
Screaming Frog gives you all the data needed — it’s just about knowing where to look and what actions to take based on what you find.
4. Evaluating On-Page SEO Metrics
When running a site audit with Screaming Frog in 2025, one of the most important steps is evaluating your on-page SEO elements. These components help search engines understand your content and improve visibility in search results. Let’s break down how to check key on-page metrics like title tags, meta descriptions, header tags, and word counts using Screaming Frog.
Title Tags
Title tags are one of the first things both users and search engines see. Screaming Frog allows you to quickly scan for missing, duplicate, or overly long titles.
Best Practices:
- Keep titles under 60 characters
- Include target keywords near the beginning
- Avoid duplicate titles across pages
To view title tag data in Screaming Frog:
- Open Screaming Frog and crawl your site
- Select the “Page Titles” tab
- Filter by “Missing,” “Duplicate,” or “Over 60 Characters” to spot issues
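The same three checks are easy to run over an exported Page Titles CSV. A pandas sketch, assuming the export’s “Address”, “Title 1”, and “Title 1 Length” columns (verify the names against your version):

```python
# Flag missing, overlong, and duplicate titles in a Page Titles export.
import pandas as pd

df = pd.read_csv("page_titles.csv")  # placeholder file name

missing = df[df["Title 1"].isna()]
too_long = df[df["Title 1 Length"] > 60]
dupes = df[df.duplicated("Title 1", keep=False) & df["Title 1"].notna()]

print(f"missing: {len(missing)}, over 60 chars: {len(too_long)}, "
      f"duplicates: {len(dupes)}")
```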
Meta Descriptions
Meta descriptions may not directly impact rankings, but they influence click-through rates from search results. Screaming Frog helps identify pages missing meta descriptions or those that are too long or duplicated.
Best Practices:
- Keep meta descriptions under 155 characters
- Write compelling copy that encourages clicks
- Avoid repeating descriptions across multiple pages
Header Tags (H1, H2, etc.)
Header tags structure your content and guide both users and search engines through the page. Screaming Frog can flag issues like missing or multiple H1 tags.
What to Check:
Tag Type | What to Look For |
---|---|
H1 | Ensure each page has one unique H1 that includes the main keyword |
H2-H6 | Use subheadings to organize content logically; include secondary keywords where relevant |
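For a quick spot-check of the one-H1 rule on a single page, here is a small sketch using requests and BeautifulSoup (pip install beautifulsoup4); the URL is a placeholder.

```python
# Count the H1 tags on one page; each page should have exactly one.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/", timeout=10)  # placeholder URL
h1s = BeautifulSoup(resp.text, "html.parser").find_all("h1")

if len(h1s) != 1:
    print(f"Expected exactly one H1, found {len(h1s)}")
else:
    print("H1:", h1s[0].get_text(strip=True))
```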
Word Count
The amount of content on a page can affect its ability to rank. Pages with very little text may be seen as low-value by search engines. Screaming Frog’s “Content” tab shows word count per URL so you can spot thin content quickly.
Suggested Minimum Word Counts:
Page Type | Recommended Word Count |
---|---|
Blog Post | 800–1,500 words |
Product Page | 300–600 words |
Landing Page | 500–1,000 words |
Homepage | 400–800 words |
How to Find Word Counts in Screaming Frog:
- Crawl your website using Screaming Frog
- Select the “Content” tab in the top menu bar
- If the “Word Count” column isn’t already visible, add it via right-click > Columns > Content > Word Count
- Sort by ascending order to find thin content pages easily
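You can also apply the suggested minimums from the table above to an exported crawl programmatically. In this sketch the file name, the “Word Count”/“Address” columns, and the path-based page-type mapping are all assumptions to adapt to your own site.

```python
# Flag thin content by comparing each URL's word count to a per-type minimum.
import pandas as pd

MINIMUMS = {"/blog/": 800, "/product/": 300}  # from the table above
df = pd.read_csv("internal_all.csv")  # placeholder: exported Internal tab

def minimum_for(url: str) -> int:
    for prefix, words in MINIMUMS.items():
        if prefix in url:
            return words
    return 400  # fallback, roughly the homepage guideline

thin = df[df.apply(lambda r: r["Word Count"] < minimum_for(r["Address"]), axis=1)]
print(thin[["Address", "Word Count"]].sort_values("Word Count").head(20))
```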
Taking time to review these on-page SEO metrics will ensure your site is well-optimized and aligned with best practices for search visibility in 2025.
5. Identifying Opportunities for Optimization
Once you’ve crawled your site with Screaming Frog, it’s time to dig into the data and uncover ways to improve your SEO performance. Screaming Frog makes it easy to spot issues that may be holding your site back—like duplicate content, low-performing pages, and missed internal linking opportunities.
Duplicate Content
Search engines like Google frown upon duplicate content because it can confuse crawlers and dilute ranking signals. Screaming Frog helps you identify these issues by flagging identical or near-identical pages across your site.
How to Find Duplicate Content:
- Go to the “Content” tab in Screaming Frog.
- Apply the “Exact Duplicates” filter (switch to “Near Duplicates” to catch close matches as well).
- Review the list of URLs and decide which ones need canonical tags or consolidation.
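Under the hood, exact-duplicate detection comes down to hashing page content. Here is a sketch of the same idea for a handful of placeholder URLs, hashing the normalized body text with requests and BeautifulSoup:

```python
# Group URLs by a hash of their visible text; groups larger than one are
# duplicates (after whitespace normalization).
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

URLS = ["https://example.com/a", "https://example.com/b"]  # placeholders

by_hash = defaultdict(list)
for url in URLS:
    text = BeautifulSoup(requests.get(url, timeout=10).text,
                         "html.parser").get_text(" ", strip=True)
    by_hash[hashlib.md5(text.encode()).hexdigest()].append(url)

for digest, urls in by_hash.items():
    if len(urls) > 1:
        print("Duplicates:", urls)
```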
Low-Performing Pages
Screaming Frog allows you to export crawl data and combine it with analytics tools like Google Analytics or Search Console. This helps you find pages with low traffic, poor engagement, or high bounce rates that might need content updates or SEO improvements.
Example Metrics to Consider:
Page URL | Bounce Rate (%) | Average Time on Page (m:ss) | Organic Clicks |
---|---|---|---|
/blog/old-post | 89% | 0:30 | 5 |
/product/unoptimized-page | 75% | 0:45 | 12 |
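Combining the two exports is a straightforward merge. A pandas sketch follows; the file names, column labels, and thresholds are placeholders for whatever your crawl and analytics exports actually contain.

```python
# Join the crawl export with an analytics export on URL, then filter for
# pages with weak engagement or few organic clicks.
import pandas as pd

crawl = pd.read_csv("internal_all.csv")        # Screaming Frog export
ga = pd.read_csv("ga_landing_pages.csv")       # analytics export

merged = crawl.merge(ga, left_on="Address", right_on="Page URL", how="left")
weak = merged[(merged["Organic Clicks"].fillna(0) < 10)
              | (merged["Bounce Rate (%)"] > 80)]
print(weak[["Address", "Organic Clicks", "Bounce Rate (%)"]].head())
```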
Internal Linking Opportunities
A strong internal linking structure helps distribute link equity and improves crawlability. Screaming Frog’s “Inlinks” and “Outlinks” tabs show how many links point to each page and from where. Use this info to identify orphaned pages or important content that isn’t getting enough internal links.
Tips for Better Internal Linking:
- Add contextual links from blog posts to relevant product or service pages.
- Use descriptive anchor text that includes target keywords.
- Create hub pages that link out to related subtopics.
Bringing It All Together
The goal is to use Screaming Frog not just as a diagnostic tool but as a roadmap for optimization. By targeting duplicate content, refreshing underperforming pages, and strengthening internal links, you’re setting up your site for better rankings and user experience in 2025 and beyond.
6. Exporting and Reporting Audit Data
Once you’ve completed your site audit in Screaming Frog, the next crucial step is exporting the data and presenting it in a way that makes sense to your team, clients, or stakeholders. Whether you’re creating a spreadsheet for internal review or building a visual dashboard for a client presentation, Screaming Frog gives you plenty of options to work with.
Exporting Data from Screaming Frog
Screaming Frog allows you to export almost any data set directly from the interface. Here’s how:
- Select the tab with the data you want (e.g., Internal, Response Codes, Page Titles).
- Click on “Export” in the top-left corner of the tab.
- Choose your preferred file format—usually CSV or Excel (.xlsx).
- Name your file and save it to your desired folder.
Commonly Exported Data Sets
Data Tab | Use Case |
---|---|
Internal | Full list of URLs crawled with metadata like status codes and word count. |
Page Titles | Identify missing, duplicate, or overly long titles for optimization. |
Meta Descriptions | Check for length issues or missing descriptions across pages. |
H1 / H2 Tags | Review header structure for SEO and content clarity. |
Redirects & Canonicals | Spot redirect chains and canonical tag issues quickly. |
Importing Data into Spreadsheets
You can open exported CSV or Excel files in tools like Google Sheets or Microsoft Excel. From there, you can use filters, color coding, and pivot tables to highlight problem areas and trends. It’s helpful to create separate tabs for each issue type—like broken links, duplicate titles, or large image files—to make analysis easier.
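The tab-per-issue layout is easy to script as well. A sketch using pandas’ ExcelWriter (which needs openpyxl installed); the input file names are placeholders for your own exports.

```python
# Collect several Screaming Frog exports into one workbook, one sheet per
# issue type, ready to share or filter.
import pandas as pd

exports = {
    "Broken Links": "response_codes_4xx.csv",
    "Duplicate Titles": "page_titles_duplicate.csv",
    "Missing Meta": "meta_descriptions_missing.csv",
}

with pd.ExcelWriter("audit_report.xlsx") as writer:
    for sheet, path in exports.items():
        pd.read_csv(path).to_excel(writer, sheet_name=sheet, index=False)
```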
Creating Dashboards for Stakeholders
If you’re working with non-technical stakeholders or clients who prefer visuals over spreadsheets, consider importing your audit data into a dashboard tool like Google Looker Studio (formerly Data Studio). With this approach, you can build interactive charts and graphs that clearly show:
- The number of pages with SEO issues
- The most common types of problems (e.g., missing meta tags)
- Progress over time if audits are done regularly
Tips for Effective Reporting
- Simplify your language: Avoid jargon when presenting findings to non-SEO professionals.
- Focus on impact: Highlight which issues are hurting performance the most.
- Prioritize action items: Use high/medium/low severity levels to guide next steps.
Example Actionable Report Summary
Issue | Total Pages Affected | Severity Level | Recommended Action |
---|---|---|---|
Missing Meta Descriptions | 42 | Medium | Add unique meta descriptions to improve CTR. |
Error 404 Pages | 15 | High | Create redirects or fix broken links immediately. |
Duplicate Title Tags | 30 | Medium | Edit title tags to reflect unique page content. |
No H1 Tag Found | 18 | Low | Add a single, descriptive H1 tag to each affected page. |