The truth is…
If you don’t have good on-page and technical SEO, your website just won’t rank. And the only way to check these factors is with an in-depth SEO audit using an SEO spider tool.
Here’s the problem:
Running an SEO audit is hard work. There are so many technical and on-page factors that you have to get right.
This is where SEO Spider comes in.
This powerful SEO spider tool is a favourite amongst SEO pros for running comprehensive SEO audits quickly and efficiently. The Screaming Frog SEO spider crawls your site to uncover serious technical SEO problems and audits the critical elements of your on-site SEO.
This blog will show you how to take advantage of SEO Spider tool for your own website to master technical SEO audits and website crawling.
Let’s jump in!
What Will I Learn?
An SEO spider is a bot that crawls your website to gather important data and then shows you a report of what it finds. Spiders can be used to uncover technical issues and make sure that Google can access your site correctly. The spider tool crawls your site and reports on page titles, meta data, site structure, and other technical elements.
The most common SEO spider tool is by Screaming Frog.
The Screaming Frog SEO spider understands how search engines work and has a ton of powerful features to help you rank higher. This feature-rich, rapidly improving tool helps with everything from finding broken links to analyzing meta descriptions.
The SEO spider helps you improve on-page SEO by crawling and auditing your site for common SEO issues. The Screaming Frog SEO spider is considered to be one of the best on-page SEO auditing tools available for performing a site audit and uncovering serious technical SEO problems.
Unlike most SEO tools, the Screaming Frog web crawler is a desktop-based tool.
This means you need to download the software before you can use it.
Screaming Frog SEO spider offers a free licence, with the main restriction being that you can crawl a maximum of 500 URLs on one website. The free version also limits access to advanced features; the paid version unlocks unlimited crawling and additional integrations.
This should be enough for most people starting with technical SEO audits.
The SEO spider has become a popular tool amongst some of the best SEOs out there, and for good reason. We’ve tested nearly every SEO tool available, and the Screaming Frog SEO spider is one every person involved in SEO should have in their toolkit.
Here are 8 fundamental things you can do with this powerful SEO spider tool.
As your site grows, broken links will happen.
But to maintain a high-quality website with a good user experience, you need to find broken link errors and fix them as quickly as possible.
Broken links are links pointing to URLs that used to serve a page but no longer do.
Google doesn’t like broken links and having too many of them can affect your SEO.
Thankfully, the SEO spider tool is a master at finding broken links and identifying server errors across your entire site.
It will identify every URL and link on your website and then provide you with a comprehensive report of all the broken links, errors, and redirects that need attention.
The first thing you need to do is open the SEO spider software.
Then, type in your website and click “Start”.
The SEO spider tool crawls your entire website systematically.
Once the crawl is finished, click on “Response Codes” and select “Client Error (4xx)”.
This will bring up all of the broken links on your website. Now you need to know where those broken links are on your site so you can fix them.
To do so, just click the “Inlinks” tab at the bottom.
“From” is the URL on your website with the broken link. “To” is the broken link.
This makes it super easy to fix all of the broken links on your site quickly.
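Conceptually, the report is a table of “From”, “To”, and status records, filtered down to 4xx responses. A minimal Python sketch of that filtering step (the URLs and record layout are a hypothetical example, not Screaming Frog’s export format):

```python
# Filter crawl records down to broken (4xx) inlinks.
# Each record is (from_url, to_url, status_code); the sample data is made up.

def broken_inlinks(records):
    """Return (from_url, to_url) pairs whose target returned a 4xx code."""
    return [(frm, to) for frm, to, status in records if 400 <= status < 500]

crawl = [
    ("https://example.com/", "https://example.com/about", 200),
    ("https://example.com/blog", "https://example.com/old-post", 404),
    ("https://example.com/about", "https://example.com/gone", 410),
]

print(broken_inlinks(crawl))
```

Each surviving pair tells you the page to edit (“From”) and the dead target to fix (“To”).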
PRO TIP: You can use the SEO spider to identify broken links on other websites to execute broken link building campaigns. This is a great way to build high-quality links from other sites and increase your domain authority.
You will inevitably need to change or remove URLs on your site at some point during site migration or content updates.
Redirects make this easy to do without losing visitors or the SEO value of the old URL.
Here’s what I mean:
Let’s say you decided to change the URL of a page. You need to redirect the old URL to the new URL so users and search engines can still find your content.
It’s the same for broken links. The best way to fix broken links is simply redirect them to a new and relevant URL.
The thing is… Sometimes you can make too many redirects from one URL.
This creates redirect chains and loops that can hurt your site’s performance.
Basically, you start by creating a redirect from URL A to URL B. Then later you make another change and redirect URL B to URL C.
Now you have redirect chains that have to go through 3 URLs to load the right page.
Ideally, you need to identify redirect chains, go back and just have URL A redirect directly to URL C – skipping URL B altogether.
The problem is that it’s difficult to identify redirect chains manually.
That’s where the Screaming Frog SEO spider can help with its advanced redirect analysis.
It does a complete redirect audit at scale and highlights long redirect chains that need fixing as soon as possible.
After the SEO spider crawls your site, click on “Response Codes” and select “Redirection (3xx)”.
The “Redirect URL” column shows the destination that each address redirects to.
Now click on “Reports”, then “Redirects” and finally “Redirect Chains”.
Export the report, which maps out chains of redirects, loops, and any hops along the way. It will also identify the source URL to make fixing easier.
Now you can work through each redirect issue and fix them quickly!
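The fix for a chain is always the same: point every source straight at the final destination. A minimal Python sketch of that chain-collapsing logic (the redirect map is a made-up example, and the loop guard is a simple hop limit):

```python
# Collapse redirect chains so every source points straight at its final
# destination (URL A -> URL C instead of A -> B -> C).

def collapse_chains(redirects, max_hops=10):
    """Resolve each source URL to its final target, guarding against loops."""
    resolved = {}
    for src in redirects:
        target, hops = redirects[src], 0
        # Follow the chain until we reach a URL that redirects nowhere,
        # or bail out after max_hops in case of a redirect loop.
        while target in redirects and hops < max_hops:
            target = redirects[target]
            hops += 1
        resolved[src] = target
    return resolved

chain = {"/a": "/b", "/b": "/c"}
print(collapse_chains(chain))  # both /a and /b now point at /c
```

In practice you would apply the resolved map back to your server or CMS redirect rules.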
Here’s the truth:
Google hates duplicate content.
Google favours fresh, original content. The problem is that most sites end up with duplicate content at some point.
This is especially true for ecommerce stores. In fact, I have never had an ecommerce SEO client that hasn’t had duplicate content on their store.
It really is that bad.
Lucky for you, the SEO spider makes it easy to discover exact duplicate pages and eliminate duplicate content quickly.
All you have to do is click on the “Content” tab and select “Exact Duplicates” or “Near Duplicates” from the Filter menu.
This report shows you all of the pages that have internal duplicate content issues.
Now you just need to apply the correct canonical tags to fix the problem.
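Under the hood, near-duplicate detection compares overlapping chunks of page text; Screaming Frog’s “Near Duplicates” filter uses a default 90% similarity threshold. A simplified sketch of the idea in Python, using exact Jaccard similarity over word trigrams (the tool itself uses a minhash approximation, and the sample sentences are made up):

```python
# Compare two pages' text by the overlap of their word trigrams (shingles).

def shingles(text, n=3):
    """Return the set of overlapping n-word chunks in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

page_a = "free shipping on all orders over fifty dollars today"
page_b = "free shipping on all orders over fifty dollars now"
print(round(jaccard(page_a, page_b), 2))
```

Pages scoring above the threshold are flagged as near duplicates and are candidates for canonicalisation or consolidation.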
You don’t want to block search engines from the wrong pages. That means making sure your robots.txt rules and other directives are in the right places.
The SEO spider helps you review your robots.txt and meta robots directives comprehensively.
The most common search engine crawlers are Googlebot (Google) and Bingbot (Bing). But other crawlers, such as SEO tool bots and social media crawlers, also visit your site regularly.
Before entering your site, they look to see what pages they are allowed to crawl based on your custom robots.txt and meta robots directives.
Screaming Frog SEO spider makes it easy to review your directives and check which pages are blocked from crawlers by robots.txt, meta robots, or X-Robots-Tag directives.
So here’s what you need to do:
First, click on “Response Codes” and select “Blocked by Robots.txt”.
Here you will see which URLs are currently blocking search engines and other crawlers.
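You can sanity-check the same logic yourself with Python’s standard library, which ships a robots.txt parser. A minimal sketch (the robots.txt rules and URLs here are a made-up example):

```python
# Check which paths a robots.txt blocks for a given user agent.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # parse rules directly instead of fetching a live file

print(parser.can_fetch("Googlebot", "https://example.com/blog/"))
print(parser.can_fetch("Googlebot", "https://example.com/admin/"))
```

This is handy for confirming that a rule change behaves the way you expect before deploying it.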
With the paid licence, you can also configure a custom robots.txt directly within the tool.
You want Google to find and index all your site pages, right?
The easiest way to ensure that Google finds all of your pages is by submitting an XML sitemap in Google Search Console.
Don’t have a sitemap?
With Screaming Frog’s SEO spider, you can easily generate an XML sitemap for your site using the built-in sitemap generator.
Just follow the instructions below to learn how to do it.
At the top, click “Sitemaps” and then “XML Sitemap”.
This will open up all of the XML sitemap configuration options.
Only pages with a 200 OK (2xx) response from the crawl will be included in the sitemap. This box should be checked by default.
Leave everything else as it is and click “Next”. By default, Screaming Frog should choose the best settings for most websites for the rest of the steps.
The SEO spider will create the XML sitemaps for you. That’s all there is to it!
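If you are curious what the tool is producing, the sitemap format itself is very simple. A minimal sketch of generating one with the Python standard library (the URLs are placeholders; Screaming Frog’s output also includes optional fields like lastmod):

```python
# Build a minimal XML sitemap following the sitemaps.org protocol.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return an XML sitemap string listing the given page URLs."""
    root = ET.Element("urlset", xmlns=NS)
    for loc in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = loc
    return ET.tostring(root, encoding="unicode")

sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap_xml)
```

Each `<url>` entry holds one `<loc>` with the page address, all wrapped in a `<urlset>` that declares the sitemap namespace.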
Now follow this search engine submission tutorial to learn how to submit your XML sitemaps to Google and Bing.
Your site architecture plays a more important role in SEO than most people think.
Google likes to see an organised and user-friendly site structure that helps people find what they are looking for efficiently.
That’s why most website owners use a silo structure for their internal linking and URL structure.
Screaming Frog SEO spider allows you to visualise your site architecture and analyse your internal linking patterns, making it easy to spot opportunities to improve both.
In the top menu bar, click “Visualisations”, then “Force-Directed Crawl Diagram”.
Think of this report like a heat map. You can see every page on your website and how it links to other pages.
Zoom in to specific pages to see the smaller internal links between pages.
Remember:
The best site architecture keeps crawl depth to a maximum of 3. This means you should be able to reach any page on your website from your home page within 3 clicks.
Then, select the “Directory Tree Graph” from the “Visualisations” menu to see your click depth.
Now you can see how many clicks every page on your site is from the home page.
That’s it! Pretty cool, right?
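Click depth is just the shortest path from the home page through your internal links, which you can compute with a breadth-first search. A minimal Python sketch (the internal-link graph is a hypothetical example):

```python
# Compute click depth: the fewest clicks needed to reach each page from home.
from collections import deque

def click_depth(links, home):
    """Breadth-first search over an internal-link graph; returns page -> depth."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

site = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/contact"],
}
depths = click_depth(site, "/")
print(depths)
```

Any page with a depth above 3 is a candidate for extra internal links closer to the home page.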
Do you have a JavaScript website?
Using its integrated Chromium-based rendering engine, Screaming Frog’s SEO spider can crawl JavaScript websites and frameworks. You can also crawl dynamic content and JavaScript-rich websites effectively.
This is a unique feature compared to most SEO audit tools.
JavaScript crawling is typically slower and more resource-intensive than crawling regular HTML websites. While this might not affect small websites much, it’s a real problem for larger websites.
First, you need to configure the SEO spider to crawl JavaScript.
Click “Configuration” > “Spider” > “Rendering”.
Now change “Rendering” to “JavaScript” in the dropdown menu.
Choose your user agent for crawling. I recommend the Google mobile user agent because of mobile-first indexing.
Lastly, click on the “Crawl” tab and check all of the boxes.
Now just paste your website in the crawl bar and crawl it again.
The SEO spider will get to work and crawl your entire JavaScript site. You can use all of the features like normal!
The SEO spider can also be integrated with a number of Google tools.
This includes Google Analytics, Google Search Console, and PageSpeed Insights.
Why bother?
Simple – It makes Screaming Frog’s SEO spider even more powerful.
Rather than just crawling your website to extract data, the SEO spider can also pull data from Google Analytics, Search Console, and PageSpeed Insights and blend it with your crawl.
With the PageSpeed Insights integration, you can bulk-test your Google PageSpeed scores directly inside Screaming Frog.
This makes it easy to identify opportunities to increase website speed without needing to test each individual URL.
It will save you a ton of time while pointing out the URLs that need work.
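The integration queries Google’s PageSpeed Insights v5 API. A minimal sketch of building a request URL for that endpoint yourself (the page URL is a placeholder, and a real request would also need your own API key):

```python
# Build a request URL for the PageSpeed Insights v5 API.
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page, strategy="mobile", api_key=None):
    """Return the GET URL to fetch a PageSpeed report for one page."""
    params = {"url": page, "strategy": strategy}
    if api_key:
        params["key"] = api_key  # your own API key goes here
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

print(psi_request_url("https://example.com/"))
```

Fetching that URL (with a valid key) returns a JSON report containing the Lighthouse performance data that Screaming Frog surfaces per crawled URL.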
Google integrations are a paid licence feature. But they are definitely worth it if you want to take full advantage of the SEO spider.
Beyond the core functionality, the SEO spider tool offers advanced features for power users. You can analyse page titles and meta data to find missing, duplicate, over-long, over-short, or multiple elements. The tool also lets you run custom searches against source code, extract data with XPath, and perform structured data validation.
For troubleshooting international SEO issues and beyond, Screaming Frog SEO spider includes AMP crawling and validation and can help with meta refresh analysis. You can store and view rendered HTML, compare crawls against staging environments, and even connect OpenAI or Gemini integrations for enhanced analysis.
The tool excels at crawling top landing pages, analysing external links and their metrics, and identifying areas where your on-site SEO can be improved.
The SEO spider is an essential tool for every SEO to have on hand.
There isn’t really another tool like it.
Download the free version and use it to implement the 8 steps above. It will help diagnose all of your big technical and on-page SEO errors that can affect your rankings.
Ready to take it up a level?
Get a paid licence and unlock a huge amount of extra features like unlimited crawling and integrations with Google tools.