The truth is…
If you don’t have good on-page and technical SEO, your website just won’t rank. And the only way to check these factors is with an in-depth SEO audit.
Here’s the problem:
Running an SEO audit is hard work. There are so many technical and on-page factors that you have to get right.
This is where SEO Spider comes in.
This powerful SEO tool is a favourite amongst SEO pros for running comprehensive SEO audits quickly and efficiently.
This blog will show you how to take advantage of the SEO Spider tool on your own website.
Let’s jump in!
An SEO spider is a bot that crawls your website to gather important data and then shows you a report of what it finds. Spiders can be used to uncover technical issues and make sure that Google can access your site correctly.
The most common SEO Spider tool is by Screaming Frog.
It understands how search engines work and has a ton of powerful features to help you rank higher.
SEO Spider helps you improve on-page SEO by crawling and auditing your site for common SEO issues. It is considered to be one of the best on-page SEO auditing tools available.
Unlike most SEO tools, Screaming Frog SEO Spider is a desktop-based tool.
This means you need to download the software before you can use it.
Screaming Frog SEO Spider offers a free licence, with the main restriction being that you can crawl a maximum of 500 URLs per website.
This should be enough for most people.
SEO Spider has become a popular tool amongst some of the best SEOs out there and for good reason.
Here are 8 fundamental things you can do with it.
As your site grows, broken links will happen.
But to maintain a high-quality website with a good user experience, you need to fix your broken links as quickly as possible.
Broken links are URLs that once served a page but now return an error because that page has disappeared.
Google doesn’t like broken links and having too many of them can affect your SEO.
Thankfully, SEO Spider is a master at finding broken links.
It will identify every URL and link on your website and then provide you with a comprehensive report of all the broken links.
The first thing you need to do is open the SEO Spider software.
Then, type in your website and click “Start”.
SEO Spider will do a complete crawl of your website.
Once the crawl is finished, click on “Response Codes” and select “Client Error (4xx)”.
This will bring up all of the broken links on your website. Now you need to know where those broken links are on your site so you can fix them.
To do so, just click the “Inlinks” tab at the bottom.
“From” is the URL on your website with the broken link. “To” is the broken link.
This makes it super easy to fix all of the broken links on your site quickly.
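To see what a check like this involves under the hood, here is a minimal Python sketch that extracts every link from a page’s HTML using only the standard library. A crawler like SEO Spider would then request each extracted URL and flag anything answering with a 4xx status as broken. The example page and URLs below are made up:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# A crawler would now request each of these URLs and report any
# that respond with a 4xx client error as broken links.
page = '<a href="/about">About</a> <a href="https://example.com/gone">Old</a>'
print(extract_links(page, "https://example.com/"))
# ['https://example.com/about', 'https://example.com/gone']
```

SEO Spider does this at scale across every page it crawls, which is why the “Inlinks” report can tell you exactly which page each broken link lives on.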
PRO TIP: You can use SEO Spider to identify broken links on other websites to execute broken link building campaigns. This is a great way to build high-quality links from other sites and increase your domain authority.
You will inevitably need to change or remove URLs on your site at some point.
Redirects make this easy to do. Here’s what I mean:
Let’s say you decided to change the URL of a page. You need to redirect the old URL to the new URL so users and search engines can still find your content.
It’s the same for broken links. The best way to fix broken links is simply redirect them to a new and relevant URL.
The thing is… Sometimes you can make too many redirects from one URL.
This is called a redirect chain.
Basically, you start by creating a redirect from URL A to URL B. Then later you make another change and redirect URL B to URL C.
Now you have a redirect chain that has to go through 3 URLs to load the right page.
Ideally, you need to identify this chain, go back and have URL A redirect directly to URL C – skipping URL B altogether.
The problem is that it’s difficult to identify long redirect chains.
That’s where the Screaming Frog SEO Spider can help.
It does a complete redirect audit at scale and highlights long redirect chains that need fixing as soon as possible.
After the SEO Spider crawls your site, click on “Response Codes” and select “Redirection (3xx)”.
The “Redirect URL” column shows the destination each URL redirects to.
Now click on “Reports”, then “Redirects” and finally “Redirect Chains”.
Export the Google Sheets report, which maps out chains of redirects, loops and any hops along the way. It also identifies the source of each chain to make fixing easier.
Now you can work through each redirect issue and fix them quickly!
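The fix itself is mechanical enough to sketch in code. Here is a small Python function that takes a source-to-destination redirect map (like the exported report) and collapses every chain so each source points straight at its final destination. The URLs are hypothetical:

```python
def flatten_redirects(redirects):
    """Given a {source: destination} redirect map, point every source
    straight at its final destination, collapsing chains like A->B->C."""
    flat = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        # Follow the chain until we reach a URL that no longer redirects.
        while target in redirects:
            if target in seen:  # redirect loop, e.g. A->B->A
                raise ValueError(f"Redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

chain = {"/old-page": "/new-page", "/new-page": "/final-page"}
print(flatten_redirects(chain))
# {'/old-page': '/final-page', '/new-page': '/final-page'}
```

With the flattened map in hand, you update each redirect rule on your server so every old URL jumps to its final destination in a single hop.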
Here’s the truth:
Google hates duplicate content.
They only like new and fresh content. The problem is that most sites end up being affected by duplicate content at some point.
This is especially true for ecommerce stores. In fact, I have never had an ecommerce SEO client that hasn’t had duplicate content on their store.
It really is that bad.
Lucky for you, SEO Spider makes it easy to find and eliminate duplicate content quickly.
All you have to do is click on the “Content” tab and select “Exact Duplicates” or “Near Duplicates” from the Filter menu.
This report shows you all of the pages that have internal duplicate content.
Now you just need to apply the correct canonicalization tags to fix the problem.
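Screaming Frog’s actual near-duplicate detection is more sophisticated, but the underlying idea, measuring how much text two pages share, can be sketched with a simple shingle comparison in Python (an illustration only, not the tool’s algorithm):

```python
def shingles(text, size=3):
    """Split text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a, text_b, size=3):
    """Jaccard similarity of the two shingle sets: 1.0 means identical
    text, 0.0 means no overlap. A tool would flag pages scoring above
    some threshold as near duplicates."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

print(similarity("red shoes for men on sale",
                 "red shoes for men on sale"))  # 1.0
```

Pages that score close to 1.0 are candidates for consolidation with a canonical tag pointing at the version you want indexed.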
You don’t want to block search engines from the wrong pages. That means making sure that your robots.txt rules and other directives are in the right places.
Search engine crawlers like Googlebot and Bingbot visit your site regularly, and so do plenty of other bots.
Before entering your site, they look to see what pages they are allowed to crawl.
Screaming Frog SEO Spider makes it easy to check which website pages block crawlers by robots.txt, meta robots or X-Robots-Tag directives.
So here’s what you need to do:
First, click on the “Response Codes” tab and select “Blocked by Robots.txt”.
Here you will see which URLs are currently blocked from search engines and other crawlers.
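If you want to spot-check a rule yourself, Python’s standard library ships a robots.txt parser that answers the same question a crawler asks before fetching a page. The robots.txt content below is a made-up example:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt - substitute your site's actual rules.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask whether a given crawler may fetch a given URL, just like
# well-behaved bots do before requesting a page.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login")) # False
```

This is handy for confirming that a fix to your robots.txt actually unblocks (or blocks) the URLs you intended before you redeploy it.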
With the paid licence, you can also configure and change your Robots.txt file.
You want Google to find and index all your site pages, right?
The easiest way to ensure that Google finds all of your pages is by submitting an XML sitemap in Google Search Console.
Don’t have a sitemap?
With Screaming Frog’s SEO Spider, you can easily create an XML sitemap for your site.
Just follow the instructions below to learn how to do it.
At the top, click “Sitemaps” and then “XML Sitemap”.
This will open up all of the sitemap configuration options.
Only pages with a 200 OK (2xx) response from the crawl will be included in the XML sitemap generation. This box should be checked by default.
Leave everything else as it is and click “Next”. By default, Screaming Frog should choose the best settings for most websites for the rest of the steps.
The SEO Spider will create the sitemap for you. That’s all there is to it!
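For reference, the file the SEO Spider generates follows the sitemaps.org XML protocol. Here is a minimal Python sketch that builds the same kind of XML from a list of pages that returned 200 OK (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a
    list of page URLs that returned a 200 OK response."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

A real sitemap can also carry optional `lastmod`, `changefreq` and `priority` tags per URL, which is what the SEO Spider’s configuration screens let you toggle.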
Now follow this search engine submission tutorial to learn how to submit your XML sitemap to Google and Bing.
Your site architecture plays a more important role in SEO than most people think.
Google likes to see an organised and user-friendly site structure that helps people find what they are looking for efficiently.
That’s why most website owners use a silo structure.
Screaming Frog SEO Spider allows you to visualise your site structure and architecture. This makes it easy to identify opportunities to:
In the top menu bar, click “Visualisations”, then “Force-Directed Crawl Diagram”.
Think of this report like a heat map. You can see every page on your website and how it links to other pages.
Zoom in to specific pages to see the smaller internal links between pages.
Remember:
The best site architecture has a maximum crawl depth of 3. This means you should be able to reach any page on your website from your home page within 3 clicks.
Then, select the “Directory Tree Graph” from the “Visualisations” menu to see your click depth.
Now you can see how many clicks every page on your site is from the home page.
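Click depth is just a breadth-first search from the home page over your internal-link graph. Here is a small Python sketch, using a made-up site structure:

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search from the home page over an internal-link
    graph ({page: [linked pages]}) to get each page's click depth."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/blog", "/services"],
    "/blog": ["/blog/seo-audit"],
    "/services": [],
}
print(click_depths(site))
# {'/': 0, '/blog': 1, '/services': 1, '/blog/seo-audit': 2}
```

Any page that comes out deeper than 3, or that never appears in the result at all (an orphan page), is a candidate for better internal linking.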
That’s it! Pretty cool, right?
Do you have a JavaScript website?
Using its integrated Chromium rendering engine, Screaming Frog’s SEO Spider can crawl dynamic, JavaScript-rich websites and frameworks.
This is a unique feature compared to most SEO audit tools.
JavaScript crawling is typically slower and more resource-intensive than crawling regular HTML websites. While this might not affect small websites much, it’s a real problem for larger ones.
First, you need to configure the SEO Spider to crawl JavaScript.
Click “Configuration” > “Spider” > “Rendering”.
Now change “Rendering” to “JavaScript” in the dropdown menu.
Choose your user agent for crawling. I recommend choosing the Google mobile user agent because of mobile-first indexing.
Lastly, click on the “Crawl” tab and check all of the boxes.
Now just paste your website in the crawl bar and crawl it again.
The SEO Spider will get to work and crawl your entire JavaScript site. You can use all of the features like normal!
The SEO spider can be integrated with a number of Google tools.
This includes:
Why bother?
Simple – It makes Screaming Frog’s SEO Spider even more powerful.
Rather than just crawling your website to extract data, the SEO Spider also pulls in data from these Google tools.
With the PageSpeed Insights integration, you can bulk-test your Google PageSpeed scores directly inside Screaming Frog.
This makes it easy to identify opportunities to increase website speed without needing to test each individual URL.
It will save you a ton of time while pointing out the URLs that need work.
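Under the hood, this relies on Google’s public PageSpeed Insights v5 API. As a rough sketch of what a bulk test does, here is how one request URL is built in Python. The endpoint is Google’s real one; the page URL and API key are placeholders:

```python
from urllib.parse import urlencode

# The public PageSpeed Insights v5 endpoint (an API key is assumed).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, api_key, strategy="mobile"):
    """Build the request URL for testing one page. A bulk tester loops
    over every crawled URL, fetches each of these, and reads the
    performance score from the JSON response."""
    params = urlencode({"url": page_url, "strategy": strategy, "key": api_key})
    return f"{PSI_ENDPOINT}?{params}"

print(psi_request_url("https://example.com/", "YOUR_API_KEY"))
```

Fetching each URL (for example with `urllib.request`) returns a JSON report per page; tools like the SEO Spider simply automate that loop and tabulate the scores for you.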
Google integrations are a paid licence feature. But they are definitely worth it if you want to take full advantage of the SEO Spider.
SEO Spider is a necessary tool to have on hand for all SEOs.
There isn’t really another tool like it.
Download the free version and use it to implement the 8 steps above. It will help diagnose all of your big technical and on-page SEO errors that can affect your rankings.
Ready to take it up a level?
Get a paid licence and unlock a huge amount of extra features like unlimited crawling and integrations with Google tools.