Why Screaming Frog is Awesome for Marketing & Webmasters

Screaming Frog is, in my humble opinion, an essential tool for SEO. It's an amazing piece of software which provides you with a wealth of information about your site in just a few clicks of the mouse.

I recently gave a knowledge-share presentation within our agency on the capabilities of the tool, because it's not only useful for search optimisation purposes but also for various webmaster-related tasks, or simply for a quick overview of the main issues with a website.

I’d like to briefly talk through a few uses to show why it should be part of your tool-kit.

Mapping Your Website

As a site expands, mapping its pages can become more and more difficult. A website with a blog or an e-commerce store is likely to grow steadily over time as new articles and products are added, and keeping track of all those URLs can be a real headache. Why would you need to know? Well, if you're re-designing a website then you'll need to ensure all the old pages are properly redirected to the new ones once it's launched, or you'll risk confusing Google and losing your current search rankings as all your old indexed pages lead to 404 error pages.

With Screaming Frog you can crawl and export all pages on your website to properly map and redirect them pre-launch.

From the ‘Internal’ tab you can click to filter only HTML pages, then export them into a .csv file to turn into a wonderful spreadsheet which can easily become a sitemap of all your pages to be used for future optimisation and mapping.

You'll note in the above screenshot that I've also highlighted the status code: 200 shows the pages are alive and working, while other codes might mean you need to consider your options (especially if you see 404s in there).
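
If you'd rather script the mapping step, here's a minimal sketch of the idea in Python. It assumes you've saved the 'Internal' HTML export as internal_all.csv and that the export has an 'Address' column; both the filename and the column layout are assumptions, as exports vary between versions, so check your own file first:

import csv

# Minimal sketch: turn a Screaming Frog 'Internal' HTML export into a redirect map template.
# Assumes the export is saved as internal_all.csv with an 'Address' column; some exports
# add an extra title row above the headers, so delete that first if yours has one.
with open("internal_all.csv", newline="", encoding="utf-8") as crawl, \
     open("redirect_map.csv", "w", newline="", encoding="utf-8") as redirect_map:
    writer = csv.writer(redirect_map)
    writer.writerow(["Old URL", "New URL"])      # the 'New URL' column is filled in by hand
    for row in csv.DictReader(crawl):
        writer.writerow([row["Address"], ""])    # one row per crawled page

From there the redirect map can go straight to whoever is handling the redirects at launch.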

Testing Your Search Results

Hopefully, if you're reading this, you're aware that page titles and meta descriptions are the elements of your website that appear in the search results. In a perfect world, these are what your potential customers will see when they first search for your products or services. There are many reasons why they're important: they influence whether someone clicks through to your site, they help search engines understand what each page is about, and they set expectations for visitors before they land.

These and other considerations (such as not having duplicate meta and confusing people/search engine bots), are reasons why getting the meta on your website right is so important. And Screaming Frog is a great tool for helping you to do just that.

A recent update to the way Google displays search results meant that page titles are now displayed based on pixel width (rather than just character count) and the keyword(s) people are searching for. Without getting into a deep conversation on this (or arguing about what the right character length is), this essentially means that what is displayed will depend on the formatting. If your title or description is too long it'll be truncated, which may make it less appealing and stop your message getting across. Luckily Screaming Frog has the ability to show you what your search results might look like:

Using the SERP Snippet functionality you can see the predicted search results for each page on your website. As you can see from the above screenshot our website’s title for the /about-us/ page is currently truncated, but the description is just the right length.

On other pages you might find the Meta description is missing entirely:

Although this isn't an accurate representation of the search results, as Google will commonly just add its own description based on the page contents, it still shows why it's important to optimise this data. Information on which pages have missing, duplicated or long/short page titles and meta descriptions is also available elsewhere in the tool.
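
If you want a rough sanity check outside the tool, a few lines of Python can flag titles that are likely to be cut off. This is only a sketch: the 60-character limit is a stand-in for Google's pixel-width rule, and the URLs and titles are made up for illustration.

# Rough sketch: flag page titles that are likely to be truncated in the search results.
# The 60-character limit is only a stand-in for Google's pixel-width rule, so treat it as a guide.
pages = {
    "/about-us/": "About Us | An Illustrative Title That Rambles On For Rather Too Long To Fit",
    "/contact/": "Contact Us | Example Company",
}  # hypothetical URLs and titles

TITLE_LIMIT = 60
for url, title in pages.items():
    verdict = "likely truncated" if len(title) > TITLE_LIMIT else "looks fine"
    print(f"{url}: {len(title)} characters, {verdict}")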

Internal Errors and Usability

When people are browsing your website you want them to be able to find everything they're looking for without stumbling across any errors. Error pages (custom or otherwise) are off-putting, unhelpful and may frustrate your users and send them elsewhere.

Part of any website optimisation should therefore include eliminating error pages. The main ones you'll find on any website are what we call 404 errors, 404 being the status code that tells the browser (and search engine) what's happening with the page: in this case, 'not found'. There are many reasons why a 404 page might exist: a page was deleted and not redirected, a product was removed and not replaced with an updated model, or an old version of the website wasn't accounted for in a redesign. But as far as possible we need to work to reduce these errors.

There are a few ways we can combat these problems. Google's Webmaster Tools will commonly show you the 404 errors which occur because of external links pointing to your website, and it's certainly a good starting point.

(Hopefully your graph looks something like that with errors going down).

But Screaming Frog can show you the 404 pages which exist because of errors on your website itself, and these are arguably the more important errors to deal with.

In this example I found a .pdf file of a case study on our website which no longer existed (because an updated version had been created and uploaded). The result was a 404 page because the original URL had not been redirected. Clicking on the In Links tab at the bottom of the tool shows which pages are linking to this broken URL. So not only can we add a 301 redirect from the old URL to the new one (in case anyone linked to it externally, bookmarked it or emailed the link) but we can also fix the broken link on the website itself as we know what page it’s on.

Using these two methods we can combat both internal and external errors that are leading to 404 pages on our website and thus improve the user experience, reduce bounce rates and hopefully improve conversions as a result. There’s some (logical) argument that 404 errors may well harm your search engine rankings as well, so it’s pretty important to keep on top of these things.

It’s worth noting that Screaming Frog have rather kindly given us the ability to export these 404 pages (with ‘In Links’) into a spreadsheet so you can send it to your development agency or webmaster to deal with.
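
Once a redirect like the one above is in place, it's easy to double-check it with a couple of lines of Python. This is just a sketch using the third-party requests library, and the URLs are made up for illustration; Screaming Frog will show you the same status codes on a re-crawl anyway.

import requests

# Sketch: confirm that an old, broken URL now redirects to a live replacement.
# The URLs are made up for illustration and 'requests' is a third-party library.
old_url = "https://www.example.com/downloads/case-study-v1.pdf"   # previously returned 404
new_url = "https://www.example.com/downloads/case-study-v2.pdf"   # the updated file

for url in (old_url, new_url):
    response = requests.head(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in response.history]               # e.g. [301] once the redirect is live
    print(url, "->", response.status_code, "via", hops)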

Validating Code

As a digital marketer, one of the things I have to deal with is websites that suddenly lose tracking or develop reporting issues. Commonly this is because updates have been pushed to a site but the developer forgot to check that the Google Analytics (or other tracking) code was still in place.

Screaming Frog gives us the power to check whether a particular bit of code is on a page or not. You can search for a snippet and filter the pages which contain it from those that don't.

Let's take an imaginary scenario where we want to add Google Tag Manager code to a website. We need it on every page of the site, so we need to check it's present on each one to make sure we can track our results properly and our monitoring tools are working.

Google Tag Manager commonly starts with this string of code:

<!-- Google Tag Manager -->

We can check the site for this snippet using Screaming Frog's 'Custom' filters. Under Configuration there's an option for 'Custom':

Selecting this allows us to put in our custom code. You can use multiple filters and choose whether to check if the site contains that code or does not contain it.

These results then allow you to work out which pages need fixing.

You could theoretically use this custom filtering to check for any bit of code that's needed site-wide, or even use it to check for duplicate body content when optimising pages to make sure they're unique.
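
For a handful of pages you could even do the same 'contains / does not contain' check by hand with a short script. This is only a sketch: the URLs are placeholders and it uses the third-party requests library, whereas Screaming Frog's custom filters run the check across the whole crawl for you.

import requests

# Sketch: check whether each page contains the Google Tag Manager comment marker.
# The URLs are placeholders and 'requests' is a third-party library.
SNIPPET = "<!-- Google Tag Manager -->"
pages = [
    "https://www.example.com/",
    "https://www.example.com/about-us/",
    "https://www.example.com/contact/",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    status = "contains the GTM snippet" if SNIPPET in html else "is missing the GTM snippet"
    print(f"{url} {status}")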

Pages Blocked From Search Engines

Another issue I come across with far too much regularity is pages (or entire websites) being blocked from the search engines due to code on the site, either in the form of a line in a robots.txt file or a meta tag that looks a little something like this:

<meta name="robots" content="noindex, nofollow">

These tags are often accidentally left on a new site once it goes live and, if not picked up and dealt with, could (worst case) lead to the entire website being dropped from the search results: a big problem for the business owner, who would lose out on leads or revenue as a result.

Screaming Frog can help us pick up on these pages during a crawl.
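
For a quick manual spot check of a single page you could also look for the tag yourself. The sketch below uses the third-party requests and BeautifulSoup libraries with a placeholder URL; within Screaming Frog the equivalent information is reported for every crawled page.

import requests
from bs4 import BeautifulSoup

# Sketch: spot-check one page for a noindex meta robots tag. The URL is a placeholder,
# and 'requests' and 'beautifulsoup4' are third-party libraries.
url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

robots = soup.find("meta", attrs={"name": lambda v: v and v.lower() == "robots"})
content = robots.get("content", "").lower() if robots else ""
if "noindex" in content:
    print(url, "is blocked from the index:", content)
else:
    print(url, "has no noindex tag")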

Sitemap.XML

The XML sitemap is a simple yet essential part of any website: basically a list of all your pages in a format that search engines can easily find, crawl and understand. It's often missed out, not updated when a website is changed, or improperly formatted. Another hidden gem of Screaming Frog is its ability to crawl your site and create a sitemap.xml file for you to upload straight to your site. Though for an e-commerce website we'd recommend a dynamic sitemap, this is still a great feature for smaller/static websites.
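
To give an idea of what the finished file contains, here's a minimal sketch that writes a tiny, valid sitemap.xml from a list of page URLs. The URLs are placeholders; in practice the list would come from your crawl or CMS, and Screaming Frog builds the file for you anyway.

# Sketch: write a minimal, valid sitemap.xml from a list of page URLs.
# The URLs are placeholders; in practice the list would come from your crawl or CMS.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about-us/",
    "https://www.example.com/contact/",
]

with open("sitemap.xml", "w", encoding="utf-8") as sitemap:
    sitemap.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    sitemap.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        sitemap.write(f"  <url><loc>{url}</loc></url>\n")
    sitemap.write("</urlset>\n")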

Redirect Chains

A redirect chain may occur for many reasons. Let's say there's a website that's been around for a few years. The current site probably looks a fair bit different to the one that originally launched back in 2006. After a redesign, old URLs were redirected to new ones, and then it happened again in 2010 and again in 2014. What's the result of this? If a loyal customer had bookmarked the site back in 2006, the links in their browser may well have been redirected several times since then. The result is a redirect chain. Let's say the original URL looks like this:

http://www.example.com/page-1.html

Then it became:

http://www.example.com/homebrewing-kits.html

Which in turn was later updated to:

http://www.example.com/brewing/homebrewing

Now if each was redirected to the next, that means http://www.example.com/page-1.html goes through two redirects before it reaches the right place. Every extra redirect means a slower-loading website and a more frustrated user. Ideally we need to go back, find the original URLs and redirect them straight to the newest ones.
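
A quick way to see a chain for yourself is to follow an old URL and count the hops, something like this sketch (using the third-party requests library and the made-up example URLs above):

import requests

# Sketch: follow the old URL and list every redirect hop it goes through.
# Uses the made-up example URL above and the third-party 'requests' library.
response = requests.get("http://www.example.com/page-1.html", allow_redirects=True, timeout=10)

for hop in response.history:                       # each intermediate redirect
    print(hop.status_code, hop.url)
print("Final:", response.status_code, response.url)
print("Redirect hops:", len(response.history))     # ideally one hop: old URL straight to the newest page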

Screaming Frog picks up on current redirect chains and allows us to easily fix them by showing where each chain starts and where it should finish:

Again, this can be exported into a handy little spreadsheet to send to developers or deal with yourself!

Conclusion?

Screaming Frog is a workhorse for SEO and webmasters. This has just been a quick overview of some of the more in-depth features; I haven't covered the main day-to-day uses as there are so many and they're fairly obvious. Here's a quick list of general things I use it for: reviewing page titles and meta descriptions, finding broken links and response code issues, checking redirects, and pulling quick lists of URLs for reporting.

These things are not only great for SEO audits but also for ongoing website maintenance and optimisation. It's worth noting that Screaming Frog is available as a free tool (with a limit of 500 URLs) and at a very reasonable annual price for the full version, so there's no reason not to use it!
