Most organisations now realise that SEO is a hugely important element of any marketing strategy.

SEO dictates the level of visibility of your brand and is an inextricable part of any content marketing strategy.

So how do you ensure that your websites tick all the right boxes?

How do you make sure that your websites are performing as efficiently and effectively as possible?

To begin with, search engine optimisation is now all about value and quality. Content is King… right?

Not so fast!

Your content may be exceptional and your website may be state of the art in design and user experience, but technical mistakes will cost you trust to varying degrees and see you marked down as a preferred solution for a given query.

The rise of technical compliance

The shift towards technical compliance has been incremental, and Google has been very clear that fast, efficient, error-free websites that don’t suffer from thin content or spammy backlinks will perform well in its search results.

A series of major algorithm updates has targeted specific sets of tactics – Panda, Penguin, Pigeon, Pirate, EMD, HTTPS/SSL, Payday Loan, Page Layout, Hummingbird, DMCA Penalty and several unnamed and unannounced updates.

Overall the requirements are fairly clear, but working through the growing range of compliance factors has become increasingly complex.

Google Webmaster Tools

Webmaster Tools has become an essential addition to website performance analysis, and it is where a large portion of your research can be done. Google continually adds features to the tool and gives recommendations on what can be improved on your site.

This is also where Google advises if you have received a manual penalty for non-compliant behaviour or if you have security issues such as malware.

It is not surprising, then, that I recommend Webmaster Tools be added to every website and that the data be continually monitored.

Other recommended tools

There are a number of very useful tools that take care of different elements of your website. We use the Moz Suite, Ahrefs, Majestic, Screaming Frog and others to analyse backlink profiles.

A clean, high quality link profile is extremely important. Any sign of manipulative behaviour can attract severe penalties, resulting in removal from search results or a dramatic reduction in visibility.

What are SEO audits and how do they work?

An SEO audit looks at all of the elements of your website and wider web presence. The resulting report gives you actionable recommendations that you can work through with your developers or adjust yourself through your CMS.

The outcome will be a website that ticks all the boxes for search engines and will offer a better experience for your human visitors.

I will run through the various elements of the audit now.

Crawl errors

The following areas should be checked:

  • DNS – problems connecting will hurt your site
  • Server connectivity – frequent server outages can punish your search positions
  • Robots.txt fetch – if Google can’t fetch your robots.txt it may delay crawling your site
  • URL errors on desktop, smartphone and feature phone – small numbers of these are OK, but once they build up (I have seen sites with more URLs returning a 404 than actual pages on the site) your site is seen as giving a poor experience to the user
  • Not Found, soft 404 and server errors – as above, these all contribute to a poorer user experience (a quick way to spot-check status codes in bulk is sketched below).
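
As a rough illustration, here is a minimal sketch of how you might spot-check a handful of URLs for 404s, server errors and connection problems using nothing more than the Python standard library. The URLs are placeholders – swap in pages from your own site.

```python
# A rough sketch only: spot-check a handful of URLs for errors using the
# standard library. Replace the placeholder URLs with pages from your site.
import urllib.error
import urllib.request

urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for url in urls_to_check:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print(url, response.status)  # 200 is what we want to see
    except urllib.error.HTTPError as error:
        print(url, error.code)  # e.g. 404 Not Found or 500 server error
    except urllib.error.URLError as error:
        print(url, "connection problem:", error.reason)  # DNS or server issue
```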

Search queries

Evaluation of current page positions, impressions and clicks.

How relevant are your top-performing non-branded keywords, and what is the click-through rate like? Is the overall trend in the SERPs healthy or is it a cause for concern?

Structured data

Review whether structured data is relevant to your site. This page mark-up is very useful for particular industries but must be managed carefully to stay within Google’s guidelines.
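
If structured data is appropriate, schema.org mark-up in JSON-LD format is one common way to add it. The sketch below shows roughly how an Organization block could be generated; the organisation details are invented placeholders, so adapt them to your own brand and check the result against Google’s guidelines.

```python
# A rough sketch only: build schema.org Organization mark-up as JSON-LD.
# All of the organisation details are invented placeholders.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Pty Ltd",
    "url": "https://www.example.com/",
    "logo": "https://www.example.com/logo.png",
}

# The output would sit inside a <script type="application/ld+json"> block on the page.
print(json.dumps(organization, indent=2))
```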

HTML improvements

Here we are looking for duplicate, long or short title tags or meta descriptions.

There are many “out-of-the-box” duplication issues that are potentially harmful, and these are common to most CMSs. Blog archives, category management, author archives and other taxonomies can all produce accidental duplicate content.

Dynamic URLs are also a major contributor to duplicate content.

Way too often people see the title tag and meta description as somewhere to be as comprehensive as possible. Title tags should never be longer than 60 characters including spaces, and meta descriptions should never be longer than 156 characters including spaces.

Short titles are often caused by automatic title generation that uses just the page name, e.g. About.
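
To make the length rules concrete, here is a minimal sketch that pulls the title and meta description from a page and flags anything over the 60/156 character guidelines. It uses only the Python standard library; example.com is a placeholder.

```python
# A rough sketch only: pull the title and meta description from a page and
# flag anything over the 60/156 character guidelines. example.com is a placeholder.
import urllib.request
from html.parser import HTMLParser

class TitleDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

with urllib.request.urlopen("https://www.example.com/", timeout=10) as response:
    parser = TitleDescriptionParser()
    parser.feed(response.read().decode("utf-8", errors="replace"))

print(f"Title ({len(parser.title)} chars): {parser.title}")
print(f"Description ({len(parser.description)} chars): {parser.description}")
if len(parser.title) > 60:
    print("Title exceeds 60 characters and may be truncated in results")
if len(parser.description) > 156:
    print("Description exceeds 156 characters and may be truncated in results")
```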

Links to your site

As well as Webmaster Tools, it is important to look at your link profile with Ahrefs and Majestic to ensure there are no concerns with either the quality of your links or the anchor text used.

Since the Penguin update was launched, it has been poor practice to build links through submissions, article marketing, commenting, or any other practice where the link has not been earned and placed by a third party.

Triggers for a penalty include the number of low quality sites linking to yours, lack of relevance of the sites to your business, repeated keyword rich anchor text, sitewide footer links, widgets and more.

Penguin penalties are probably the most severe of all Google penalties.

Manual actions

Check that there are no manual actions by Google. This is something that you definitely do not want. Even large global brands have been hit with these when they have crossed the compliance line.

Security issues

Check for malware and other security threats.

International targeting

Check that the website has the correct international targeting and that hreflang tags are used if needed.

International SEO is one of the most complex and challenging strategies, crossing both geographical and language boundaries.

It is not advisable to serve the same content in multiple geographic territories, and I have seen some very large brands make this mistake.
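
Where separate regional versions do exist, hreflang tags tell Google which version belongs to which audience. The sketch below shows roughly how the full set of alternate tags could be generated for one page; the domains are invented placeholders.

```python
# A rough sketch only: generate the hreflang alternate tags for one page that
# exists in several regional versions. The domains are invented placeholders.
regional_versions = {
    "en-au": "https://www.example.com.au/widgets/",
    "en-nz": "https://www.example.co.nz/widgets/",
    "en-gb": "https://www.example.co.uk/widgets/",
    "x-default": "https://www.example.com/widgets/",
}

# Every regional page should carry the full set of alternates, including itself.
for hreflang, url in regional_versions.items():
    print(f'<link rel="alternate" hreflang="{hreflang}" href="{url}" />')
```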

Index status

Number of indexed pages – does this align with the number of value-giving pages on the site? Are there duplication or thin content issues?

One audit I worked on found that the site had over 100,000 pages indexed, but 99,000 of them were caused by a calendar function that created event pages for the next 10 years. All of these pages were blank, so the ratio of value-giving pages to thin content pages was extremely poor.

We removed the entire directory from Google and the website performance improved significantly.
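
One rough way to spot this kind of index bloat, assuming the site publishes a standard XML sitemap, is to group the sitemap’s URLs by their first path segment and count them. A minimal sketch (example.com is a placeholder):

```python
# A rough sketch only: count sitemap URLs by their first path segment to spot
# index bloat. Assumes a standard sitemap.xml; example.com is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET
from collections import Counter
from urllib.parse import urlparse

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

with urllib.request.urlopen("https://www.example.com/sitemap.xml", timeout=10) as response:
    tree = ET.parse(response)

counts = Counter()
for loc in tree.iter(f"{SITEMAP_NS}loc"):
    path = urlparse(loc.text.strip()).path
    first_segment = path.strip("/").split("/")[0] or "(homepage)"
    counts[first_segment] += 1

# A directory with tens of thousands of entries, like the calendar above, stands out.
for segment, count in counts.most_common(10):
    print(f"{segment}: {count} URLs")
```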

Robots.txt tester

Test robots.txt for errors – I have seen people accidentally block search engines from their entire site by using incorrect directives here. It is very important to get this element absolutely correct.
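
Python ships with a robots.txt parser, so a quick sanity check might look something like the sketch below – it simply asks whether a handful of important URLs are still fetchable by Googlebot. The domain and URLs are placeholders.

```python
# A rough sketch only: ask Python's built-in robots.txt parser whether Googlebot
# can still fetch your key URLs. The domain and URLs are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/",
]

for url in important_urls:
    if parser.can_fetch("Googlebot", url):
        print("Allowed for Googlebot:", url)
    else:
        print("BLOCKED for Googlebot:", url)
```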

Pagespeed insights

Slow web pages have a negative effect on site performance. Frequent problems include unoptimised images, a lack of compression, missing browser caching, render-blocking JavaScript and slow server response times.

You also need to look at both desktop speed and mobile speed. Mobile speed is becoming increasingly important and is a strong factor in mobile search, as Google wants its users to be able to find solutions quickly wherever they are.
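
Before diving into the full PageSpeed Insights report, a very rough first check is simply to time the server response and full download yourself. A minimal sketch (example.com is a placeholder):

```python
# A rough sketch only: time the server response and full download for a page.
# example.com is a placeholder; this is no substitute for PageSpeed Insights.
import time
import urllib.request

url = "https://www.example.com/"

start = time.perf_counter()
with urllib.request.urlopen(url, timeout=30) as response:
    headers_received = time.perf_counter()  # server has responded
    body = response.read()
    finished = time.perf_counter()

print(f"Server response time: {headers_received - start:.2f}s")
print(f"Total download time:  {finished - start:.2f}s")
print(f"Page size:            {len(body) / 1024:.0f} KB")
```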

Redirects

Have these been handled correctly? Are 301s used? Are there duplicate versions of the site live?

It is very common to see both the www and non-www versions of a website live. And for some reason many developers serve a different URL when you return to the homepage from elsewhere in the site – usually with an addition like index.html.

Sometimes 302 (temporary) redirects have been used where a 301 (moved permanently) is the correct response.

This is poorly managed in many site redesigns and redevelopments. URLs are changed but no thought is given to what happens to the old URLs.
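
A quick way to see how the common homepage variants behave is to request each one without following redirects and look at the raw status codes. The sketch below assumes https://www.example.com/ is the canonical version; all the URLs are placeholders.

```python
# A rough sketch only: request the common homepage variants without following
# redirects, so the raw status codes are visible. https://www.example.com/ is
# assumed to be the canonical version; all URLs are placeholders.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None stops urllib from following the redirect.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/index.html",
]

for url in variants:
    try:
        response = opener.open(url, timeout=10)
        print(url, response.status, "(no redirect)")
    except urllib.error.HTTPError as error:
        # 301 is what we want to see here; 302 signals only a temporary move.
        print(url, error.code, "->", error.headers.get("Location"))
```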

Image optimisation

Do images have useful file names? Are there alt tags for all images? Are they compressed and of a reasonable file size?

Alt tags are the primary indicator of what the image is about, but the other elements are also read by search engines.
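
As a rough illustration, here is a minimal sketch that lists the images on a page that are missing alt text, again using only the Python standard library; example.com is a placeholder.

```python
# A rough sketch only: list the images on a page that have no alt text,
# using the standard library. example.com is a placeholder.
import urllib.request
from html.parser import HTMLParser

class ImageAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # missing or empty alt attribute
                self.missing_alt.append(attrs.get("src", "(no src)"))

with urllib.request.urlopen("https://www.example.com/", timeout=10) as response:
    checker = ImageAltChecker()
    checker.feed(response.read().decode("utf-8", errors="replace"))

for src in checker.missing_alt:
    print("Image without alt text:", src)
```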

Heading tags

Correct heading tag implementation is very important. The correct frequency, length and relevance make a big difference to your content structure.

Every page should only ever have one H1 heading tag, and it should be clearly aligned with the page’s intent. H2s and H3s need to be of an appropriate length and should follow a hierarchy that gives a full picture of the page’s content emphasis.

Developers often use heading tags for text sizing rather than as a content map, so you may end up with fairly useless H2 tags used for navigation, such as “Quote”.
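
Following the same pattern as the earlier title check, the sketch below prints a page’s heading outline so you can see at a glance whether there is exactly one H1 and a sensible hierarchy beneath it. example.com is a placeholder.

```python
# A rough sketch only: print a page's heading outline so the H1 count and
# hierarchy are obvious at a glance. example.com is a placeholder.
import urllib.request
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    def __init__(self):
        super().__init__()
        self.current = None
        self.headings = []  # [tag, text] pairs in document order

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.current = tag
            self.headings.append([tag, ""])

    def handle_endtag(self, tag):
        if tag == self.current:
            self.current = None

    def handle_data(self, data):
        if self.current:
            self.headings[-1][1] += data

with urllib.request.urlopen("https://www.example.com/", timeout=10) as response:
    outline = HeadingOutline()
    outline.feed(response.read().decode("utf-8", errors="replace"))

h1_count = sum(1 for tag, _ in outline.headings if tag == "h1")
print(f"H1 tags on page: {h1_count} (there should be exactly one)")
for tag, text in outline.headings:
    print(f"{tag.upper()}: {text.strip()}")
```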

Metadata

Quality of titles and descriptions – no keyword stuffing, correct length, calls-to-action, compelling copy.

This is where you get to put your best foot forward. SEO titles and meta descriptions are where you get to say exactly what the page is about and give useful, compelling information with a teaser or call-to-action.

It is not a place to put multiple keywords.

This is your shop window – it is how you will be represented in search and in social media.

How many times have you seen a social media share that looks like this? “Widgets | green widget | blue widget | Sydney | Melbourne | Auckland”

Rubbish, isn’t it?

SEO page titles are one of the most important content signals for Google so don’t blow it!

Social signals

What is the quality and influence of your social media properties? Are they adding value to your SEO or are they hurting it?

Search engines track the level of engagement on social media. A brand that is popular on social media will usually be a good result for a given query – as long as the other factors line up as well.

How influential is your brand?

Content strategy

Google has very clear recommendations for content: well structured, easy to crawl, comprehensive (in-depth articles), relevant, focused, popular and more.

Does your content strategy measure up?

How frequently do you publish?

Is the content optimised and structured correctly?

Is the content popular?

How many people are viewing and sharing it?

How professional is the standard of writing? Is spelling and grammar of a high standard?

Summary

In an SEO audit all of these elements are checked and analysed. Then recommendations can be made on the key areas that need correction or improvement.

Each of the performance improvements and error corrections will incrementally improve your website’s search visibility.

It is interesting to track just how quickly the sudden appearance of errors or a slow-down in page loading can affect visitor numbers. At the other end, these technical fixes do not always result in immediate changes for the better.

Sometimes it can take several weeks for the changes to be trusted and rewarded.

It is important to implement a regime of regular audits. Crawl errors from a long-dead version of your website can suddenly appear for no apparent reason, so you must watch closely. And a spam site linking to yours thousands of times can do serious damage if it is not dealt with quickly.

Do you have a regular SEO audit strategy in place?