Top Mistakes to Avoid When Auditing a Website

Whether you own or manage a website, there is an ongoing battle with SEO, or Search Engine Optimisation. One day a site works well and ranks near the top; then an algorithm change comes along and suddenly it is on page five and traffic has died. This means there is a constant need to audit websites and keep an eye on the key areas, as any top B2B marketing agency will tell you. But there are mistakes many people make when auditing a website that can turn the process into a failure. Here are some of the top ones to avoid.

Before you start

Before you start, remind yourself of the key areas for any website and read up on the latest changes that might affect your site. The key areas include content quality, mobile responsiveness, site speed and authority.

Looking at these areas first is always important in an audit as they can be at the heart of a number of problems.  

Leave yourself enough time to conduct a thorough audit – for a big website, this might take a few days.  But it will be worth it in the long run as you can maintain your top ranking spot.

Crawling

People often jump into a website audit without carrying out one very important task – crawling the entire website.  The most common problems can be highlighted during a comprehensive crawl of the site so it is always the first step you should take in an audit.  There are some good tools out there to help with the task, both paid and free.

Once you have selected your tool, enter the website URL and specify any subdomains that should also be crawled. The crawl is the cornerstone of the audit and will highlight errors and issues that you can deal with before moving on.
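
If you want a feel for what a crawl actually does, the sketch below is a bare-bones version in Python, assuming the requests and beautifulsoup4 libraries are installed and using example.com as a placeholder domain. A dedicated audit tool does far more, but the principle is the same: fetch a page, record its status code, and queue any internal links it finds.

```python
# A minimal same-domain crawler: fetches pages, records status codes
# and queues internal links. Dedicated audit tools do far more than this.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=50):
    domain = urlparse(start_url).netloc
    queue, seen, results = deque([start_url]), {start_url}, {}

    while queue and len(results) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            results[url] = f"error: {exc}"
            continue
        results[url] = resp.status_code

        # Only parse HTML responses when looking for further internal links
        if "text/html" in resp.headers.get("Content-Type", ""):
            soup = BeautifulSoup(resp.text, "html.parser")
            for link in soup.find_all("a", href=True):
                target = urljoin(url, link["href"])
                if urlparse(target).netloc == domain and target not in seen:
                    seen.add(target)
                    queue.append(target)
    return results

if __name__ == "__main__":
    # example.com is a placeholder for the site you are auditing
    for page, status in crawl("https://www.example.com").items():
        print(status, page)
```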

Accessibility

You want traffic to come to every page and post of your website, but sometimes accessibility issues can prevent this without you realising. If search engines can’t access areas of the website, they can’t register them as existing, so those pages won’t be of any benefit, and the blocked areas could negatively affect the overall ranking of the website.

The robots.txt file offers a simple way to check which pages and areas of the website are accessible: it lists the folders and pages that search engine bots are allowed to crawl and, crucially, any that are disallowed.
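
One rough way to test this yourself is with Python’s built-in urllib.robotparser, as sketched below; the domain and the list of pages are placeholders for pages you expect to be crawlable.

```python
# Check which URLs a crawler is allowed to fetch according to robots.txt.
# The URLs listed here are placeholders for pages you expect to be indexable.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()

pages_to_check = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/private/reports.html",
]

for url in pages_to_check:
    allowed = robots.can_fetch("Googlebot", url)
    print("allowed" if allowed else "blocked", url)
```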

If you have purposely left an area disallowed for crawlers, that is fine, but if anything on the list should be accessible, you can take steps to remedy the problem. Pages can also be hidden from search engines by a robots meta tag in the page’s head containing a ‘noindex’ or ‘nofollow’ value. If you see this on a page that should be indexed, you can remove it.
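
As a quick illustration, a check along these lines could look like the sketch below, again using requests and BeautifulSoup with a placeholder URL; it simply looks for a robots meta tag containing ‘noindex’ or ‘nofollow’.

```python
# Look for a robots meta tag that could stop a page being indexed or its
# links being followed, e.g. <meta name="robots" content="noindex, nofollow">.
import requests
from bs4 import BeautifulSoup

def robots_meta(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "robots"})
    return tag["content"] if tag and tag.has_attr("content") else None

# Placeholder URL for a post you expect to be indexable
directives = robots_meta("https://www.example.com/blog/some-post/")
if directives and ("noindex" in directives or "nofollow" in directives):
    print("Restrictive robots meta tag found:", directives)
else:
    print("No restrictive robots meta tag")
```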

There are also some common HTTP status codes to watch for when crawling:

  • 200 – OK, meaning the page loaded correctly
  • 301 – permanent redirect, meaning you have set up a lasting redirection to a new page, such as when the location of a post has changed
  • 302 – temporary redirect, meaning you have set up a short-term redirect, often while a page is being redesigned
  • 404 – not found, meaning the page isn’t where it is meant to be
  • 503 – service unavailable, meaning the server couldn’t serve the page when the crawler requested it
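
To see which of these codes your URLs return, something like the rough check below works; the URLs are placeholders, and redirects are left unfollowed so that 301s and 302s show up as themselves rather than as the final 200.

```python
# Report the HTTP status code each URL returns, without following redirects,
# so 301/302 responses are visible rather than hidden behind the final page.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/old-post/",       # might be a 301
    "https://www.example.com/missing-page/",   # might be a 404
]

for url in urls:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    line = f"{resp.status_code}  {url}"
    if resp.is_redirect or resp.is_permanent_redirect:
        line += f"  ->  {resp.headers.get('Location')}"
    print(line)
```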

Also, watch your XML sitemap, as this is the map that search engine crawlers use to find and check all of the pages on your website.

It should follow the standard sitemaps.org protocol format so that every crawler can read it.
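
As a rough illustration, the sketch below pulls the list of URLs out of a sitemap that follows that protocol, using Python’s standard xml.etree module and a placeholder sitemap address.

```python
# Pull the list of URLs out of an XML sitemap that follows the standard
# sitemaps.org protocol (<urlset> containing <url><loc> entries).
import xml.etree.ElementTree as ET
import requests

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

resp = requests.get("https://www.example.com/sitemap.xml", timeout=10)
root = ET.fromstring(resp.content)

urls = [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc") if loc.text]
print(f"{len(urls)} URLs listed in the sitemap")
for url in urls[:10]:
    print(url)
```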

On-page factors

There are several on-page factors that can cause problems with your website, and you should watch for them while conducting an audit. While the content of the page is always at the forefront of what you do, and is what earns your page the ranking you desire, there are some other factors to look at. From an SEO viewpoint, consider the following (a rough way to check a few of them in code is sketched after the list):

  • Does the page have enough content? Most experts say a minimum of 300 words
  • Does the content offer value for the reader? That means being well written, relevant and useful
  • Does it contain optimised images? This includes alt text and descriptive file names that search engines can read
  • Does it have targeted keywords? This involves research and a strategy to ensure they are relevant and what customers will search for
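
As a rough illustration of how a few of these checks could be automated, the sketch below counts visible words, flags images without alt text and looks for a target keyword in the page title; the URL and the keyword are placeholders for your own page and phrase.

```python
# Rough on-page checks for a single URL: visible word count, images
# without alt text, and whether a target keyword appears in the title.
import requests
from bs4 import BeautifulSoup

def on_page_checks(url, keyword):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Strip script/style elements so they don't inflate the word count
    for tag in soup(["script", "style"]):
        tag.decompose()

    words = soup.get_text(separator=" ").split()
    images_without_alt = [img.get("src") for img in soup.find_all("img")
                          if not img.get("alt")]
    title = soup.title.string.strip() if soup.title and soup.title.string else ""

    return {
        "word_count": len(words),
        "thin_content": len(words) < 300,   # the 300-word rule of thumb
        "images_without_alt": images_without_alt,
        "keyword_in_title": keyword.lower() in title.lower(),
    }

# Placeholder URL and keyword for illustration only
print(on_page_checks("https://www.example.com/blog/some-post/", "digital marketing"))
```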

You should also consider whether search engines can process the page easily: heavy JavaScript or Flash content, for example, can slow a site down or be difficult for crawlers to read. Also, consider the readability from a human viewpoint, as all content is ultimately written for people to read. This is often called the user experience.
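
If you want a quick, rough proxy for readability, a Flesch reading ease score can be computed from sentence, word and syllable counts; the sketch below uses a crude vowel-group syllable estimate, so treat the result as indicative only.

```python
# A very rough readability check: Flesch reading ease computed with a
# crude vowel-group syllable count. Higher scores read more easily.
import re

def syllables(word):
    # Count groups of consecutive vowels as a rough syllable estimate
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    word_count = max(1, len(words))
    syllable_count = sum(syllables(w) for w in words)
    return (206.835
            - 1.015 * (word_count / sentences)
            - 84.6 * (syllable_count / word_count))

sample = "Auditing a website takes time. Check your content, speed and links regularly."
print(round(flesch_reading_ease(sample), 1))
```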

Off-page factors

The final key factor to consider in your website audit is known as off-page ranking factors. These are things that originate outside the site but affect how it ranks. You can use tools such as Google Analytics to study your traffic and where it comes from, as well as competitive analysis tools to compare your website against your competitors’ and see how they measure up.

There are also measures you might take thinking they would boost traffic but that can have a negative effect such as keyword stuffing, cloaking and hidden text.  Google Search Console is a good place to make sure none of these issues is affecting how your site ranks.  
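
As a rough way to spot keyword stuffing in your own content, you can check what share of a page’s visible text a single phrase accounts for; the sketch below does exactly that with a made-up snippet of text and a placeholder phrase.

```python
# A crude keyword-density check: if one phrase makes up a large share of
# the visible text, the page may look like keyword stuffing to search engines.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    occurrences = sum(
        1 for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return occurrences * len(phrase_words) / max(1, len(words))

# Made-up text and phrase for illustration only
page_text = "Cheap shoes online. Buy cheap shoes online at our cheap shoes online store."
density = keyword_density(page_text, "cheap shoes online")
print(f"Keyword density: {density:.0%}")  # a very high share suggests stuffing
```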

Lastly, social engagement is one factor with an uncertain impact, but it shouldn’t be ignored. Engagement on social accounts such as a Facebook Page or Twitter account may help increase the visibility and ranking of a website.

Conclusion

Sometimes auditing a website can be an uphill battle against ever-changing algorithms, but if you keep the basics in mind and avoid the mistakes covered here, you will have a sound basis for a site that should withstand those changes. With time and good SEO practices, you can have a top-ranking website.

 
