We've run over 100 technical audits this year, and in the process gained deep insight into how technical structure affects a website's performance in search. This article highlights the most common technical SEO issues we encounter, focusing on the ones that deliver the largest gains in organic traffic when corrected.

1. Mismanaged 404 Errors

This happens quite a bit on eCommerce sites: when a product is removed or expires, the page is easily forgotten and starts returning a 404. Although 404 errors can erode your crawl budget, they won't necessarily kill your SEO. Google understands that sometimes you have to delete pages on your site.

However, 404 pages become a problem when they still receive organic traffic or have backlinks pointing to them, because that traffic and link equity goes to waste.

The best practice is to set up a 301 redirect from the deleted page to another relevant page on your site. This preserves link equity and ensures users can keep navigating seamlessly.
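As an illustration, a redirect mapping can be kept in a small script that emits server rules. The URLs below and the Apache `Redirect 301` output format are assumptions; substitute your own audit data and your server's syntax.

```python
# Sketch: generate Apache mod_alias rules for removed pages.
# The mapping below is hypothetical -- replace it with your own
# audit data (e.g. a crawl export of URLs returning 404).
REDIRECTS = {
    "/products/blue-widget-2019": "/products/blue-widget",
    "/sale/summer-2020": "/sale",
}

def build_rules(mapping):
    """Return one 'Redirect 301 <old> <new>' line per deleted URL."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in build_rules(REDIRECTS):
    print(rule)
```

Keeping the mapping in one place makes it easy to review that every deleted page points at a genuinely relevant replacement rather than a blanket redirect to the homepage.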

2. Website Migration Issues

When launching a new website, design changes, or new pages, there are a number of technical aspects that should be addressed ahead of time.

Common errors we see include old URLs left without 301 redirects, lost title tags and metadata, and crawl blocks (robots.txt rules or noindex tags) carried over from the staging environment.

3. Website Speed

Google has confirmed that website speed is a ranking factor, and it expects pages to load in two seconds or less. More importantly, website visitors won't wait around for a page to load: slow websites don't make money.

Optimizing for website speed will usually require the help of a developer, as the most common culprits are unoptimized images, render-blocking JavaScript and CSS, missing compression and caching, and slow server response times.
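As a sketch of the server-side fixes, compression and browser caching can often be enabled with a few lines of configuration. The example below assumes nginx; the exact directives and file types will vary by stack.

```nginx
# Compress text-based responses before sending them.
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;

# Let browsers cache static assets for 30 days.
location ~* \.(css|js|png|jpg|webp)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```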

4. Not Optimizing the Mobile User Experience (UX)

Google's index is officially mobile-first, which means the algorithm looks at the mobile version of your site first when ranking it for queries. Google wants the desktop and mobile experiences to be equivalent, so don't significantly simplify one relative to the other.
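At a minimum, a responsive page declares a viewport so that mobile browsers render it at device width rather than as a zoomed-out desktop layout. A minimal sketch:

```html
<!-- In the <head>: render at device width, not a scaled-down desktop view. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```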

5. XML Sitemap Issues

An XML sitemap lists the URLs on your site that you want search engines to crawl and index. You can also include information about when a page was last updated, how often it changes, and how important it is relative to other URLs.
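For reference, a minimal sitemap entry following the sitemaps.org protocol looks like this (the URL and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/blue-widget</loc>
    <lastmod>2021-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```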

Sitemaps are particularly beneficial on websites that are very large, have deep archives or weak internal linking, or are new and have few external links pointing to them.

6. URL Structure Issues

As your website grows, it's easy to lose track of URL structures and hierarchies. Poor structures make it difficult for both users and bots to navigate, which will negatively impact your rankings.
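As a hypothetical example, a flat, parameterized URL tells users and bots nothing, while a clean hierarchy communicates where a page sits in the site:

```text
Unclear:  https://www.example.com/p?id=8472&cat=17
Clear:    https://www.example.com/kitchen/cookware/cast-iron-skillet
```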

7. Issues with robots.txt File

A robots.txt file controls how search engines access your website. It's a commonly misunderstood file that can crush your site's indexation if misused. Most problems arise from forgetting to update it when moving from a development environment to production, or from simple syntax mistakes.
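A single character in this file can make the difference between a healthy site and an invisible one. The fragment below (with illustrative paths) contrasts a staging rule accidentally left live with what a typical production file might look like:

```text
# Left over from staging -- this blocks the ENTIRE site:
User-agent: *
Disallow: /

# A typical production file instead blocks only private areas:
User-agent: *
Disallow: /cart/
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```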

8. Too Much Thin Content

It's no longer a good idea to crank out pages for "SEO" purposes. Google wants to rank pages that are deep, informative, and provide value. Having too much "thin" content (less than 500 words, no media, lack of purpose) can negatively impact your SEO.

9. Too Much Irrelevant Content

In addition to "thin" pages, you want to make sure your content is "relevant." Irrelevant pages that don't help the user can also detract from the good stuff you have on site. This is particularly important if you have a small, less authoritative website.

10. Misuse of Canonical Tags

A canonical tag (aka "rel=canonical") is a piece of HTML that helps search engines handle duplicate pages. If you have two pages that are the same (or very similar), you can use this tag to tell search engines which one you want to show in search results.
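The tag itself is a single line placed in the head of the duplicate or variant page, pointing at the preferred URL (illustrative here):

```html
<link rel="canonical" href="https://www.example.com/products/blue-widget">
```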

We often find websites that misuse canonical tags in a number of ways: tags pointing at broken or redirected URLs, chains of canonicals, every page canonicalized to the homepage, or no canonical at all on parameterized and paginated URLs.

11. Misuse of Robots Tags

In addition to your robots.txt file, there are robots meta tags that can be used in a page's head code. We see plenty of issues with these at both the template level and on individual pages. In some cases we've seen multiple robots tags on the same page; Google will struggle with this, and it can prevent a good, optimized page from ranking.
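A robots meta tag lives in the page's head and should appear exactly once. For example, to keep a page out of the index while still letting its links be followed:

```html
<!-- One robots tag per page; multiple or conflicting tags confuse crawlers. -->
<meta name="robots" content="noindex, follow">
```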

12. Mismanaged Crawl Budget

It's a challenge for Google to crawl all the content on the internet, so Googlebot allocates a crawl budget to each site based on a number of factors. A more authoritative site will have a bigger crawl budget than a less authoritative one. Wasting that budget on duplicate, thin, or broken URLs means your important pages get crawled less often.

13. Not Leveraging Internal Links to Pass Equity

Internal links help distribute "equity" across a website. Many sites, especially those with thin or irrelevant content, have very little cross-linking within their content. Cross-linking articles and posts helps both Google and your visitors move around your website.
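An internal link works hardest when its anchor text describes the destination page, for example (URL and copy hypothetical):

```html
<p>Before first use, read our guide on
  <a href="/guides/seasoning-cast-iron">how to season a cast iron skillet</a>.
</p>
```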

14. Errors with "On-Page" Markup

Title tags and metadata are some of the most abused code on websites, and have been for as long as Google has been crawling the web. Many site owners have all but forgotten how relevant and important title tags and meta descriptions still are.
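Getting the basics right is straightforward: a unique, descriptive title tag and meta description on every important page. A hypothetical example:

```html
<title>Cast Iron Skillet, 12-Inch | Example Cookware</title>
<meta name="description"
      content="Pre-seasoned 12-inch cast iron skillet for stovetop and oven. Free shipping on orders over $50.">
```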

Bonus: Structured Data

As Google becomes more sophisticated and offers webmasters more types of markup, it's easy to see how schema markup can get messy. Implemented correctly, structured data can earn rich results and let you dominate far more of the on-screen SERP.
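As a sketch, a product page can expose review data with a JSON-LD block using schema.org vocabulary (all values illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Cast Iron Skillet, 12-Inch",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "213"
  }
}
</script>
```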

Wrapping It Up

As search engine algorithms continue to advance, so does the need for technical SEO. Getting these fundamentals right is the foundation on which all other SEO work is built. If your website needs an audit, consulting, or improvements, contact us directly for more help.