We've run over 100 technical audits this year, and in the process we've gained deep insight into how technical structure affects a website's performance in search. This article highlights the most common technical SEO issues we encounter and which fixes have the largest impact on organic traffic.
1. Mismanaged 404 Errors
This happens quite a bit on eCommerce sites. When a product is removed or expires, its page is easily forgotten and starts returning a 404. Although 404 errors can erode your crawl budget, they won't necessarily kill your SEO. Google understands that sometimes you HAVE to delete pages on your site.
However, 404 pages can be a problem when they:
- Are getting traffic (from internal navigation or organic search)
- Have external links pointing to them
- Have internal links pointing to them
- Exist in large numbers on a larger website
- Are shared on social media / around the web
The best practice is to set up a server-side 301 redirect from the deleted page to another relevant page on your site. This preserves SEO equity and ensures users can navigate seamlessly.
How to find these errors
- Run a full website crawl (SiteBulb, DeepCrawl or Screaming Frog) to find all 404 pages
- Check Google Search Console's Coverage report for "Not found (404)" errors (the legacy interface listed these under Crawl > Crawl Errors)
How to fix these errors
- Analyze the list of "404" errors on your website
- Crosscheck those URLs with Google Analytics to understand which pages were getting traffic
- Crosscheck those URLs with Google Search Console to understand which pages had inbound links from outside websites
- For those pages of value, identify an existing page on your website that is most relevant to the deleted page
- Set up server-side 301 redirects from each deleted URL to the existing page you've identified
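The redirect itself belongs in your web server configuration. On Apache, for example, a server-side 301 can be a single line in the .htaccess file (the paths below are hypothetical):

```apache
# Hypothetical example: permanently redirect a deleted product page
# to the most relevant live page
Redirect 301 /products/old-widget /products/widget-v2
```

Other servers (Nginx, IIS) have equivalent directives; the key is that the redirect is issued server-side with a 301 status code, not via JavaScript or a meta refresh.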
2. Website Migration Issues
When launching a new website, design changes or new pages, there are a number of technical aspects that should be addressed ahead of time.
Common errors we see:
- Use of 302 (temporary redirect) instead of 301 (permanent) redirects
- Improper setup of HTTPS on a website — specifically, not redirecting the HTTP version into HTTPS, which can cause issues with duplicate pages
- Not carrying over 301 redirects from the previous site to new site
- Leaving legacy tags on the site from the staging domain (canonical tags, NOINDEX tags, etc.)
- Leaving staging domains indexed
- Creating "redirect chains" when cleaning up legacy websites
- Not enforcing a single www or non-www version of the site (e.g., via rules in the .htaccess file)
How to fix these errors
- Triple check to make sure your 301 redirects migrated properly
- Test your 301 and 302 redirects to make sure they reach the final destination in a single hop
- Check that the right canonical tags are in place
- Given a choice between canonicalizing a page and 301 redirecting a page — a 301 redirect is a safer, stronger option
- Check your code to ensure you remove NOINDEX tags (if used on staging domain)
- Update your robots.txt file and check your .htaccess file
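The "redirect chains" issue above can be caught programmatically. Here's a minimal Python sketch (with hypothetical URLs) that flattens a redirect map so every source points straight at its final destination in one hop:

```python
# Sketch: collapse redirect chains in a redirect map so each source
# URL points directly at its final destination. URLs are hypothetical.

def flatten_redirects(redirects):
    """Resolve each source URL to its final target, collapsing chains."""
    flattened = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        # Follow the chain until we reach a URL that isn't itself redirected
        while target in redirects:
            if target in seen:  # guard against redirect loops
                raise ValueError(f"Redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flattened[source] = target
    return flattened

redirect_map = {
    "/old-page": "/interim-page",  # chain: /old-page -> /interim-page -> /new-page
    "/interim-page": "/new-page",
}
print(flatten_redirects(redirect_map))
# {'/old-page': '/new-page', '/interim-page': '/new-page'}
```

Running this against the redirect rules exported from your old site tells you exactly which entries need updating before launch.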
3. Website Speed
Google has confirmed that website speed is a ranking factor, and its representatives have suggested aiming for load times of two seconds or less. More importantly, website visitors won't wait around for a page to load. Slow websites don't make money.
Optimizing for website speed will require the help of a developer, as the most common issues slowing down websites are:
- Large, unoptimized images
- Poorly written (bloated) website code
- Too many plugins
- Heavy JavaScript and CSS
How to find these errors
- Check your website in Google PageSpeed Insights, GTMetrix, or Pingdom
How to fix these errors
- Hire a developer with experience in this area
- Make sure you have a staging environment set up so you can test changes without hurting the live site's performance
- Where possible, upgrade to a current PHP version (PHP 7 or later); on PHP-based sites this can have a big impact on speed
4. Not Optimizing the Mobile User Experience (UX)
Google's index is officially mobile-first, which means the algorithm looks at the mobile version of your site first when ranking for queries. Google wants parity between the desktop and mobile experiences, so don't significantly strip down one relative to the other.
How to find these errors
- Use Google's Mobile-Friendly Test to check if Google sees your site as mobile-friendly
- Does your website respond to different devices? Test all site pages on mobile
How to fix these errors
- Focus on building your pages from a mobile-first perspective. Google's preferred option is responsive design
- Focus on multiple mobile breakpoints, not just the latest iPhone — 320px wide is still super important
- Test across iPhone and Android
5. XML Sitemap Issues
An XML sitemap lists the URLs on your site that you want search engines to crawl and index. You can also include metadata about when a page was last updated, how often it changes, and how important it is relative to other URLs.
Sitemaps are particularly beneficial on websites where:
- Some areas of the website are not available through the browsable interface
- The site is very large with a chance for crawlers to overlook new or recently updated content
- Websites have a huge number of pages that are isolated or not well linked together
How to find these errors
- Make sure you have submitted your sitemap to Google Search Console
- Use Bing Webmaster Tools to submit your sitemap as well
- Check your sitemap for errors in GSC's Sitemaps report (Crawl > Sitemaps in the legacy interface)
How to fix these errors
- Make sure your XML sitemap is connected to your Google Search Console
- Run a server log analysis to understand how often Google is crawling your sitemap
- If you are using a plugin for sitemap generation, make sure it is up to date and that the file it generates is valid
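As a sanity check on plugin output, the sitemap format itself is simple enough to generate or validate with a few lines of Python's standard library (the URL and date below are hypothetical):

```python
# Sketch: build a minimal, valid XML sitemap with the standard library.
# The URL and lastmod date are hypothetical examples.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """urls is a list of (loc, lastmod) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([("https://example.com/", "2019-01-15")])
print(sitemap_xml)
```

If your plugin's output parses cleanly with the same library and contains the `<urlset>`/`<url>`/`<loc>` structure shown here, the file is at least structurally valid.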
6. URL Structure Issues
As your website grows, it's easy to lose track of URL structures and hierarchies. Poor structures make it difficult for both users and bots to navigate, which will negatively impact your rankings.
Common errors we see:
- Issues with website structure and hierarchy
- Not using a proper folder and subfolder structure
- URLs that contain special characters or capital letters, or that aren't readable to humans
How to fix these errors
- Plan your site hierarchy — we always recommend parent-child folder structures
- Make sure all content is placed in its correct folder or subfolder
- Make sure your URL paths are easy to read and make sense
- Remove or consolidate any content that looks to rank for the same keyword
- Try to limit the number of subfolders/directories to no more than three levels
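A small script can help enforce these rules at scale. The sketch below (with illustrative rules and paths) flags URL paths that are too deep or whose segments aren't clean lowercase slugs:

```python
# Sketch: flag URL paths that break the structure guidelines above
# (lowercase, hyphen-separated, no special characters, at most three
# folder levels). Rules and paths here are illustrative, not exhaustive.
import re

SLUG_RE = re.compile(r"^[a-z0-9]+(?:-[a-z0-9]+)*$")

def url_path_issues(path):
    issues = []
    segments = [s for s in path.strip("/").split("/") if s]
    if len(segments) > 3:
        issues.append("more than three folder levels deep")
    for seg in segments:
        if not SLUG_RE.match(seg):
            issues.append(f"segment {seg!r} is not a clean lowercase slug")
    return issues

print(url_path_issues("/Blog/SEO_Tips!"))   # flags both segments
print(url_path_issues("/blog/technical-seo"))  # []
```

Feed it the URL list from a Screaming Frog export and review anything it flags.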
7. Issues with robots.txt File
A robots.txt file controls how search engines access your website. It's a commonly misunderstood file that can crush your website's indexation if misused. Most problems tend to arise from not changing it when you move from your development environment to live, or miscoding the syntax.
How to fix these errors
- Validate your file with Google Search Console's robots.txt testing tool (under Crawl > robots.txt tester in the legacy interface)
- Check to make sure the pages/folders you DON'T want crawled are included
- Make sure you are not blocking any important resources or directories (JavaScript, CSS, images, etc.)
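Before deploying changes, you can also sanity-check your rules locally with Python's standard-library robots.txt parser (the rules and URLs below are hypothetical):

```python
# Sketch: test robots.txt rules locally before deploying them.
# The disallow rules and URLs are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

A quick check like this catches the classic mistake of a stray `Disallow: /` left over from staging before it ever reaches production.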
8. Too Much Thin Content
It's no longer a good idea to crank out pages for "SEO" purposes. Google wants to rank pages that are deep, informative, and provide value. Having too much "thin" content (less than 500 words, no media, lack of purpose) can negatively impact your SEO.
- Content that doesn't resonate with your target audience will kill conversion and engagement rates
- Too much low quality content can decrease search engine crawl rate, indexation rate, and ultimately traffic
How to fix these errors
- Cluster keywords into themes so that, rather than writing a separate page for each keyword, you can target five or six keywords in a single, expanded piece of content
- Work on pages that keep the user engaged with a variety of content — consider video, infographics, or images
- Think about your user first — what do they want? Create content around their needs
9. Too Much Irrelevant Content
In addition to "thin" pages, you want to make sure your content is "relevant." Irrelevant pages that don't help the user can also detract from the good stuff you have on site. This is particularly important if you have a small, less authoritative website.
How to fix these errors
- Remove quotas from your content planning — publish content that adds value rather than the six blog posts you NEED to post to hit a number
- Use a noindex tag on pages you would rather not see Google rank, focusing Google on the good stuff (note that robots.txt blocks crawling, not indexing, so disallowing a page there won't reliably keep it out of search results)
10. Misuse of Canonical Tags
A canonical tag (aka "rel=canonical") is a piece of HTML that tells search engines which version of a duplicate page is the preferred one. If you have two pages that are the same (or similar), you can use this tag to tell search engines which page you want to show in search results.
We often find websites that misuse canonical tags in a number of ways:
- Canonical tags pointing to the wrong pages
- Canonical tags pointing to 404 pages
- Missing a canonical tag altogether
- eCommerce and faceted navigation issues
- When a CMS creates two versions of a page
How to fix these errors
- Run a site crawl to find canonical issues
- Review pages to determine if canonical tags are pointing to the wrong page
- Run a content audit to understand pages that are similar and need a canonical tag
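During a crawl, canonical tags can be extracted with a few lines of standard-library Python. This sketch (using hypothetical HTML) pulls the canonical URL from a page so a script can flag missing or incorrect tags:

```python
# Sketch: extract the rel=canonical URL from a page's HTML so a crawl
# script can flag missing or wrong canonical tags. HTML is hypothetical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

html = '<head><link rel="canonical" href="https://example.com/page"></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/page
```

Comparing each page's extracted canonical against its own URL quickly surfaces pages canonicalized to the wrong place, to a 404, or to nothing at all.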
11. Misuse of Robots Tags
In addition to your robots.txt file, robots meta tags can be used in your page's head code. We see a lot of issues with these both at the template level and on individual pages. In some cases we have seen multiple, conflicting robots tags on the same page; Google typically obeys the most restrictive directive, which can prevent a good, optimized page from ranking.
How to fix these errors
- Decide how you will manage robots directives — a plugin like Yoast SEO lets you control robots tags at the page level
- Stick to a single plugin for managing robots directives so they don't conflict
- Amend any file templates where robots tags have been added manually
12. Mismanaged Crawl Budget
It's a challenge for Google to crawl all the content on the internet, so Googlebot allocates each site a crawl budget based on a number of factors. A more authoritative site will generally have a bigger crawl budget than a less authoritative one.
How to fix these errors
- Reduce the errors on your site
- Block pages you don't really want Google crawling
- Reduce redirect chains by finding links that point to pages that are themselves redirected — update all links to the new final page
- For eCommerce specifically, block URL parameters used for faceted navigation that don't meaningfully change the content on a page
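For the parameter-blocking tip above, the robots.txt rules might look like this (the parameter names are hypothetical — audit your own faceted navigation first):

```text
User-agent: *
# Block faceted-navigation parameters that don't change page content
Disallow: /*?sort=
Disallow: /*?color=
```

Google supports the `*` wildcard in robots.txt paths, so each rule covers the parameter wherever it appears in a URL.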
13. Not Leveraging Internal Links to Pass Equity
Internal links help distribute "equity" across a website. Many sites, especially those with thin or irrelevant content, have little cross-linking within their content. Cross-linking articles and posts helps both Google and your visitors move around your website.
How to fix these errors
- For pages you are trying to rank, find existing site content that can link to the page you want to improve ranking for, and add internal links
- Use crawl data from Screaming Frog to identify opportunities for more internal linking
- Don't overdo the number of links or the keywords used in anchor text — keep it natural and varied
14. Errors with On-Page Markup
Title tags and metadata are some of the most abused code on websites, and have been for as long as Google has been crawling them. Many site owners have forgotten how relevant and important title tags and meta descriptions still are.
How to fix these errors
- Use Yoast to rework your titles and metadata, especially the meta description — make full use of the character count Google displays
- Use SEMrush to identify and fix any missing or duplicate page title tags — make sure every page has a unique title tag and meta description
- Don't waste effort on the meta keywords tag — Google ignores it, so you can safely remove it altogether
Bonus: Structured Data
With Google becoming more sophisticated and offering webmasters the ability to add various types of markup, it is easy for schema markup to get messy. The correct schema markup can allow you to dominate the on-screen elements of a SERP with rich results.
How to fix these errors
- Identify what schema you want to use on your website, then find a relevant plugin to assist
- Once the code is built, test it with Google's Rich Results Test (the successor to the Structured Data Testing Tool)
- Google prefers the JSON-LD format — ensure your developer knows this format
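As a minimal illustration of JSON-LD, the sketch below builds hypothetical Article markup with Python and serializes it for embedding in a `<script type="application/ld+json">` tag:

```python
# Sketch: build Article schema markup as JSON-LD (Google's preferred
# format) and serialize it for embedding in a page's head.
# All field values below are hypothetical examples.
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Common Technical SEO Issues",
    "datePublished": "2019-01-15",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# The resulting JSON goes inside <script type="application/ld+json">...</script>
print(json.dumps(article_schema, indent=2))
```

Because JSON-LD sits in a single script tag rather than being woven through your HTML (as microdata is), it's easier to generate from a template and to validate in isolation.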
Wrapping It Up
As search engine algorithms continue to advance, so does the need for technical SEO. If your website needs an audit, consulting or improvements, contact us directly for more help. Getting these fundamentals right is the foundation on which all other SEO work is built.