Technical SEO audit: 140 points to audit

Technical SEO Audit

If you are looking for the ultimate checklist to carry out a technical SEO audit in 2021, stop digging: here it is.

This checklist takes into account the most important points to audit according to technical constraints and current SEO requirements.

Of course, I am speaking to site administrators but also to external auditors who will find here all the steps to audit a client’s site.

Method:

I followed a logical order for the points of this checklist, and each point comes with a tool and a method for verifying the items under review.

In any case, we will rely heavily on the SEO Spider tool from the Screaming Frog suite, the technical audit and on-page analysis software par excellence.

It has a freemium version which allows you to analyze sites that have up to 500 URLs.

Some features, however, are not free even below 500 URLs.

I will also explore alternatives, though they may be paid as well: for example, I use SEMrush a lot.

To learn more about this tool, you can read my full guide to SEMrush.

Other tools or tips can be used for your audit, and sometimes even simple checks “by hand” (or rather by eye).


Global audit

Before tackling the bulk of the subject, a review of the site and a first audit will allow you to familiarize yourself with certain issues.

Take a first visual examination

Review your site and take notes on what seems well done and what does not.

The reports from the different tools will then help you make sense of what you noticed.

Use SEMrush’s Site Audit tool

I talked about this very well-done feature in my last review article on Semrush and Ahrefs.

Semrush’s Site Audit finds almost every technical flaw in your site by analyzing over 130 SEO factors.

And it lists them in order of importance.

If you have a Semrush account, don’t miss the audit …

… But also take the opportunity to take a look at other reports concerning your site and your competitors to note the evolution later.

Fix the most important errors before moving on to a detailed audit.

Accessibility by search engines and users

There are a number of technical errors that can pose problems for SEO and your site’s performance.

But the first thing to check is the settings that govern its indexing: before being efficient, the site must actually be accessible – at least for the pages you want to appear in the SERPs.

Indexation

Check how many pages on the site are indexed:

Use the site: search operator (such as site:bloggercage.com) in Google to check the approximate number of indexed pages.

Normally, the home page should be the first result of this search.

Make sure this roughly matches the number of canonical (indexable) pages in Google Search Console:

Index> Coverage> Valid

Too much of a difference should draw your attention to the indexing of your pages.

Check if irrelevant content is indexed

For this, you need to take a direct look at the Google results, at least the first 5 pages.

Use the same search operator site:

Multiple results pages will give you a good overview of which pages the search engine is indexing.

Make sure you have a robots.txt file at the root of your site

Just go to the address https://example.com/robots.txt.

(For my site it would be https://www.bloggercage.com/robots.txt )

You can also use the free SEOptimer tool, which will provide you with an audit of your robots.txt file.

See if the robots.txt file is blocking resources that should be indexed by search engines

It is easy to make mistakes in the contents of a robots.txt file.

This can prevent crawlers from crawling pages that should appear in search engines.

They can be indexed, receive links … but not crawled, and therefore not well positioned.

Check that the resources that should be blocked actually are blocked

You can effectively choose to block certain resources of no interest to search engines.

This is the case for URLs with parameters, pages with low content, etc.
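
As an illustration, here is what a minimal robots.txt blocking parameterized URLs and a low-value directory might look like (the paths are placeholders to adapt to your own site):

User-agent: *
Disallow: /*?sessionid=
Disallow: /internal-search/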

Check that the robots.txt file does not contain errors

Even if the robots.txt file does not contain syntax errors, there may be other types of errors …

Like errors in the name of a blocked directory.

See if meta robots tags are blocking resources that should be indexed – with Screaming Frog

This tag, located in the <head> element of a page, tells robots whether the page should be indexed or not.

A "noindex" value blocks indexing of the page.
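
In the source code, a page blocked from indexing carries a tag like this in its <head>:

<meta name="robots" content="noindex">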

The presence of a “noindex” value on certain pages is verified in Overview> Crawl Data> Directives> Noindex.

To quickly understand everything about this tag, here is a very clear article.

And do the opposite … see if any indexed pages lack a robots meta tag

Review your thin content pages to see whether your site would benefit from blocking them with the robots meta tag.

Check if the X-Robots-tag HTTP header is blocking resources it shouldn’t

The indexing of a web page can also be blocked through an HTTP X-Robots-Tag header.

You will find the presence of a “noindex” instruction in the same Screaming Frog Directives section, X-Robots-Tag 1 column (visible by scrolling to the right).

This article explains how to modify HTTP headers.
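
As an illustration only, here is a minimal sketch of how this header can be set on an Apache server (assuming the mod_headers module is enabled; the .pdf pattern is just an example):

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>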

Check that your robots directives do not contradict each other

If they do, the most restrictive directive will apply (possibly resulting in the page being kept out of the index).

Make sure important pages like your privacy policy are crawlable and indexable

Today, these pages have a role to play in calculating your EAT.

E-A-T stands for Expertise, Authoritativeness, and Trustworthiness.

It goes without saying that the transparency of your policies and conditions is proof of your reliability for Google.

XML sitemaps

Verify that the robots.txt file indicates the location of a sitemap.xml address

This is a line to add in the robots.txt file to indicate the location of your sitemap.xml file (if you have one).

It can be located anywhere and looks like this:

Sitemap: https://www.example.com/sitemap.xml

Either way, indicating the location of the sitemap in the robots.txt only becomes important if the address is not standard or there are multiple sitemaps.

In the latter case, just add a new Sitemap: line for each address.

Check that your sitemap(s) are listed in Google Search Console

Not having a sitemap file can affect search engine access to the site and in particular deep pages.

Go to Index> Sitemaps.

If a sitemap has an issue, you can click on it to review it.

Google tells you about sitemaps in this article of its documentation.

You can verify the existence of a sitemap in several other ways:

Go to the default URL which is https://www.example.com/sitemap.xml or even /sitemap_index.xml

In the robots.txt file, as we saw previously

In WordPress or any other content management system

There, the sitemap can be activated and viewed with the Yoast SEO extension.

Make sure the sitemap contains all the necessary URLs – with Screaming Frog or SEMrush

Logically, a site map should contain all the indexable and canonical pages of a site.

But sometimes there are mistakes.

To configure sitemap auditing with Screaming Frog, you need the paid version.

Here I continue with SEMrush (paid version also necessary to crawl more than 100 pages).

The path: Site Audit> Overview> Crawlability, then Pages in the sitemap.

This gives you a view of the pages crawled with a filter (not otherwise accessible): “Present in sitemap”.

Check if there are any obsolete parameters in the sitemap, such as <priority> and <changefreq>

This is done directly in the file.
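
For reference, a clean sitemap entry without these obsolete tags looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/my-page/</loc>
    <lastmod>2021-03-01</lastmod>
  </url>
</urlset>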

See if the site also has an images sitemap

An image sitemap lists all the URLs of the images on the site – in addition to the URLs of the pages on which they appear.

It starts to gain interest when the site has a lot of important images and is gaining traffic through Google Images.

This sitemap can be combined with the standard sitemap or be created separately.

The second option is the most viable for easy tracking on Google Search Console.

Same thing with a video sitemap

On the other hand, this sitemap only has a reason to exist if the site hosts its videos on its own server.

Subdomain analysis

If you have important subdomains, make sure they have their own root crawl instructions

In order for your subdomains to be properly indexed, they must of course be open to crawling by robots and have their own robots.txt and sitemap.xml files.

Any "test" or staging subdomains must be set to "noindex".

Check the correct indexing of your subdomains – with the Google Search Console

Architecture Analysis

Your site is well-indexed: congratulations!

With an optimized and streamlined architecture, it will be even more accessible by users and search engines.

Is the site structure properly siloed, and not "flat"? Test it with Screaming Frog

Not all pages should be one click away from the home page.

You can use the Visualizations available in the top menu bar of Screaming Frog SEO Spider.

For an in-depth analysis, you can use Gephi, a tutorial of which is available here.

Check that the site architecture is not too deep

Too deep a structure means pages 4 or 5 clicks deep and a lot of orphan pages.

Breadcrumb

Check that your site has a breadcrumb trail if it is more than 3 levels deep

On very large websites, this is really essential.

Check that the breadcrumb trail is correctly installed:

Navigation via the breadcrumb trail must be implemented through structured data (see the example after this list)

The navigation must take into account all the pages of the path

The last page must not be a link

Check that the breadcrumb trail is consistent and applies to as many pages as possible

See this Google document for more information on breadcrumbs.
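
To illustrate the structured data point above, here is a minimal BreadcrumbList in JSON-LD (names and URLs are placeholders; the Google document linked above gives the full specification):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Blog", "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 2, "name": "SEO", "item": "https://www.example.com/blog/seo/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO audit" }
  ]
}
</script>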

Navigation

Check that the most important pages of the site are directly accessible from the main page

The home page should link to the main section and category pages, as well as to information and contact pages.

Is the main navigation of the site coded in HTML and not in Javascript? – Visual examination

Browsing in HTML ensures a more accessible experience.

Nothing prohibits the use of JavaScript navigation elements, but they should not constitute the main navigation.

Is the navigation in the footer intuitive and uncluttered?

The footer allows you to integrate menu elements that could not be implemented in the main menu because they are less essential at first.

That does not prevent them from being essential in the end …

Your users will look for crucial information there.

Therefore, your footer should not be a catch-all for keywords.

Check that the <ul> and <li> list tags are used to build the navigation elements

Navigation links should be built with list tags.
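
A minimal example of navigation built with list tags (labels and URLs are placeholders):

<nav>
  <ul>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/about/">About</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>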

Check that the navigation links are visible to search engine robots

This is important for sites that contain a lot of JavaScript.

To do this, compare the source code to the HTML rendering in your browser.

Check the correct functioning of the navigation on mobile

Do this directly on the mobile version of your site.

Do the menus open correctly? Do the links work?

Do they open a new tab or load a new page?

Check that external links that are not true recommendations have a relevant attribute

Sponsored links should be tagged rel="sponsored", while outbound links that should not pass link juice get the rel="nofollow" attribute.

You can use Screaming Frog to take a look at external links on pages that have a lot of them.

Go to the External tab and select the links found for details.

Check that links added by users have the rel="ugc" attribute – in the source code
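
For reference, here is what these link attributes look like in the markup (URLs and anchor texts are placeholders):

<a href="https://partner.example.com/offer" rel="sponsored">Partner offer</a>
<a href="https://unvetted.example.net/" rel="nofollow">Unvetted source</a>
<a href="https://commenter.example.org/" rel="ugc">Commenter's site</a>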

If you have sitewide links, make sure they are Nofollow

A single link repeated across an entire site (in the footer, for example) can be manually reviewed and become the cause of a Google penalty.

Hierarchy

Verify you are using canonical tags – with Screaming Frog

A good rule of thumb is to insert a canonical tag on every canonical page of the website.

Check that all of your important pages are marked as canonical.

Go to Overview> Crawl Data> Canonicals> All
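
As a reminder, a canonical tag is a single line placed in the <head> of the page (the URL is a placeholder):

<link rel="canonical" href="https://www.example.com/my-page/" />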

Check anyway that the existing canonical tags are used correctly

Check that no genuine page is canonicalized to another URL (such as the home page).

Check that Google has taken your canonicalization recommendations into account – with the Google Search Console

Indeed, a canonical tag is a hint rather than a directive.

This is verified in the Search Console URL inspection tool.

To check all your canonical URLs, you can group them together by choosing to rank your URLs according to their indexability on Screaming Frog.

Then copy the link of the URL of your choice to analyze it with the console:

On-Site and On-Page Analysis

On-Site and On-Page analysis will ensure that the information returned by your site is as clear as possible,

especially so that Google can judge the relevance and richness of the content.

URLs

Use Screaming Frog to analyze URLs in batches.

Check that your URLs are SEO Friendly

Don't hurt your click-through rate over a silly matter of untrustworthy-looking URLs.

The basic rule is therefore to have as many clean URLs as possible: made of words, short, memorable, even shareable.

Check if there are URLs containing parameters (session or user identification) that do not modify the displayed content

If the parameters do not modify the content, those URLs must carry a canonical link pointing to their parameter-free version.

Check that the URLs contain keywords

In this video, ex-Google engineer Matt Cutts suggests that the keywords in the URL are indeed SEO-friendly.

This video is old but in any case, you have to consider that a URL that displays a keyword is certainly more user-friendly than a series of numbers and characters.

Check that there are no foreign language words in your URLs

For a matter of relevance affecting both users and search engines, URLs must be in the same language as the associated pages.

Check that words are separated by hyphens in your URLs

They make the address more readable.

Again, this pleases the users as much as the robots.

<title>: Check that your pages have a <title> tag

Like most important tags (and perhaps even more than all the others), the <title> tag tells Google about the topic of the page.

So make sure to write a unique title wherever it is missing.

You can find the missing titles in Overview> Crawl Data> Page Titles> Missing.

Check that your <title> tags are between 30 and 60 characters long

Check the “Over 60 Characters” and “Below 30 Characters” lines.

It’s simply a matter of how it looks in the SERPs.

If you have any doubts about what it looks like, go and see it directly on Google.


Check that your title tags are not duplicated – with Screaming Frog

Just scroll up to the "Duplicate" line to make sure your title tags are all unique.

<meta description>

Check that all your important pages have meta descriptions – with Screaming Frog

Overview> Crawl Data> Meta Description> Missing

If there isn’t one, Google will generate one automatically.

You can see what it looks like by looking for it in the SERPs:

Writing your own meta description, in addition to inserting a keyword, gives you the option of including a Call to Action: think about it.

Check that your meta descriptions are in the correct number of characters range – with Screaming Frog

If you wrote it yourself, it would be a shame if your meta description was cut off or too short (and therefore replaced altogether).

Make sure you don’t have duplicate meta descriptions – with Screaming Frog

You just need to click on the line above the previous one.

Note one thing: it is better to have no custom meta description (and therefore one generated by Google) than to have the same one repeated several times.

You can learn more (as well as page titles) in this Google Support article.

Body

Check that most of the pages on the site have an H1 tag – with Screaming Frog

Go to Overview> Crawl Data> H1> Missing and review the pages that do not have an H1.

Almost all of your pages should have one H1 heading to give search engines much-needed information about their content.

Check that the pages do not have multiple H1 tags – with Screaming Frog

A single H1 tag will have a better impact than several.

Go to Overview> Crawl Data> H1> Multiple to check this.

Check that your pages are structured with multiple heading levels – with Detailed SEO

Multi-level heading tags improve the clarity of your content for search engines.

A Chrome extension like Detailed SEO gives you a clear picture of the structure of an individual page.

Go to the Headings tab for this overview.

Check that there is no excessive use of headers – with Detailed SEO

Headings should be used to add clarity,

there is no point in using them excessively and confusing the message of the page.

Check that headers are used appropriately – with Detailed SEO

The hierarchy of your headers should match a certain hierarchy in your content.

The idea is that your point in general is well constructed …

It impacts your SEO.

Make sure you don’t have duplicate content – with Screaming Frog

Overview> Crawl Data> Content> Exact Duplicates and Near Duplicates.

Check that you don’t have duplicate content with other sites – with Copyscape

Content copied, too similar, etc.: your poor or plagiarized content may be penalized.

Check its originality on the basis of individual URLs with Copyscape.

Check that you don't have thin content pages (or too many of them) – with Screaming Frog

On the line just below: Low content pages.

The length and richness of the content is, as we know, an SEO factor.

It goes without saying that for some pages like the Contact page, there is no problem!

Check that your home page is not too thin on content – with Detailed SEO

This is your most important page, it must contain the essential information and be perfectly structured.

Review its content live and take the opportunity to make the Detailed SEO plugin work on headers.

Check that the right tags are being used to highlight your keywords

In your content, it may be interesting to highlight certain terms, especially your keywords.

The recommended tag, in this case, is the <strong> tag (rather than the <b> tag for "bold").

Do this by visually examining the code on your most important pages.

Structured Data

Structured data is a standardized format that allows Google to extract and display a range of information about certain types of content.

It's a bit like sorting information into labeled boxes so it can be shown off or taken out of its box more easily.

The Google Search Console help portal gives you all the information you need to understand structured data.
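
As a simple illustration, here is what a minimal Article markup in JSON-LD can look like (all values are placeholders; check Google's documentation for the required and recommended properties of each content type):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO audit: 140 points to audit",
  "author": { "@type": "Person", "name": "Author Name" },
  "datePublished": "2021-03-01"
}
</script>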

Does your site use structured data? – Check with Screaming Frog or SEMrush

In SEO Spider, enabling structured data extraction requires the paid version.

Configuration> Spider> Extraction, “Structured Data”

I do my own structured data audit with SEMrush, going to SEO> On-Page and Tech SEO> Site Audit.

In the Overview, there is a new box: Markup, which corresponds to the Schema Markup language.

On the associated page, I can see which structured data I am using, on which pages, and see if some are invalid and therefore need to be corrected.

Check the validity of your structured data

Click on the “View invalid items” button to open the report on the structured data to be corrected.

Consider whether some pages might have other types of structured data

Go through the most important pages to see if it is possible to add some structured data to them.

They could then appear in different types of rich results on Google.

To quickly check which structured data each of these pages uses, use the “Schema” tab of the Detailed SEO plugin.

If you are a WordPress user, you can learn how to use RankMath to set up different Rich Snippets by starting with step 15 of my WordPress SEO checklist.

Pictures

Screaming Frog will still be your ally when it comes to images.

Go to Overview> Crawl Data> Images.

Check that the images are correctly integrated

An image embedded with CSS will not be treated as content by Google.

These are part of the style, the layout of the site but do not represent the content itself.

To check how an image is embedded, right click on it and click “Inspect”.

It should use the <img> tag, ideally with an alt attribute.

Check that alt attributes are filled and optimized – with Screaming Frog

It is best if the alt attributes of the images are unique and representative.

To find the missing alt attributes, go to Overview> Crawl Data> Images> Missing alt text.

Theoretically, alt text and alt attribute refer to the same thing, but here "missing alt text" also includes images that have an alt attribute that is empty.

Kind of like this: <img src="screamingfrog-logo.jpg" alt="" />

Then click on each image in the list on the left to see which page it appears on and edit the alt attribute there.

(If you want to check for alt attributes on a single page, you can also use Detailed SEO).
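
For contrast, the same image with a filled-in alt attribute would look like this (the alt text is just an example):

<img src="screamingfrog-logo.jpg" alt="Screaming Frog SEO Spider logo" />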

Check if the name of your images is relevant – with Screaming Frog

It's not as important as the alt attributes, but indicating what the image is "about" is still a plus, as long as you don't stuff keywords.

You will find these file names directly at the end of the URL of your images.

Are the images optimized? – with PageSpeed ​​Insights

Like GTMetrix, PageSpeed ​​Insights can give you tips on compressing certain images that make your pages load slower.

Optimize images in batch

On WordPress, some plugins like Smush can help you optimize multiple images at once.

International SEO (several countries or languages targeted)

This section only applies if your site targets several countries or several languages (multiregional and/or multilingual).

Does each target country have its domain name with an adequate ccTLD?

The ccTLD (country code Top Level Domain) is a guarantee of confidence for the Internet user who understands that the site is managed from his country of origin.

It is therefore an interesting option if all the domain names with ccTLDs of the countries we are targeting are available.

If there is one site / ccTLD per country, is each site registered as a property on the Google Search Console?

Is international targeting set for each of the Google Search Console properties?

International targeting is an old report that has not been adapted for the new version of the console but is still accessible and functional under its old version.

You must therefore go to Old tools and reports and International Targeting.

A new page opens and in the Country tab, you can choose the target country.

Check if the languages ​​are clearly demarcated on the site

If the languages ​​are not clearly delimited, there can quickly be an indexing problem.

For example, it is possible that only one page version is indexed (while there is one for each language) …

and that other pages are only indexed in another language.

To avoid this, it is preferable that each version has its own directory (of the .com/fr/ type), or even its own subdomain.

It is also ideal to have a drop-down menu or a button making every other language / directory / subdomain accessible.

Finally, use hreflang tags.

Check if you have hreflang tags – with Screaming Frog

On the one hand, the hreflang tags are used by Google to find the different versions of a page.

On the other hand, they can also be used to rule out any suspicion of duplicate content for almost identical pages which are in the same language but intended for a different region.

Access your hreflang tags in Overview> Crawl Data> Hreflang> Contain hreflang.

Stay in the All section to also see pages that don’t have one but might have one.
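
As an illustration, here is what reciprocal hreflang tags can look like in the <head> of each language version (URLs and language codes are placeholders; the x-default line anticipates a point covered below):

<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />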

Check that the hreflang tags are associated with the correct versions – with Screaming Frog

In the Contain hreflang section of Screaming Frog, review each page to see what URLs are associated with it.

For each page (each row) you should scroll the section to the right and inspect each column “HTML hreflang 2 URL” etc., to see which URLs are associated in other languages.

Check for missing hreflang return links – with Screaming Frog

The missing hreflang return links can be found in the Missing Return Links section.

This section detects pages that are referenced by rel-alternate-hreflang tags but do not link back in return.

Return links are needed so that all these hreflang values confirm one another.

Check if there are any x-default attributes if needed – with Screaming Frog

The X-Default attribute indicates which pages on the site are considered the “default” pages, not targeting any particular region.

It must therefore be used on international sites, which indeed have a “default” path and not a different offer for each region.

Any pages that have hreflang tags without an x-default tag are listed in the Missing X-Default section.

For ecommerce sites, are currencies localized?

To sell your products in multiple currencies, you can set up a multi-store configuration or choose a simple display in the local currency corresponding to the current conversion rate.

Are there too many links on the home page? – with Screaming Frog

Too many links on the homepage could be a sign of something fishy.

If you have 100 or more, you should start asking yourself questions.

Go to Overview> Crawl Data> Internal> HTML, and click on the home page link which should be the first one in the list.

Look at its Outlinks (lower menu), and the number under the table.

Do all pages of the site have a link to the home page? – with Screaming Frog

All pages should have a link to the home page.

Most of the time, it is located on the site logo and / or on a “Home” button in the top menu.

To confirm the presence of this link, go to Overview> Crawl Data> Canonicals> All.

Select the URLs one by one (by scrolling through them) and look at their Outlinks (lower menu).

The first Outlink should always be a link to the home page.

Are there links to non-existent or blocked resources? – with Screaming Frog

Links to inaccessible resources are synonymous with poor user experience.

To find these “broken” links, inspect all links in both Overview> Internal and External.

Sort the lists that appear by "Status Code" to group the links with problematic status codes, such as 401, 403, 404, and even 999.

You will then find the page (s) where these links are placed by clicking on the Inlinks tab in the lower menu.

Install monitoring of your broken links – with Broken Link Checker

A good way not to let broken links settle for too long:

(and if you have WordPress)

Use the Broken Link Checker plugin .

This regularly crawls your content for broken internal and external links.

And when it finds any, it sends you a notification.

Inspect the alt attributes of your image links

If some of your links are in image form, it would be particularly unfortunate in SEO if that image didn’t have alt text.

To inspect these alt texts and see whether it would make sense to add some where they are missing,

go to Overview> Internal then External and> Images.

Then in the lower Inlinks menu scroll the window bar to the right to find the alt texts.

Inspect your possible NoFollow backlinks – with SEMrush

You have access to a basic backlink audit function from the freemium version of SEMrush .

In SEO> Link building> Backlinks analysis, go to the “Backlinks” tab,

and select the “NoFollow” filter.

You will be able to inspect the links that are marked with this attribute to possibly “claim” a DoFollow from the sites that link to you.

Are there any suspicious movements of referring domains? – with SEMrush

Sudden peaks, sudden losses… any unnatural development should spark your attention.

Let’s stay on SEMrush (and the possibility of using its freemium version) for a quick analysis of backlink movements.

SEO> Link building> Analysis of backlinks, graph of referring Domains.

Check the Google backlinks disavowal tool

The toxic links disavow tool is separate from other scan tools because it should be used with caution.

You can find it by Googling it or directly through this link .

Inspect the links that are submitted there, if there are any.

Audit of analytics tools

Google Search Console

Verify that the site has a Google Search Console account configured

You should have already used it for your audit.

But if it is not done, it is urgent!

Inspect the Performance Report – in the Google Search Console

Select the period over the last 12 months.

And take note of the different metrics and their evolution:

Number of clicks

Number of impressions

Average CTR (click-through rate)

Average position

Inspect the Coverage Report

This important report will show you:

which pages are indexed and appear on Google

which pages are indexed but maybe shouldn't appear on Google

which pages are excluded and whether it is an error or intentional exclusion

Check which is the main crawler

While you are in the report, verify that the primary crawler complies with Mobile First Indexing which comes into effect this year.

The robot must therefore be the Googlebot for Smartphone.

Inspect removal requests

Go to the bottom menu to see if someone has requested removal of content from your site.

It is located just below the Sitemaps menu.

Check out suggested Google Search Console Improvements

You can just take a look, since you’ll be looking at Core Web Vitals, among other things, later in this audit.

Check that there is no manual penalty action by Google

To verify that a Google agent has not applied a penalty on your site,

Go to Security and manual actions> Manual actions.

These manual actions are rarer but still exist.

Something to keep an eye on, therefore.

If there is a penalty, then you must:

Make sure to correct what caused it,

Submit a reconsideration request for each affected URL (“Request a review” in the URL Inspection Tool).

Take a look at your links

This is just a quick check.

This is done in the menu Old tools and reports> Links.

Take a look at the crawl stats

This report is available in Settings> Crawl stats

It gives you an overview of how Google crawls the site.

You will have information on the health of your hosts, on the resources that Google crawls the most, on the codes returned by the server, etc.

Google Analytics

Basic configurations

Do you have a Google Analytics account linked to your Search Console?

The association of the two accounts is done from Search Console, in Settings.

Click Associations.

Then associate the property (or properties) linked to the Google account.

For a more precise analysis, your traffic and user data will be linked to the technical performance and everything related to the indexing of your site.

Watch for possible JavaScript errors

JavaScript is a language subject to syntactic errors (errors due to the developer) and bugs (errors due to the server, the browser, etc.).

If you understand English (or auto-generated subtitles), watch this great video for setting up JavaScript error tracking with the Google Tag Manager – in Analytics -.

Check that robot filtering is enabled

At one time, the traffic generated by bots (which is huge) did not appear in Analytics statistics because they did not execute JavaScript.

The filtering of web crawlers is always automatic, but unfortunately traffic related to other less desirable robots may appear.

To eliminate these from your data (and not take them for humans generating real marketing decisions), go to:

Administration> View> View settings

Then check the box "Exclude all hits from known bots and spiders".

For other, unknown (and less benevolent) robots, you will need to learn how to spot them and set up more advanced filters.

Data tracking

Is ecommerce data tracking enabled in Google Analytics?

With ecommerce tracking, you can track data about sales, orders, their average amount, location of billings, and more.

This data is correlated with that of the general use of the site to analyze the performance of your landing pages, your marketing campaigns, etc.

Setting up this tracking can be a bit tedious and require the action of a developer.

However, there are some online tutorials (often adapted to different e-commerce platforms).

Among other things, you will need:

copy a tracking code to all the pages of your site,

activate ecommerce reports for each Analytics view in which you want them,

then integrate your ecommerce cart,

add the tracking code on your order confirmation page,

configure a sales funnel adapted to the customer journey towards conversion on your site.

Check host names in your Audience reports

Most of your visitors should come to your site through your domain name, subdomain, etc.

Other host names mean that they are not real users but external services (like caches, Google translate…).

You should therefore filter these hosts at least for the refined view intended to analyze your audience – the real one.

So check if there are any strange hostnames.

Audience> Technology> Network, “Hostname” filter in the table.

And filter out unwanted / spammy hostnames in your Administration:

View column, Select your view then Filters and “exclude traffic to hostname”.

Filter a possible referrer spam

Referrer, or referral spam, can completely distort your Analytics data.

In the case of crawler referral spam, fake referring sites send robots to visit your site;

in the case of ghost referrals, hits are sent directly to your Analytics property ID in order to arouse your curiosity, and your visits.

There are techniques to identify this type of spam in the Referral line of the Channels table (Acquisition menu).

For example, to identify ghost referrer spam, add "Hostname" as a secondary dimension and look for "(not set)" values.

Spam that is in the form of a domain name can be blocked in the .htaccess file in the site root directory, or through a filter in a new Analytics view:

(or the same one created to filter hostnames)

Filter type: Custom, check Exclude,

Filter field = Campaign Source,

Filter pattern = the spammy domain names, with dots escaped and separated by a pipe (for example spamdomain\.com|other-spam\.net)
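
For the .htaccess option, here is a minimal sketch, assuming an Apache server with mod_rewrite enabled (spamdomain.com is a placeholder):

# Block requests whose referrer matches the spammy domain
RewriteEngine On
RewriteCond %{HTTP_REFERER} spamdomain\.com [NC]
RewriteRule .* - [F]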

Bing Webmaster Tools

Does the site have a Bing Webmaster Tools account?

Bing’s analytics tool has some cool features that are worth checking out.

Bing remains 2nd behind Google, and even if it is only for 4.5% of market share (mainly on desktop), it is still a big figure in terms of search.

Does the site have a Yandex Webmaster account?

Optimizing your site for Yandex makes sense from the moment you target the Russian market.

So if that makes sense to you, maybe now is the time to sign up for an account.

The 2 real positive points of this search engine:

The competition is less harsh there,

Optimizing for Yandex is easy (the algorithm is similar but less complex than Google’s).

Does the site have a Baidu Webmaster Tools account?

Same as Yandex, except Baidu applies to the Chinese market.

Performance analysis of important pages

You will see that we are getting deeper and deeper into the analysis of the site's technical performance.

Analyze the loading speed of your important pages with PageSpeed Insights (on desktop)

Switch tabs in the PageSpeed Insights audit, which is set to mobile by default.

NB: you can also use GTMetrix or the PageSpeed Insights API available in the paid version of Screaming Frog.

Check if the tool indicates a lack of optimization of JS, HTML or CSS codes

This represents optimization opportunities that are quite easy to seize.

So check the other possible changes that are easy to implement.

Improve your loading speed by downloading WP Rocket

This cache plugin stores static copies of your pages on the server.

They then do not have to be generated dynamically each time a page is opened.

You can find the details of its implementation in chapter 16 of my WordPress SEO checklist.

Check that the site meets the standards of the W3C (World Wide Web Consortium) – with SEOptimer

SEOptimer tools are free for a limited number of uses per day.

SEOptimer's Webpage W3C Validator gives you an assessment of your markup against the standards of the W3C international consortium.

Is the site using outdated HTML tags? – with SEOptimer

Use SEOptimer’s Deprecated HTML Tags Checker to do this quick check.

Does the site contain JavaScript errors? – with SEOptimer

This test is performed page by page with the Webpage Javascript Error Checking Tool .

Are your CSS files minified? – with SEOptimer

The verification is done in the Performance section of the Audit Results.

If you want to do this minification (shorten the CSS to reduce the weight of these files), use the CSS Minifier tool.

Are JavaScript files minified? – with SEOptimer

The verification is done in the same place, and the minification tool here is JS Minifier!

Check if there are any inline CSS styles – with SEOptimer

"Inline" CSS directly formats a specific HTML tag through its style attribute.

It is rarely used; it should remain an exception and not the rule.

You will also find this report in the Performance section of the site audit done by SEOptimer.

Testing on mobile devices

Run your site through the Google Mobile-Friendly Test

And thus check its compatibility for mobile search.

Check that there is no problem with loading

Check that the screenshot matches the HTML code

Compare the HTML in the mobile optimization test screenshot with what you get in the browser.

Make sure that the most important links and content are present in the HTML code.

Analyze the loading speed of your important pages with PageSpeed ​​Insights (on mobile)

This Google tool gives you an overall speed score (you should aim above 80).

It also provides advice on improving the speed of the analyzed page,

starting with items with a score below 50.

Test your Core Web Vitals with GTMetrix

The tool's interface puts the Core Web Vitals (or Page Experience) front and center, the algorithm update scheduled for May 2021.

These are focused on three criteria: speed of loading, interactivity, and visual stability.

They are represented by these three measures:

Largest Contentful Paint (LCP)

First Input Delay (FID)

Cumulative Layout Shift (CLS)

Under the load times you also have a visual representation of the loading of the different elements.

On PageSpeed ​​Insights, Core Web Vitals are marked with a small bookmark.

Check that your images are well compressed / optimized with GTMetrix

Expand the menu to see which images should be resized / compressed.

Consider adopting new generation image formats (such as JPEG2000 or WebP) to optimize their compression.

Tests on the server

Basics

Install a backup plugin if necessary

Before touching the site as a whole, always make a backup of it.

This must be located somewhere other than the original files.

On WordPress, there are plugins for this like BackWPup .

Analyze log files – with SEMrush

This exploration is important for large sites.

Log files or “logs” list all requests made on the server (by humans and search engines) and the data associated with them.

SEMrush’s Log File Analyzer tool analyzes these files to help you understand how search engines are interacting with your site.

It helps to see your site through the eyes of the Googlebot and other bots.

You will be able to uncover problems like a waste of the crawl budget.

The procedure for obtaining these log files is indicated on the home page of the tool.

Check if there are any DNS issues

Any issues that may be encountered with your DNS server (which translates your domain name to IP address) can be found with the Super Tool from MxToolBox.com.

By clicking on More Info, and by creating a free account, you will have all the details on each error and the possibility of resolving them.

Is your IP address in the wrong neighborhood?

If your site is hosted at an IP address close to many spammy sites, this raises questions about its reliability.

It can damage your reputation with Google.

Check who your IP neighbors are (if you have any) and your IP range with the IP Neighbors tool.

Duplication

Check that the site is not duplicated – with Google

Migration to HTTPS can cause duplication issues.

To see if your site has two SEO-damaging versions (HTTP and HTTPS), you can simply search for it with the following search operator:

site:example.com

And check if the same page appears twice.

Check that the site is accessible in its WWW and non-WWW versions – with Link Redirect Trace

The site should not be accessible in these two versions because it would be duplicated.

One of the two must be chosen as canonical, and the other must have a 301 redirect.

You can verify this with the Chrome Link Redirect Trace extension .

Here I entered the non-www version of my site in my browser:

Check that all pages on the site are accessible via URLs that are not case-sensitive

If your site is ever hosted on a server with a case-sensitive operating system, then you have two options:

add a canonical link on any uppercase variant pointing to the identical lowercase URL

use URL Rewriting to convert your URLs to lowercase

Verify that pagination avoids duplication issues

You can get an overview of the different solutions for presenting many products or pieces of content in this short article on pagination.

But to understand all the issues and set up effective pagination, head to a more comprehensive guide .

Site security

Check that the site has no security issues – with the Google Search Console

This is the Security Issues report just below Manual Actions.

Among the problems that your site may encounter …:

Hacked content: malicious content may be placed on the site without your knowledge if there is a security breach.

Unwanted or malicious software: Malware can be harmful to devices as well as to users.

Manipulation / scam: such as software that extracts information or captures sensitive user data.

Update WordPress and your plugins

While you’re at it, avoid any security issues by updating your CMS and its plugins.

Does the site have an SSL certificate? – with Screaming Frog

The use of HTTPS has been a Google ranking factor since 2014.

If you don’t already have this information, you can go to:

Overview> Crawl Data> Security> HTTPS URLs

If you do not have an SSL certificate, buy one and install it on your server as soon as possible.

Do all HTTP pages redirect to their version migrated to HTTPS?

Check the HTTP URLs (see above) in Screaming Frog and make sure the redirects are permanent (code 301).
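
If redirects are missing, here is a minimal sketch of a site-wide 301 redirect to HTTPS for an Apache server with mod_rewrite (to adapt to your own configuration):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]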

Is there any mixed content on the site?

Even with an SSL certificate, the site can still load part of its content from a location not protected by the security key.

This sometimes happens on images, audio / video files, on JavaScript …

Go down to the Mixed Content line of the Security menu of Screaming Frog to check that there is no unsecured content left.

To overcome this problem, you can secure these resources in batch (with a WordPress plugin for example like SSL Insecure Content Fixer ),

or block mixed content (something that Chrome does automatically now) with a CSP directive on the server.
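
For the CSP route, a common approach is to ask browsers to upgrade insecure requests; a minimal sketch for Apache with mod_headers might be:

Header set Content-Security-Policy "upgrade-insecure-requests"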

On WordPress: Install a security plugin

WordPress sites are particularly susceptible to attacks.

You can guard against this by installing a plugin like Wordfence … and then forget about it.

HTTP codes and redirects

HTTP codes represent the server’s response to the request made by the browser.

2xx codes indicate correct operation, 3xx codes indicate redirects, 5xx codes indicate various server problems, and 4xx codes indicate errors.

Check if any pages are returning 5xx errors

If many pages return a 5xx error code, there may be server problems.

It may be overloaded or need some configuration.

Check if any pages are returning 4xx errors

If many pages return a 404 code (not found) or a 401 code, this can negatively affect the UX.

And this goes for both internal and external links.

If backlinks point to your 404 pages, Google will ignore those links.

The best approach is to 301-redirect them to relevant pages that work.

Internal links to these pages should be removed, or replaced by links to existing pages.

Check that there is no “soft 404” type error

A “soft 404” page is a page which does not exist but which still returns a 2xx code.

In this case, it will continue to be indexed by Google and users will find themselves in a bind.

With the Headers tool, you can immediately get the server's response for any URL.

Verify that the site has a custom 404 page

A rudimentary error page looks a bit crude.

User experience is affected.

Above all, an error page must make it clear that the requested page does not exist or no longer exists.

Personalized and in the tone of your brand, it further encourages the user to stay on the site.
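
On an Apache server, pointing the 404 code to your custom page is a single line in the .htaccess file (the path is a placeholder):

ErrorDocument 404 /custom-404.html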

Check that there are no redirect chains

The Screaming Frog report on redirect chains can be found in the Reports menu of the top toolbar.

It is downloaded directly in .csv.

What you should know is that after 2 or 3 redirects, Google stops crawling a page.

Check that your redirects have the correct status – with Screaming Frog

Most of the time, you should use 301 (i.e. permanent) redirects.

302 redirects indicate temporary changes.

It is therefore necessary to ensure that there is no 302 redirect left by mistake.

Overview> Response Codes> Redirection (3xx) -> Status Code.

Check that there are no Meta Refresh redirects

These redirects tell the browser to redirect to another page after a set loading time.

This type of redirect has been abused (for doorway pages), but above all it hurts the user experience.

For example, it keeps the initial page in the browsing history.

You will find these possible redirects in the Response Codes menu on the Redirection line (Meta Refresh).
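
For reference, this is what a Meta Refresh redirect looks like in a page's <head> (the delay and URL are placeholders):

<meta http-equiv="refresh" content="5; url=https://www.example.com/new-page/">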

Check if the site is accessible in its HTTPS and HTTP versions

If the site has an SSL certificate (essential), all versions that are not in HTTPS must redirect permanently to URLs in HTTPS.

And of course, without a redirect chain.

Last checks

Clean up the plugins

WordPress plugins are very practical, but as you keep installing them:

You may have a duplicate (especially security or optimization plugins like Yoast / RankMath…).

There may be some you no longer use.

So review them and remove those that needlessly add to the site's loading time.

Conclusion

We cannot say that the technical SEO audit of a site is a piece of cake.

But what is certain is that these tasks become a little less thankless with a little organization, and when you have the solutions on hand to make them easier.

I hope this checklist has helped you see things clearly, and that you will use it without moderation …

Are there any checks that you think are missing?

Give me your opinion in the comments.
