If you are considering optimising a website, a technical SEO audit is the essential first step. It gives you a baseline from which to launch SEO campaigns that will help your website climb steadily up the rankings. As your website's rankings improve, you will also see an increase in your conversion rate and sales volume.
You can't build a house without a level foundation, and an SEO audit is the foundation of your website. It ensures that any optimisation work you undertake goes as planned, and that you won't have to go back and repair problems caused by the optimisation process itself. Using these SEO audit tools to get the website into healthy enough shape before optimisation begins saves you hours of backtracking to fix issues that the process uncovers.
An SEO audit is a process that verifies a site conforms to industry best practices, giving the webmaster a solid base on which to build a successful SEO campaign. A successful SEO campaign is essential to getting the most out of your website. An SEO audit involves a series of steps, and they must be followed in a particular order. Just as a pilot runs through a checklist before take-off, the audit process ensures everything is working as it should before any attempt is made to optimise your website.
This SEO audit checklist is designed to take you through the necessary steps of an audit from beginning to end, ensuring no stone is left unturned. Here are the steps of an SEO audit checklist:
In order to ensure your website can be seen in the search engine results pages (SERPs), it is important that these two files (i.e. Robots.txt and the XML sitemap) are included in the website's code. Robots.txt is a basic text file that is placed in your site's root directory and references the XML sitemap location. It tells the search engine bots which parts of your site they need to crawl, and stipulates which bots are allowed to crawl your site and which ones are not. It also acts as a signpost pointing bots towards the pages you want indexed, helping you gain more visitors to your website.
The XML sitemap is a file that contains a list of all the pages on your website. This file can also contain extra information about each URL in the form of metadata or thumbnail descriptions of the content found at the respective URL. Together with Robots.txt, the XML sitemap helps search engine bots crawl and index all the pages on your website. You can check the Robots.txt and XML sitemap files by following these three steps:
The XML sitemap can usually be checked by appending /sitemap.xml to the root domain. For example, type www.example.com/sitemap.xml into your web browser.
PixelTec.co.th, for example, has multiple sitemaps.
This is an advanced tactic that can help to maximize the indexation of your website and increase site traffic in some cases.
The experts over at Moz.com have written in detail about the value of multiple sitemaps. If you don’t find a sitemap on your website, you will need to create one. You can use an XML sitemap generator or utilise the information available at Sitemaps.org.
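For reference, a minimal XML sitemap follows the structure below. The URL and values are placeholders; only the <loc> element is required for each entry, while <lastmod>, <changefreq> and <priority> are optional:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```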
The Robots.txt URL can also be checked by appending /robots.txt to the root domain. For example, type www.example.com/robots.txt into your browser.
If a Robots.txt file exists, check to see if the syntax is correct. Syntax refers to the spelling and structure of the directives in the file. If it doesn’t exist, then you will need to create a file and add it to the root directory of your web server (you will need access to your web server). It is usually added to the same place as the site’s main “index.html”; however, the location varies depending on the type of server used.
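One way to sanity-check a Robots.txt file's syntax is to parse it with Python's standard `urllib.robotparser` module and test whether a given bot may crawl a given path. This is a minimal sketch that parses a sample file in memory; the sample rules are placeholders:

```python
from urllib.robotparser import RobotFileParser

# A placeholder robots.txt body; in practice you would fetch the live file
# with parser.set_url("https://www.example.com/robots.txt") and parser.read().
sample = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(sample.splitlines())

# The wildcard rule applies to all bots, so Googlebot is blocked from
# /private/ but free to crawl everything else.
print(parser.can_fetch("Googlebot", "/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "/blog/post.html"))     # True
```

If the parser silently allows paths you intended to block, that usually points to a syntax mistake in the directives.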
Open Robots.txt and add a directive containing the sitemap URL, which allows search engines to auto-discover the XML sitemap. Once the directive has been added, www.example.com/robots.txt will reference the sitemap location alongside the usual crawl rules.
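A minimal robots.txt with a sitemap directive added for auto discovery might look like this (the domain and crawl rules are placeholders):

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line is independent of the User-agent groups, so it can appear anywhere in the file, and a file can list more than one sitemap.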
The second stage of completing a technical SEO audit is to check every version of the homepage URL (http://, http://www., https:// and https://www.) to see whether each is accessible or redirects visitors to a single canonical version of the website.
It’s important to note that Google prefers sites that use HTTPS rather than HTTP. HTTPS (HyperText Transfer Protocol Secure) is essentially the secure version of HTTP (HyperText Transfer Protocol).
It is particularly important for e-commerce websites to use HTTPS, as they require increased security due to shopping carts and payment systems that handle sensitive bank account and/or credit card information.
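As a quick illustration (this helper is a sketch, not part of any official tool), the four URL variants to test can be enumerated programmatically before checking each one in a browser or crawler:

```python
# Hypothetical helper: list the four common URL variants of a domain.
# All four should resolve, ideally via 301 redirect, to one canonical version.
def url_variants(domain: str) -> list[str]:
    """Return the http/https and www/non-www combinations for a bare domain."""
    return [
        f"{scheme}://{host}"
        for scheme in ("http", "https")
        for host in (domain, f"www.{domain}")
    ]

print(url_variants("example.com"))
# ['http://example.com', 'http://www.example.com',
#  'https://example.com', 'https://www.example.com']
```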
It is important to check the domain age of a website, as this can affect SERP rankings. An aging website that hasn’t been updated for quite a while will suffer in the rankings.
Use whois.domaintools.com to check when the domain was registered and when it was last updated.
As outlined by the experts at Moz.com, site speed is one of Google’s ranking factors. Slow loading is also one of the main factors that deters visitors from exploring your website. Therefore, it’s important that a site’s load speed is checked and optimised.
The loading speed of your website can be quickly checked at tools.pingdom.com. This tool provides information regarding load time and how your site performs in comparison to other websites.
To improve the loading speed of individual pages, webmasters should look at compressing images, adopting a content delivery network and decreasing the server response time. Heavy custom coding and large image files can also slow down loading times. Remember to check the site speed of both the mobile and desktop versions of your website.
To learn more about improving page loading speeds, visit Google’s PageSpeed Insights.
There are a number of elements to check when assessing a URL’s ‘healthiness’. These include:
The page title (also referred to as the title tag) defines the title of the page and needs to be accurate and concise in its description of what the page is about. The page title appears in the SERPs as well as in the browser tab, and needs to be 70 characters or fewer in length. It is used by search engines and web users alike to recognize the topic of a page, so it’s important to ensure it is correctly optimised with the most important and relevant keyword(s).
Meta descriptions don’t directly affect SEO, but they do affect whether or not someone is going to click on your SERP listing. They act as explanatory sentences describing the content available on the respective page. Therefore, it’s important that the meta description is uniquely written and accurately describes the page in question – rather than simply taking an excerpt of text from the page itself. A snippet optimizer tool makes it easy to create meta descriptions that are the correct length (156 characters or fewer); most of these tools will preview the page title at the same time.
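The two length limits above can be checked with a few lines of code. This is a sketch using the character limits mentioned in this article (70 for titles, 156 for meta descriptions); the function name is illustrative, not a real library API:

```python
# Character limits taken from the guidance above.
TITLE_MAX = 70
META_DESCRIPTION_MAX = 156

def check_snippet(title: str, meta_description: str) -> list[str]:
    """Return a list of warnings for an over-length title or description."""
    warnings = []
    if len(title) > TITLE_MAX:
        warnings.append(f"Title is {len(title)} chars (max {TITLE_MAX})")
    if len(meta_description) > META_DESCRIPTION_MAX:
        warnings.append(
            f"Meta description is {len(meta_description)} chars "
            f"(max {META_DESCRIPTION_MAX})"
        )
    return warnings

print(check_snippet("A concise, keyword-rich title", "A unique summary."))
# []
```

An empty list means both elements are within the limits; anything returned is an item for the audit report.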
This is a long word that basically means ensuring a website doesn’t contain multiple versions of the same page. Canonicalization is important because otherwise the search engines don’t know which version of a page to show users. Multiple versions of the same content also cause duplication issues, which can confuse and frustrate viewers – so it’s important to ensure canonicalization issues are addressed.
If there are multiple versions of one page, the webmaster will need to redirect these versions to a single, dominant version. This can be done via a 301 redirect, or by utilizing the canonical tag. The canonical tag allows you to specify in the HTML header that the URL in question should be treated as a copy, while also naming the URL that the bots should read instead.
Within the HTML head of the page loading on the duplicate URL, there would be a tag like this:
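For example, if www.example.com/preferred-page/ were the dominant version, each duplicate page's HTML head could contain the following (the URL is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```

Bots reading the duplicate then treat it as a copy and credit the preferred URL instead.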
When performing an SEO site audit, page headings should be checked to ensure they include relevant keywords. However, they shouldn’t be over-optimised (i.e. the same keywords shouldn’t be repeated across the headings on a single page).
Make sure that headings are unique and specific to each page so that no duplication issues arise.
These meta tags tell the search engines whether or not they should index a certain page or follow the links that are placed on that page.
Index - tells the search engine to index a specific page
Noindex - tells the search engine not to index a specific page
Follow - tells the search engine to follow the links on a specific page
Nofollow - tells the search engine not to follow the links on a specific page
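These directives are combined in a single meta robots tag in the page's HTML head. For example, to keep a page out of the index and stop its links from being followed:

```html
<meta name="robots" content="noindex, nofollow" />
```

If no meta robots tag is present, search engines default to index, follow.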
Check which HTTP response status code is returned when a search engine or web user requests a page in their browser. It’s important to check the response codes of each page, as some codes can have a negative impact on the user experience and SEO.
200: Everything is okay.
301: Permanent redirect; everyone is redirected to the new location.
302: Temporary redirect; visitors are redirected to the new location, but ‘link juice’ (ranking value) is generally not passed on.
404: Page not found; the original page is gone and site visitors may see a 404 error page.
500: Server error; no page is returned, and neither site visitors nor search engine bots can access it.
503: Service unavailable; a better alternative to a 404 during temporary downtime, this response code essentially asks everyone to ‘come back later’.
Moz.com has some excellent additional information regarding response codes.
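The codes above can be turned into audit-report notes with a small helper. This is a sketch (the notes paraphrase the list above; in a real audit you would fetch each URL, e.g. with `urllib.request`, and pass the returned code in):

```python
from http import HTTPStatus

def describe_status(code: int) -> str:
    """Map an HTTP status code to an audit note, falling back to the
    standard reason phrase for codes not covered above."""
    notes = {
        200: "OK - page served normally",
        301: "Permanent redirect - link equity passes to the new URL",
        302: "Temporary redirect - link equity is generally not passed",
        404: "Not found - fix or redirect the broken URL",
        500: "Server error - investigate the server configuration",
        503: "Service unavailable - bots are asked to retry later",
    }
    return notes.get(code, HTTPStatus(code).phrase)

print(describe_status(301))
# Permanent redirect - link equity passes to the new URL
```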
Ever since Google released its Panda algorithm update, thin content has become a real issue for webmasters around the globe. Thin content can best be described as content that is both short on words and light on information; think of short, generic pages that don’t provide any information of real value.
It’s important to make sure that the pages on a website provide visitors (and search engines) with in-depth, informative and relevant content. In most cases, this means providing content that is longer – e.g. at least 300-500 words or more depending on the page and respective topic (note that product pages can usually get away with fewer words).
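A rough word-count screen can flag candidate thin pages for manual review. This sketch uses the 300-word floor suggested above, which is a rule of thumb from this article rather than an official Google threshold:

```python
import re

MIN_WORDS = 300  # rule-of-thumb floor suggested above, not an official number

def is_thin(page_text: str, min_words: int = MIN_WORDS) -> bool:
    """Return True if the page body has fewer words than the threshold."""
    return len(re.findall(r"\b\w+\b", page_text)) < min_words
```

Pages flagged by a check like this still need a human read: a short product page may be fine, while a 400-word page of generic filler is still thin.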
According to a comScore report released in mid-2014, internet access via mobile devices has overtaken internet access from desktops in the U.S. – and the rest of the world is heading in that same direction, if it hasn’t arrived already.
It’s important to note that mobile friendliness is now one of Google’s ranking factors, thanks to the ‘Mobilegeddon’ algorithm update.
It’s therefore critical to ensure that a website is mobile friendly to maintain your website’s ranking – something you can check via Google’s Mobile-Friendly Test tool.
This tool checks to see that the design of your website is mobile friendly.
Backlinks are the links on other sites that point back to the website in question; they can be seen as ‘votes of confidence’ by other web users in favour of a respected website.
Generally, the more backlinks a website has, the better it will rank – with some major exceptions. Large quantities of ‘dofollow’ links coming from a single domain are frowned upon and penalized by Google, as are links coming from poor-quality or unrelated websites.
In other words, effective backlinks are about quality over quantity. Unnatural backlinks can lead to a website receiving a manual penalty or being de-indexed. Google Support offers a detailed guide that explains how to disavow unwanted backlinks. Performing a disavow is a two-step process. You will need to download a list of all the links pointing to your website, and then create and upload a file to Google that details all the links that need to be disavowed. To avoid the necessity of going through this process, get in the habit of researching the links on your website regularly to ensure that they are all valid and relevant to the content they are linked to.
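For reference, the disavow file uploaded in the second step is a plain text file with one URL or domain per line; lines starting with # are comments. The addresses below are placeholders:

```
# Disavow a single spammy page:
http://spam.example.com/bad-link.html
# Disavow every link from an entire domain:
domain:spammydomain.example
```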
Once you have performed an in-depth SEO website analysis, areas in need of improvement will naturally become apparent. Then it’s simply a matter of rectifying these elements to ensure that the site is better placed to achieve a higher SERP ranking.
Once these areas of concern have been addressed, the site is now essentially ‘up to date’ in terms of SEO; from here on out it’s a matter of building upon this strong foundation to boost rankings, increase your conversion rate and achieve better results!
For experienced and knowledgeable SEO in Thailand and a free SEO audit, get in contact with us. We will provide you with a full strategy outline and a complete range of SEO services. Please contact us and take the next step with your digital marketing agency in Thailand.