To score well in Google’s organic search results, it is crucial that you have the technology of your site in order. Technology is one of the most essential elements of SEO web development. On the internet you can find plenty of checklists with which you can run an excellent audit of your website. What is often missing, however, is the context: why do you have to do certain things, and why not others? If you do not understand the “why” behind every SEO tip, you overlook opportunities and problems that you did not know existed.
Today we are going to cover three vital points that, strangely enough, are often underestimated: JavaScript, site speed and structured data.
To understand how these three elements can influence SEO, we first have to look at how Google, and thus the ranking process, works.
- Crawling. Googlebot visits your pages. Only the HTML is read, looking for links (a href) to continue crawling.
- Indexing. The actual ‘rendering’ of the page, in two steps: first the raw HTML code, then the complete page including JavaScript. This process also determines whether a page is included in the Google index.
- Ranking. Google’s ever-enigmatic Pandora’s box, which determines the order in which pages appear for specific searches.
JavaScript & SEO
A question that is often asked is: can Google crawl/index JavaScript? To be honest, that is the wrong question to ask. The simple answer to it is: “Yes!” But that does not mean that the use of JavaScript cannot have negative consequences for SEO.
To begin with: JavaScript is not necessarily bad. It can be beneficial for the functionality of a website and thereby contribute significantly to user-friendliness. In most cases, however, JavaScript is only executed in the user’s browser. The way in which JavaScript is used can then come at the expense of the efficiency with which the site is crawled and indexed.
Progressive enhancement
In theory, JavaScript is best used as the last functional layer on top of an already working HTML/CSS website. This means that a site without JavaScript should already function correctly, and that JavaScript is only added for user-friendliness and/or additional, complex functionality (such as calculators or widgets).
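A minimal sketch of what that layering can look like (the ID and the helper function are made up for illustration): the link works as plain HTML, and JavaScript only adds convenience on top of it.

```html
<!-- Base layer: works without JavaScript; the link is plain HTML -->
<a id="product-link" href="/products/blue-widget">Blue widget</a>

<!-- Enhancement layer: JavaScript adds behaviour, but is not required -->
<script>
  var link = document.getElementById('product-link');
  if (link) {
    link.addEventListener('click', function (event) {
      // Illustrative enhancement: open the product in an overlay instead of
      // a full page load, while the href keeps working for crawlers and for
      // users without JavaScript. openProductOverlay is a hypothetical helper.
      event.preventDefault();
      openProductOverlay(link.href);
    });
  }
</script>
```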
However, if JavaScript is necessary for the basic functionality of a website, for example for displaying the list of products on a category page of an e-commerce site, that is when SEO problems can arise. Because JavaScript generates the products, and with them the links to the product detail pages, those links are not in the source code (HTML). This prevents the crawler from finding the links in the first place, so it does not crawl them.
The result: the products are not found at first and therefore receive no link value. Only when the page is rendered during the indexing process do the products and links become visible. That costs Google considerably more effort than if everything had been in the source code.
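To make the difference concrete, here is a simplified sketch of the problematic pattern (the endpoint, IDs and data are invented for the example): the raw HTML contains only an empty container, and the product links exist in the page only after JavaScript has run.

```html
<!-- Raw HTML as the crawler sees it: no product links present -->
<div id="product-list"></div>

<script>
  // The products (and their links) are fetched and injected client-side,
  // so the a-href elements do not exist until this script has executed.
  fetch('/api/category/widgets') // illustrative endpoint
    .then(function (response) { return response.json(); })
    .then(function (products) {
      var list = document.getElementById('product-list');
      products.forEach(function (product) {
        var item = document.createElement('a');
        item.href = product.url;
        item.textContent = product.name;
        list.appendChild(item);
      });
    });
</script>
```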
Impact on crawl budget
If Google can find it eventually, then there is nothing wrong? Wrong! Why? That is where the term ‘crawl budget’ comes into play. The crawl budget indicates how much attention Google gives to a site. Its size is not public, varies per site and is determined by the link profile. The higher the authority of a website, the more attention the site receives from Google.
What exactly is the crawl budget? It is a collective name for various things in the crawling process: URL importance, crawl prioritization, crawl schedules, and so on. Does the crawl budget indicate exactly how many pages are crawled? Yes and no. Yes, because a larger crawl budget also means that more pages are visited. No, because you can also influence this number by making your site more crawl-friendly. If the crawl budget had to have some sort of unit, it would be ‘time.’ A few examples:
- Can you make sure that no JavaScript has to be rendered? That saves a lot of time, and that time can be used to crawl other pages.
- Can you make sure your site loads twice as fast? Then, roughly speaking, twice as many pages can be crawled in the same amount of time (see the back-of-the-envelope sketch below).
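As a back-of-the-envelope illustration (all numbers are invented for the example), treating the crawl budget as a fixed amount of time makes the effect of page speed easy to see:

```javascript
// Illustrative only: the crawl budget treated as a fixed time budget.
var crawlBudgetSeconds = 600;   // hypothetical time Google spends on the site
var slowPageSeconds = 3;        // average fetch/render time per page
var fastPageSeconds = 1.5;      // the same site, twice as fast

console.log(Math.floor(crawlBudgetSeconds / slowPageSeconds)); // 200 pages crawled
console.log(Math.floor(crawlBudgetSeconds / fastPageSeconds)); // 400 pages crawled
```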
Implementation & testing of structured data
Structured data must be implemented in the code. There are several ways to do this, of which JSON-LD is the most flexible and, moreover, the way Google recommends. The advantage of JSON-LD is that you do not have to mess around in the code, nor provide different elements with extra bits of markup (as with microformats). JSON-LD is implemented as a readable block in the head of the page.
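As an illustration, a minimal JSON-LD block for a product page could look like this (the values are made up, and the schema.org Product type is just one of many possible types):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue widget",
  "description": "Example product used purely for illustration.",
  "offers": {
    "@type": "Offer",
    "price": "19.95",
    "priceCurrency": "EUR"
  }
}
</script>
```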
Finally, a few tips regarding the implementation:
- Use Google’s structured data testing tool to check whether the implementation went well.
- Do not use Google Tag Manager to add JSON-LD to the code permanently. The disadvantage is that it works with external JavaScript, so the markup is only read during indexing. JSON-LD in the head is in the source code and is therefore read sooner. This method is, however, excellent for testing structured data (see the sketch after this list).
- The same applies to the data highlighter tool in Google Search Console. This too is great as a test.
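For context, this is roughly what tag-manager-style injection boils down to (a simplified, hypothetical sketch, not Tag Manager’s actual code): the JSON-LD script element is created by JavaScript, so it is not present in the raw HTML that the crawler reads first.

```html
<script>
  // Simplified sketch of injection via JavaScript: the structured data only
  // exists in the DOM after this script has executed, so it is only picked up
  // once the page is rendered during indexing.
  var jsonLd = document.createElement('script');
  jsonLd.type = 'application/ld+json';
  jsonLd.text = JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue widget" // illustrative value
  });
  document.head.appendChild(jsonLd);
</script>
```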
Where do I start optimizing?
I hope that I have given you valuable tips on how to continue optimizing your website. Do you have any good suggestions for relatively low-effort optimizations? Then I would like to hear them in the comments. Good luck!