SEO - 05 September 2022

JavaScript SEO covers all of the work done to help search engine bots better understand, crawl, and index sites built with JavaScript. Making sure that text, image, and video content is fully crawlable is a top priority for SEO professionals. As the number of JavaScript-heavy websites has grown, JavaScript SEO has grown with it, and modern front-end libraries and frameworks (React, Vue, Angular, etc.) have opened up a new field of study here.

At a more fundamental level, web pages consist of three main parts.

1. HTML – The skeleton and content layer of your website. A page consisting entirely of HTML can convey information, but it does not offer an attractive design.

2. CSS – The files that add visual design to HTML. Thanks to CSS, we get the polished web page layouts we are used to.

3. JavaScript – The programming language that makes a website interactive. When an area changes color as you hover over it, or new content loads as you scroll down the page, that is JavaScript at work. JavaScript files run in the background in response to the user's actions and produce the final page that is presented to the user.

With recent developments on the software side, many websites are now built entirely with JS. The markup is no longer delivered as complete HTML; instead, it is generated directly by JS files.

So, how well has Google adapted to these developments?

Google & JavaScript Relationship

In recent years, Google has become much better at understanding and making sense of JavaScript sites. However, crawling sites written directly in JS is very costly for Google: compared to a regular site, it takes more time to crawl and puts more strain on Google's servers. So what does Google do to optimize this expense?

Google uses a method called two-stage crawling (also known as two-wave indexing) to process pages written in JavaScript. In the first stage, Googlebot visits your site, crawls the HTML and CSS, and adds them to its index. It also notices the JavaScript files, but defers executing and understanding them to the second stage. In the meantime, it starts showing results based on what it saw in the first stage and queues your site for second-wave rendering. This can take hours or days; Google prioritizes pages according to their value and authority. When the second wave arrives, it renders the JavaScript, adds the resulting content to the index, and only then shows users the final version you actually designed.

How Is JavaScript Different From Other Languages For Google?

The browsers we use parse HTML, CSS, and JS through render engines, so a page must be rendered before it can be displayed. Server-side languages such as PHP and Python generate the HTML on the server. JS-based frameworks such as Angular and React, on the other hand, can be rendered with either the Server-Side Rendering (SSR) or the Client-Side Rendering (CSR) method.

How Does Googlebot Crawl JavaScript Sites?

There are two methods for search engine bots to correctly crawl and index sites written in JavaScript:

  • Client Side Rendering (CSR) (Including Dynamic Rendering)
  • Server Side Rendering (SSR)

What is Client-Side Rendering (CSR) and Server-Side Rendering (SSR)?


Client-Side Rendering (CSR)

  • The user's browser or the search engine bot sends a request to your address.
  • The server accepts the request and returns a response.
  • The browser downloads the content and the accompanying JS files.
  • The browser executes the JS files to build the content.
  • The content becomes visible and interactable for users and bots.
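The flow above can be sketched as a minimal CSR page: the server returns an almost empty HTML shell, and a script builds the visible content in the browser. This is an illustrative sketch, not a real framework; the `/api/products` endpoint and the `app` element id are assumptions.

```javascript
// Minimal client-side rendering sketch (illustrative assumptions:
// the endpoint name and element id are made up for the example).
// The server ships only an empty shell; this code builds the content.

// renderList turns raw data into the HTML the user finally sees.
function renderList(items) {
  return '<ul>' + items.map(i => `<li>${i.name}</li>`).join('') + '</ul>';
}

// In the browser, the shell's script would fetch data and inject the markup:
// fetch('/api/products')
//   .then(res => res.json())
//   .then(items => { document.getElementById('app').innerHTML = renderList(items); });
```

Until that script runs, the HTML response contains no content at all, which is exactly why CSR can look like a blank page to a crawler that does not execute JavaScript.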


Pros and Cons of CSR

  • (+) Faster processing after the first page load
  • (+) Puts less load on the server
  • (+) Enables rich site interaction
  • (+) Provides fast in-site navigation (fewer HTTP requests to the server, since assets don't have to be re-downloaded on every page change)
  • (-) Slower loading of the first page (assets required for multiple pages are loaded up front)
  • (-) Delayed rendering on devices with weak processors
  • (-) Slow loading on weak internet connections
  • (-) Negatively affects SEO if not implemented properly (the page appears blank until the JavaScript runs and the content is generated)
  • (-) External libraries are needed quite often

SEO Risks of CSR

Your page, built with the most modern technologies, may appear to Google as a blank page. If your site is written with modern JavaScript libraries, regularly check how Google sees it via the "View crawled page" option in the URL Inspection tool in Google Search Console. If your page is not yet in Google's index, you can also run a live URL test.

Dynamic Rendering

Dynamic Rendering is a method used to make CSR websites SEO-friendly. Its defining feature is serving different rendering outputs to Googlebot and to users. When a request reaches the server, the server determines whether it comes from Googlebot or from a user. If it is Googlebot, the server returns server-rendered HTML; users continue to receive the site as CSR.

Server-Side Rendering (SSR)

  • The user's browser or the search engine bot sends a request to your address.
  • The server accepts the request, renders the complete HTML, and returns the response.
  • The browser downloads the generated HTML and the JS files.
  • The content becomes visible and interactable for users and bots.

Pros and Cons of SSR

  • (+) Improves user experience by making pages load faster.
  • (+) Advantageous in terms of SEO.
  • (+) A much better fit for sites that serve mostly static content.
  • (+) Requires fewer JS dependencies.
  • (+) Also works well when the user's internet connection is slow.
  • (-) With heavy traffic or a large site, rendering every page on the server can cause significant slowdowns (TTFB may suffer).
  • (-) The whole page has to be reloaded on every navigation.
  • (-) Server costs are higher, since more server power is needed for good performance.
