Thu Mar 31 04:16:33 2022

What Is a "Crawler-Friendly" Website?

Emveep

Crawler-friendly is often defined as a bot's ability to crawl a website and index its pages so that the bot understands the content. Put another way, it means both bots and humans can make sense of what is on your pages. So a crawler-friendly website has two indicators: humans need to understand your content, and the robots that index your website need to understand it as well. If bots interpret the links on your website incorrectly, their ability to read your pages suffers.

If bots read your pages poorly, it will be complicated to achieve search engine rankings later on, and you will lose the benefits and advantages of SEO. So what does a crawler-friendly website actually look like?

What Does a Crawler-Friendly Website Look Like?

Crawler-friendly

crawler friendly website

Not crawler-friendly

not crawler friendly website

You can see the difference between a website that is crawler-friendly and one that isn't. In the second case, robots have difficulty reading and understanding the content: the robot only reads the JavaScript, while the HTML and CSS of the page are locked behind that JavaScript. Next, we will discuss which tools you can use to check your website and how to handle a website that is not crawler-friendly.

How to Check Whether a Website Is Crawler-Friendly?

There are three methods to check whether a website is crawler-friendly:

1. Using Google's Mobile-Friendly Test tool

The first method you can use is Google's Mobile-Friendly Test tool.

The first step is to search for the tool on the Google search engine.

search for mobile friendly test

In the second step, once you have found it, enter your website link and run the test.

testing live url

live url on google mobile friendly

In the third step, click "View Tested Page," which appears beside the screenshot results.

result testing live url

In the fourth step, if the screenshot still does not show your content, you can check again with a copy of the HTML code.

copy html code

Then search for an HTML viewer and select Code Beautify's HTML viewer.

search html viewer

Paste the HTML code into the left box, then click run/view.

run html viewer

If the results in the right box show the content of your website, then your page is already crawler-friendly.

result html viewer
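The manual HTML-viewer check above can also be scripted. Below is a minimal sketch (the function name and the text-length threshold are my own illustrative choices, not from any specific tool): it strips `<script>` and `<style>` blocks from a page's raw HTML and checks whether meaningful visible text remains. If almost nothing is left, the page likely relies on JavaScript to render its content.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, ignoring the contents of <script> and <style>."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # >0 while inside <script>/<style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth > 0:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.chunks.append(data.strip())

def looks_crawler_friendly(raw_html, min_text_chars=50):
    """Heuristic: a page whose raw HTML carries almost no visible text
    probably renders everything with JavaScript."""
    parser = TextExtractor()
    parser.feed(raw_html)
    visible = " ".join(c for c in parser.chunks if c)
    return len(visible) >= min_text_chars
```

Feeding it the raw HTML of a client-rendered SPA (a near-empty `<body>` plus script tags) returns `False`, while a server-rendered article page returns `True`.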

 

2. Using Screaming Frog

The second method is to check with the Screaming Frog desktop application. Download it first if you don't already have it.

The first step is to open the Screaming Frog application and enter your website link.

Click start, and Screaming Frog will crawl your website.

If the address list contains no HTML pages, only JavaScript files, your website is not crawler-friendly.

screaming frog result

3. Using Chrome DevTools - Network Conditions

In the last method, you can check it via the browser's inspect tool.

The first step is to open your website, then right-click and select Inspect.

inspect element

Select network

select network

Click network conditions

click network conditions

Uncheck "Use browser default" under the user agent setting, then select Googlebot.

checklist googlebot

Click reload, and the result will look like this. Then click your website URL. If the preview reflects your website content, the website is crawler-friendly.

result inspect element
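The DevTools check above can also be approximated from the command line: request the page while presenting Googlebot's user-agent string and inspect what comes back. Here is a minimal standard-library sketch (the example URL is a placeholder; `fetch_as_googlebot` is my own illustrative name):

```python
import urllib.request

# Googlebot's published user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url):
    """Fetch a URL while presenting Googlebot's user-agent string.
    Returns the raw response body as text."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Example (placeholder URL):
# html = fetch_as_googlebot("https://example.com/")
# If the returned HTML contains your real content, crawlers can see it.
```

Comparing the body returned here against what a normal browser receives is a quick way to spot pages that only render via JavaScript.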

What If My Website Is Not Crawler-Friendly?

What should you do if your website is not crawler-friendly? It depends on which of two situations you are in: a website still in development, or an existing website.

In-development

  1. Get buy-in from the dev team as soon as possible. Make sure they understand the issue.
  2. Give them a few references:
  • Google for: "SEO for developers."
  • Google for: "JavaScript SEO Google."

Improve existing websites

Rework the site to use server-side rendering

Server-side rendering allows developers to populate web pages with custom user data directly on the server. It is usually faster to handle all requests on the server than to make additional browser-to-server round trips. This is what developers typically did before client-side rendering became common. For more information, you can read the article Understanding server-side rendering.
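As an illustration of the idea, here is server-side rendering in miniature (hypothetical page data and handler, not a production setup): the server assembles the complete HTML, content included, before the response is sent, so a crawler receives the finished page rather than an empty shell plus scripts.

```python
import html
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_article_page(title, body_text):
    """Build the complete page on the server, content included."""
    return (
        "<!DOCTYPE html><html><head><title>{t}</title></head>"
        "<body><h1>{t}</h1><p>{b}</p></body></html>"
    ).format(t=html.escape(title), b=html.escape(body_text))

class SSRHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The crawler gets finished HTML, not an empty shell plus scripts.
        page = render_article_page("Hello", "This text is visible to crawlers.")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(page.encode("utf-8"))

# To serve locally: HTTPServer(("", 8000), SSRHandler).serve_forever()
```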

Implement dynamic rendering

With good dynamic rendering, your website content can be indexed properly. Not every page needs dynamic rendering, and keep in mind that dynamic rendering is a workaround for crawlers. Dynamic rendering requires your web server to detect crawlers by checking the user agent. Requests from crawlers are routed to a renderer; requests from users are served normally. When needed, the dynamic renderer serves a crawler-friendly version of the content, such as a static HTML version. You can choose to enable dynamic rendering for all pages or per page. For reference, you can read the article Implement dynamic rendering.
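The user-agent check described above can be sketched like this (the bot token list is an illustrative subset, not an exhaustive one, and the function names are my own):

```python
# Common crawler user-agent tokens (illustrative subset, not exhaustive).
CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot",
                  "facebookexternalhit", "linkedinbot", "twitterbot")

def is_crawler(user_agent):
    """Return True if the user-agent string matches a known crawler token."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def route_request(user_agent):
    """Dynamic rendering in one decision: crawlers are routed to the
    renderer, everyone else gets the normal client-side app."""
    return "renderer" if is_crawler(user_agent) else "normal"
```

The matching is deliberately case-insensitive and substring-based, since real crawler user-agent strings embed these tokens inside longer version strings.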

Using 3rd party pre-render service (for example, using prerender.io)

Prerender.io regularly renders your website using the latest Chrome and saves all the generated HTML pages in its database. It also gives you an API so you can retrieve the rendered HTML for each of your website's URLs.

The only thing you need to do is add a proxy that checks user agents. If the user agent is a search engine or another crawler (Facebook, LinkedIn, etc.), you make an API call, get the rendered HTML from prerender.io, and return it to the crawler. If the user agent is not a crawler, you return your SPA's index.html so the JavaScript can run. Prerender.io has configurations for all familiar web servers: Apache, Nginx, HAProxy, Express, etc.
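The proxy logic just described can be sketched as follows. Treat the service URL and the `X-Prerender-Token` header as assumptions based on prerender.io's documented pattern and check their docs before relying on them; `PRERENDER_TOKEN` and `SPA_INDEX_HTML` are placeholders.

```python
import urllib.request

PRERENDER_SERVICE = "https://service.prerender.io/"   # assumed service URL
PRERENDER_TOKEN = "YOUR_TOKEN"                        # placeholder token
SPA_INDEX_HTML = "<html>...app shell...</html>"       # placeholder app shell

# Illustrative subset of crawler user-agent tokens.
CRAWLER_TOKENS = ("googlebot", "bingbot", "facebookexternalhit", "linkedinbot")

def is_crawler(user_agent):
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def handle_request(page_url, user_agent):
    """Proxy decision: crawlers get pre-rendered HTML from the service,
    regular browsers get the SPA shell so the JavaScript can run."""
    if not is_crawler(user_agent):
        return SPA_INDEX_HTML
    req = urllib.request.Request(
        PRERENDER_SERVICE + page_url,
        headers={"X-Prerender-Token": PRERENDER_TOKEN},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

In production you would put this decision in your web server or middleware (prerender.io ships ready-made configurations for that), but the branch on the user agent is the whole idea.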

Let's Chat


Our experienced team is ready. Reach out and tell us your idea, even if you’re not sure what your next step is.