Saturday, December 16, 2017

How to Set Up a Custom Robots.txt File in a Blogger Blog

A few posts back, I shared how to set the search preferences in Blogger, and at the end of that post I promised that in an upcoming post I would cover the Blogger robots.txt. So today I bring you a complete tutorial on the custom robots.txt file. It is a simple text file (not HTML) that tells search engine bots which pages to index and which to skip. Search engine bots and other crawlers usually obey these instructions.
This is a very important factor in SEO. With the help of this text file, you can block any irrelevant URLs on your blog, such as label pages, demo pages, or any other page that is not important to index in search engines.

Set Robots.txt File in Blogger

Default Robots.txt File in Blogger

In Blogger, the default robots.txt looks like this:

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /

First, I want to explain the terms of this file, because it is very important to know what you are doing in your blog. If you make any mistake in this file, search engines can ignore your blog or reduce its ranking. Below I explain all the terms of the robots.txt file.
User-agent: Mediapartners-Google

Mediapartners-Google is the name of the Google AdSense crawling bot, and this line is only useful for blogs that use Google AdSense. It helps display ads relevant to your niche on your blog. If you are not using Google AdSense on your blog, you don't need to keep this line.
When you mark User-agent with an asterisk (*), it means you allow all crawling bots, such as Googlebot, Bingbot, and other search engine and directory bots.
Disallow: /search
This rule blocks all URLs that have the keyword search right after the domain name. In Blogger, all label and search-page URLs have this structure, so this line blocks all label and search pages from crawling.
Allow: / 

This rule applies to the homepage and invites search engine spiders to crawl and index your blog's home page.

A sitemap helps show your content to search engines; it invites search bots to crawl your blog posts quickly and notifies search engines when you make changes to your blog.
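You can add your sitemap to the robots.txt file with a Sitemap line like the one below (yourblog.blogspot.com is a placeholder; use your own blog address):

```
Sitemap: https://yourblog.blogspot.com/sitemap.xml
```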

How To Disallow post Or Page Using Robots.txt File?

If you want to prevent any page or post from being crawled by search engines, you can do so with the custom robots.txt file. Simply add a Disallow: line under User-agent: *, then write the post or page path without the domain name, like below.
For post

Disallow: /2015/04/your-post-url.html
For page
Disallow: /p/your-page-url.html
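Putting all of the rules above together, a complete custom robots.txt for Blogger might look like this (the post path and blog address are placeholders for your own URLs):

```
User-agent: Mediapartners-Google

User-agent: *
Disallow: /search
Disallow: /2015/04/your-post-url.html
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml
```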

How to generate robots.txt file?

After reading this tutorial, I hope you will be able to create a robots.txt file. It is very easy, and you don't need to learn any special language or techniques. You can write the file manually or use one of the many online robots.txt generator tools.
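Before pasting the file into Blogger, you can sanity-check your rules locally with Python's built-in urllib.robotparser module. This is just a quick verification sketch; the blog URL is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the default Blogger robots.txt
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label/search pages are blocked for all crawlers...
print(parser.can_fetch("*", "https://yourblog.blogspot.com/search/label/SEO"))  # False
# ...but ordinary post URLs are still allowed.
print(parser.can_fetch("*", "https://yourblog.blogspot.com/2015/04/your-post-url.html"))  # True
```

If a URL you expect to be indexed prints False here, fix the rule before saving it in Blogger.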

Adding Custom Robots.txt File In Blogger

Once you have created your robots.txt file, you can easily add it in Blogger:
  • Go to your Blogger dashboard.
  • Select Settings, then Search preferences > Crawlers and indexing > Custom robots.txt > Edit > Yes.
  • Paste your robots.txt code in it.
  • Click the Save Changes button.

I hope that after reading this article you will be able to set a custom robots.txt in Blogger and also understand its terms. This is a great way to get your blog indexed in search engines and bring in traffic, but again, I suggest using this option with caution, because adding anything wrong could cause search engines to ignore your blog. If you need more help, post your question in the comments box below.
