A few posts back I shared how to set the search preferences in Blogger, and at the end of that post I promised that in an upcoming post I would cover the Blogger robots.txt. So today I bring a complete tutorial on the custom robots.txt file. It is a simple text file (not HTML) that gives search engine bots instructions about which pages to index and which to skip. Search engine bots and other crawlers usually obey these instructions.
This is a very important factor in SEO: with the help of this text file, you can block any irrelevant URL on your blog, such as label pages, demo pages, or any other page that is not important to index in search engines.
Default Robots.txt Setup for Blogger
By default, Blogger uses a robots.txt file like the one below.
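The following is a sketch of the default file Blogger generates; the sitemap URL uses example.blogspot.com as a placeholder for your own blog address:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```

Mediapartners-Google is the AdSense crawler; its empty Disallow line means it may crawl everything, so ads can be matched to your content.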
First, I want to explain the terms of this file, because it is very important to know what you are doing in your blog. If you make a mistake when editing this file, search engines can ignore your blog or reduce its ranking. Below, I explain all the terms of the robots.txt file.
When you mark User-agent with an asterisk (*), it means you allow all crawling bots, such as Googlebot, Bingbot, and the bots of other search engines and directories.
The Disallow: /search line blocks every URL that has the keyword search right after the domain name. In Blogger, all label and search-page URLs follow this structure, so this rule blocks all label and search pages from crawling.
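For example, with that rule in place (URLs shown with the placeholder domain example.blogspot.com):

```
User-agent: *
Disallow: /search

# Blocked by this rule, for example:
#   https://example.blogspot.com/search/label/SEO
#   https://example.blogspot.com/search?q=robots
# Not blocked (a normal post URL):
#   https://example.blogspot.com/2024/01/my-post.html
```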
The Sitemap line helps present your content to search engines: it invites the search bots to crawl your blog posts quickly, and it also notifies the search engines when you make any change on the blog.
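The sitemap entry is a single line, usually placed at the end of the file; again, example.blogspot.com is a placeholder for your blog's address:

```
Sitemap: https://example.blogspot.com/sitemap.xml
```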
How To Disallow a Post or Page Using the Robots.txt File?
If you want to prevent search engines from crawling a specific page or post, you can do so with the custom robots.txt file. Simply write Disallow: after the User-agent: * line, then a "/" followed by the path of your post or page without the domain name, like below.
Disallow: /p/your-page-url.html
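Putting it together, a custom file that blocks one static page and one post might look like this; the paths and domain are hypothetical placeholders, so substitute your real post and page paths:

```
User-agent: *
Disallow: /search
Disallow: /p/your-page-url.html
Disallow: /2024/01/your-post-url.html
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```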
How To Generate a Robots.txt File?
After reading this tutorial, I hope you will be able to create a robots.txt file yourself. It is very easy, and you don't need to learn any special language or technique. You can write the file manually or use an online robots.txt generator tool.
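Before you add the file to your blog, it is worth checking that your rules do what you expect. A minimal sketch using Python's standard-library urllib.robotparser, with the Blogger-style rules from this tutorial and a placeholder domain:

```python
# Check which URLs a robots.txt allows, using Python's standard library.
# The rules and the example.blogspot.com domain are placeholders for your own.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label/search pages are blocked for generic crawlers...
print(parser.can_fetch("*", "https://example.blogspot.com/search/label/SEO"))  # False
# ...but normal post URLs remain crawlable.
print(parser.can_fetch("*", "https://example.blogspot.com/2024/01/my-post.html"))  # True
```

Note that Python applies the rules in file order, so keep each Disallow line above the final Allow: / line, as in the examples here.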
Adding Custom Robots.txt File In Blogger
Once you have created your robots.txt file, you can easily add it in Blogger:
- Go to your Blogger dashboard.
- Select Settings > Search preferences > Crawlers and indexing > Custom robots.txt > Edit > Yes.
- Paste your robots.txt code in it.
- Click the Save changes button.
I hope that after reading this article you will be able to set a custom robots.txt in Blogger and also understand its terms. This is a great way to get your blog indexed in search engines and bring in traffic, but let me suggest once again that you use this option with caution, because if you add anything wrong, search engines could ignore your blog. If you need more help, leave your question in the comments box below.