12 January 2019

How to Add Custom Robots.txt File in Blogger

Robots.txt is a simple text file that webmasters use to tell search engines and other web crawlers how to crawl a website, i.e. which pages or directories may be crawled. It therefore plays an important role in the search engine optimization (SEO) of your website.


A robots.txt file isn't used to deindex web pages from search engine results pages (SERPs). For that purpose, you have to add a noindex robots meta tag to each web page that you don't want indexed. We have already discussed how to add the necessary robots meta tags to your Blogger blog.
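For reference, a noindex robots meta tag is a single line placed in the head of the page you want removed from search results:

```html
<!-- Tells compliant crawlers not to include this page in search results -->
<meta name="robots" content="noindex">
```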

In this tutorial, we will show you how to add a custom robots.txt file to your Blogger blog. Let's first discuss where this robots.txt file is located on your blog and how you can access it.

How to view robots.txt file of your Blogger blog


Blogger already provides a default robots.txt file for your blog, which is located in the root directory of your blog at the /robots.txt address. Follow this step-by-step guide to view the robots.txt file of your blog -
  1. Open Google Chrome or any other web browser.
  2. Type your full blog address in the URL bar and add /robots.txt at the end of the address.
  3. Now press Enter and it will load the robots.txt file of your blog.
This way, you can check the robots.txt file of any website, even Google or Blogger itself.
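As a quick illustration, the robots.txt address can be derived from any page URL on a site, because the file always sits at the root of the host. A minimal Python sketch (example.blogspot.com is a placeholder address):

```python
from urllib.parse import urljoin

def robots_url(page_url: str) -> str:
    # robots.txt always lives at the root of the host,
    # no matter which page of the blog you start from
    return urljoin(page_url, "/robots.txt")

print(robots_url("https://example.blogspot.com/2019/01/some-post.html"))
# https://example.blogspot.com/robots.txt
```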

View robots.txt of your blog using Google Search Console


You can view and test the robots.txt file of your Blogger blog using Google Search Console (formerly Google Webmaster Tools). Follow these steps -
  1. Go to Google Search Console and switch to the old version.
  2. Now open your blog's property and go to Crawl > robots.txt Tester.
  3. Here you will find the robots.txt file of your blog. If there are any syntax warnings or logic errors, Google will display them below the editor.
  4. You can also enter a URL and test whether it is blocked for a particular Google web crawler (such as Googlebot, Googlebot-Image, Mediapartners-Google, or AdsBot-Google).
Note: Any changes you make in the tool's editor are not automatically saved to your blog's web server. The tool is intended for testing purposes only.

Test robots.txt file using a third-party tool


Google Search Console's robots.txt Tester allows you to test URLs against Google's web crawlers only. A number of third-party tools are available that can help you test a robots.txt file against other crawlers.

For example, you can use the robots.txt Validator and Testing Tool by TechnicalSEO. Simply enter the URL of any website you want to test and select a User-Agent to check whether any file or directory is blocked for that web crawler.
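You can also test rules locally with Python's standard urllib.robotparser module. The sketch below parses a robots.txt modeled on Blogger's default file (treat the exact contents as an assumption; check your own blog's /robots.txt) and asks whether specific URLs are crawlable:

```python
from urllib.robotparser import RobotFileParser

# Modeled on Blogger's default robots.txt; your blog's file may differ.
ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Label pages live under /search, so general crawlers are blocked there
print(parser.can_fetch("Googlebot", "https://example.blogspot.com/search/label/SEO"))  # False

# Ordinary posts match "Allow: /" and stay crawlable
print(parser.can_fetch("Googlebot", "https://example.blogspot.com/2019/01/post.html"))  # True
```

Unlike the Search Console tester, this check runs entirely offline, so it is handy for validating a draft file before you save it to your blog.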

Basic terms used in Robots.txt file


Before you add a custom robots.txt file in your Blogger blog, you should know about the basic terms that are used in a robots.txt file.
  • User-agent: This is used to target web crawlers. You can either specify the name of a particular web crawler or use an asterisk (*) to target all web crawlers.
  • Disallow: It tells robots which pages, files or directories they are not allowed to crawl. In Blogger's default robots.txt file, the /search directory (label pages) is blocked for all web crawlers.
    If you want to disallow a particular blog post, use this syntax: Disallow: /yyyy/mm/post-url.html, where yyyy and mm refer to the year and month of the post. Similarly, you can disallow a Blogger page like this: Disallow: /p/page-url.html
    If Disallow is left blank, web crawlers can access your entire website.
  • Allow: It tells robots which pages, files or directories they are allowed to crawl. Allow: / means that all content of your website is allowed to be crawled.
  • Sitemap: Here you can add the URL of your website's or blog's XML sitemap.
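Putting these directives together, a file along the lines of Blogger's default robots.txt looks like this (example.blogspot.com stands in for your blog's address):

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```

Here the Mediapartners-Google crawler (used by AdSense) is allowed everywhere, label pages under /search are blocked for all other crawlers, and the sitemap URL is declared for everyone.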

How to add custom robots.txt file in Blogger


Blogger has an option in its search preference settings that allows you to add a custom robots.txt file to your blog. Here's a step-by-step guide -
  1. Login to your Blogger Dashboard and go to Settings > Search preferences.
  2. Under Crawlers and indexing, click the Edit link next to the Custom robots.txt option.
  3. Now select Yes and paste your robots.txt code into the box. You can either use an online tool to generate your robots.txt file or create it manually.
  4. Finally, click the "Save Changes" button. That's it.
Note: We recommend not changing the default robots.txt file of your Blogger blog. However, if you do want to disallow any web pages or directories, make sure to test and validate your custom robots.txt file very carefully.

We hope you found this guide helpful for adding a custom robots.txt file to your Blogger blog. If you face any problems, feel free to share them in the comment section below.
