How to Enable Custom Robots.txt files in Blogger

Robots.txt is a way to tell search engines whether or not they are allowed to index your pages in the search results. Search bots are automated, and before they access your site, they check the robots.txt file to see which pages they are allowed to visit. Most people do not want to stop search engines from crawling their entire website; instead, they want to mark specific pages that should not be indexed in the search results.
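To make this concrete, here is a minimal sketch (in Python, using the standard-library urllib.robotparser; the bot name and URLs are made up for illustration) of the check a well-behaved crawler performs before fetching a page:

```python
from urllib.robotparser import RobotFileParser

def may_crawl(rules_text, user_agent, url):
    """Return True if the given robots.txt rules permit fetching url."""
    parser = RobotFileParser()
    parser.parse(rules_text.splitlines())
    return parser.can_fetch(user_agent, url)

# Illustrative rules: block everything under /private, allow the rest.
rules = "User-agent: *\nDisallow: /private\n"

print(may_crawl(rules, "ExampleBot", "http://example.com/private/page"))  # False
print(may_crawl(rules, "ExampleBot", "http://example.com/index.html"))    # True
```

A crawler that gets False here is expected to skip the page entirely rather than fetch it.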

First of all, go to:

  1. Blogger.com >> your site
  2. Settings
  3. Search preferences
  4. Crawlers and Indexing

Now you will see two options: Custom robots.txt and Custom robots header tags. Both give you the flexibility to customize how crawlers handle your site.

Custom robots.txt: This option lets you edit your entire robots.txt file. You simply type in the rules that say whether or not a piece of content may be crawled by spiders. You can always undo your changes and return to the default file.

Custom robots header tags: This option works a little differently. It does not let you write your own rules; instead, it provides several options with checkboxes. If you are not familiar with robots header tags, it is best to leave this one alone.

Now you have to activate the custom robots.txt file, so press the Edit link next to the “Custom robots.txt” option. After that, Blogger will ask whether you want to enable custom robots.txt content; select “Yes” and proceed to the next step.

Custom Robots.txt files in Blogger

In the text area, type the rules for the content you want to exclude from crawling. After editing the file according to your needs, press the Save button to finish. If you later want to return to the default robots file, select “No” instead of “Yes.”

Custom Robots.txt

For Example:

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.leetblogger.com/feeds/posts/default?orderby=UPDATED
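You can sanity-check rules like these before saving them, for example with Python's standard urllib.robotparser (the post URL below is just an illustrative path, not a real page):

```python
from urllib.robotparser import RobotFileParser

# The example rules from the post: block /search pages, allow everything else.
rules = """\
User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Search/label pages are blocked from crawling...
print(parser.can_fetch("*", "http://www.leetblogger.com/search/label/news"))  # False
# ...but ordinary post URLs remain crawlable.
print(parser.can_fetch("*", "http://www.leetblogger.com/2014/01/post.html"))  # True
```

This confirms the intent of the file: label and search-result pages stay out of the index while the posts themselves remain reachable.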

We hope this tip helps you. If you experience any issues with crawling and indexing, do not hesitate to leave your questions in the comments below and our experts will try to help you solve the problem.

About Cyb3r
