How To Add Custom Robots.txt File in Blogger

The robots.txt file is a small but powerful tool that helps website owners and webmasters control how their website is crawled and indexed by search engines. It is a simple text file placed in the root directory of a website and is used to tell web robots, also known as crawlers or spiders, which pages or files on the site they can or cannot access. In this blog post, we will discuss the importance of the robots.txt file, its syntax, and how to use it effectively.

The Importance of robots.txt File


The robots.txt file is an essential tool for website owners and webmasters who want to control how their website is crawled and indexed by search engines. It helps search engines like Google, Bing, Yahoo, and others understand which pages or sections of the website should be indexed and displayed in search results. By using robots.txt, you can keep sensitive pages or sections that you do not want to appear in search results from being indexed.
In addition to search engines, other web robots can also crawl your website, such as web scrapers, which extract data from your site for various purposes. With robots.txt, you can ask these robots not to access your website or specific pages, which can help protect your website's content and information. Keep in mind that robots.txt is advisory: well-behaved robots honor it, but it is not an access-control mechanism.

Syntax of robots.txt File


The robots.txt file uses a simple syntax, which consists of two main parts: user-agent and disallow. The User-agent line specifies the web robot that the rule applies to, and the Disallow line specifies the pages or sections of the website that the robot should not access. Here's an example of a robots.txt file:

User-agent: *
Disallow: /admin/
Disallow: /private/

In this example, the user-agent (*) specifies that the rules apply to all web robots, and the disallow rules indicate that the /admin/ and /private/ directories should not be accessed by web robots. Note that you can also use the Allow directive to specify which pages or sections of the website should be accessed by web robots.
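To see how a crawler interprets rules like these, here is a quick check using Python's built-in urllib.robotparser; the domain and paths are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block /admin/ and /private/ for every robot.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under the blocked directories are refused; everything else is allowed.
print(parser.can_fetch("*", "https://example.com/admin/login"))   # False
print(parser.can_fetch("*", "https://example.com/blog/post-1"))   # True
```

This is the same check search engines perform before fetching a page, so it is a handy way to preview your rules before publishing them.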

How To Add Custom Robots.txt File in Blogger

A custom robots.txt file can be added to a blog or site to improve SEO. With this file, we can tell search engine crawlers which pages to crawl and which to skip. Alongside this tutorial, you should also set up Custom Robots Header Tags in Blogger, which works in a similar way.
  1. Sign in to Blogger and choose the blog you want to customize.
  2. Go to "Settings" > "Search preferences".
  3. Find "Custom robots.txt" under the "Crawlers and indexing" section. Click "Edit" on the right side of the option and select "Yes". A blank box will appear. Copy the code below and paste it into the box.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: http://YOUR BLOG NAME/feeds/posts/default?orderby=UPDATED

  4. After that, click on "Save changes". That's all!
What this robots.txt file does: the first block lets Mediapartners-Google, the AdSense crawler, access every page, because its Disallow line is empty. The second block applies to all other crawlers: it blocks the /search pages (label and archive result pages, which would otherwise create duplicate content) while allowing everything else. The Sitemap line points crawlers to the blog's post feed so new posts are discovered quickly.
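You can sanity-check these rules before saving them with Python's built-in urllib.robotparser; the blog address below is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The custom rules from the tutorial, with a placeholder blog address.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

blog = "https://yourblog.blogspot.com"

# Generic crawlers are kept out of /search pages but can read posts.
print(parser.can_fetch("*", blog + "/search/label/seo"))      # False
print(parser.can_fetch("*", blog + "/2023/01/my-post.html"))  # True

# The AdSense crawler (Mediapartners-Google) is allowed everywhere,
# since its group has an empty Disallow line.
print(parser.can_fetch("Mediapartners-Google", blog + "/search/label/seo"))  # True
```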


Using robots.txt Effectively

While the robots.txt file can be an effective way to control web robot access to your website, it's important to use it correctly so you don't accidentally block legitimate crawlers. Here are some tips for using robots.txt effectively:
  • Use the "Disallow" directive to block access to sensitive pages or directories.
  • Use the "Allow" directive to explicitly allow access to certain pages or directories.
  • Use the "*" wildcard to apply a rule to all web robots.
  • Use the "User-agent" directive to specify a particular web robot to apply a rule to.
  • Test your robots.txt file using the Google Search Console or other tools to ensure that it's working as intended.
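The Allow and Disallow directives can be combined to expose a single page inside an otherwise blocked directory. The sketch below tests this with Python's urllib.robotparser; the paths are illustrative. Note that urllib applies rules in file order (first match wins), so the more specific Allow line is placed first, whereas Google's own parser picks the most specific matching path regardless of order:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block /private/ but allow one page inside it.
# urllib checks rules top to bottom, so Allow must come before Disallow here.
rules = """\
User-agent: *
Allow: /private/faq.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/private/faq.html"))   # True
print(parser.can_fetch("*", "https://example.com/private/notes.txt"))  # False
```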

Step by Step Blogger Tutorials

  1. Blogger Vs Wordpress: Which one should you choose
  2. Benefits Of Using Blogspot As Blogging Platform
  3. Choosing A Perfect Niche For Your Blog
  4. Creating a Free Blog on Blogger
  5. Add Custom Robots.txt File in Blogger
  6. Most Important Settings You Must Have Set In Your Blogger Blog
  7. How to Create Contact Us page in Blogger
  8. How To Edit or change A Blogger Template - Complete tutorial
  9. Essential Safety Steps To Follow On Editing Blogger Template
  10. Setup MultiTab system on Blogger
  11. Add Facebook(Meta) meta tags in Blogger
  12. All In one seo pack for Blogger Blog
  13. How to make money through blogging
  14. How to get Google AdSense Approval very fast For A New Blog
  15. Earn 10$ Through blogging
  16. Best free AMP blogger Templates 2023
  17. Top 5 Premium AMP templates for blogger blog
I'm working on this series and will gradually update this list. Keep blogging! ❤

Conclusion: 

The robots.txt file is a small but powerful tool that helps website owners and webmasters control how their website is crawled and indexed by search engines and other web robots. By using it, you can keep sensitive pages or sections of your website out of search results and ask web robots not to access them, which helps protect your site's content and information. Remember that robots.txt is honored voluntarily, so use the syntax correctly and test your file to make sure it works as intended.

Founder of Tarang Inc and Gtara Tech. He writes about blogging and tech tips and tricks, and manages Tarang Inc and Gtara Tech in the best possible way. Find him on: @taragnawali | facebook | instagr…
