How To Add A Robots.txt File In A Blogger Blog


Want to know how to add a robots.txt file to Blogger? This post is what you need. Here, I explain how to add a custom robots.txt file to a Blogger (Blogspot) blog. Before we start, though, let's look at what the robots.txt file is and why you should add a custom robots.txt file in Blogger.


What Is A Robots.txt File?


Robots.txt is a plain text file that a website owner uses to give search engine bots (crawlers) specific instructions. It tells the bots which parts of the site they may and may not access when indexing it. Let me give you some more background. Remember, when web bots visit your site, they crawl it before adding it to an index.


First, they look for the robots.txt file to check for any crawling restrictions, so search engine crawlers know which pages to index and which to skip. Used correctly, this is a great way to improve your blog's SEO and bring in more visitors. With that in mind, let's dig deeper and learn how to add a custom robots.txt file to a Blogspot blog.


Why Should You Add A Custom Robots.txt File In Blogger?


By adding a custom robots.txt file to Blogger, you can keep web crawlers from indexing certain pages on your site, such as your blog's label (tag) pages, a test page, or other pages that don't need to be crawled.


It's also good for SEO, which helps you get more visits from search engines. Always remember that search crawlers check the robots.txt file before crawling any web page. Every site hosted on Blogger comes with a robots.txt file set up by default.


How To Add A Robots.txt File To Blogger Blogspot?


Follow the steps below to add a custom robots.txt file to your Blogger Blogspot blog. First, copy the code shown here, replacing the placeholder address with your own blog's URL:


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated


  • Step 1: Open your Blogger blog.

  • Step 2: Go to Settings > Search Preferences > Crawlers and indexing > Custom robots.txt, then click Edit.

  • Step 3: Enable custom robots.txt by choosing "Yes."

  • Step 4: Paste your robots.txt code into the box provided.

  • Step 5: Click the "Save Changes" button.

  • Step 6: You're all set!
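
Once saved, you can quickly confirm that your custom rules are actually being served. Below is a minimal verification sketch in Python (standard library only); the blog address is a placeholder, so swap in your own URL.

# Fetch the live robots.txt and print it, so you can confirm that the
# rules you just saved are the ones Blogger is serving.
# "yourblogurl" is a placeholder; replace it with your blog's address.
import urllib.request

blog_url = "https://yourblogurl.blogspot.com"

with urllib.request.urlopen(blog_url + "/robots.txt") as response:
    print(response.read().decode("utf-8"))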


All About The Robots.txt File


Every site hosted on Blogger has its own robots.txt file set up by default, and it looks something like this:


User-agent: Mediapartners-Google
Disallow:


User-agent: * 

Disallow: /search 

Disallow: /b 

Allow: /


Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated


Let's Understand The Robots.txt File Completely:


This code is made up of several parts. Let's look at each of them, and then we'll learn how to find the robots.txt file on a Blogger site.


User-agent: Mediapartners-Google


This rule is for the Google AdSense robots, and it helps them serve better ads on your blog. Whether or not you use Google AdSense on your blog, simply leave it as it is.


User-agent: *


This rule applies to all other crawlers, marked with an asterisk (*). By default, our blog's label (tag) page links are restricted, which means web crawlers will not index our label page links because of the code below.


Disallow: /search


This means that links containing "search" just after the domain name will be ignored. For example, a label page named SEO would have a URL like https://www.yourblogurl.blogspot.com/search/label/SEO, which falls under /search and therefore will not be crawled.


And if we remove "Disallow: /search" from the code above, crawlers will be able to crawl and index all of our blog's posts and pages, including the label pages.

Here, "Allow: /" refers to the homepage, which means web crawlers can crawl and index our blog's homepage.
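
To see how these rules behave, here is a minimal sketch using Python's built-in urllib.robotparser module; the post URL and label name are hypothetical examples.

# Parse the default Blogger rules and check which paths a crawler may fetch.
import urllib.robotparser

default_rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.modified()  # mark the rules as loaded so can_fetch() will evaluate them
rp.parse(default_rules.splitlines())

# Ordinary pages and posts are open to all crawlers ("*")...
print(rp.can_fetch("*", "/"))                       # True
print(rp.can_fetch("*", "/2015/04/post-url.html"))  # True

# ...but label (tag) pages under /search are blocked for them,
print(rp.can_fetch("*", "/search/label/SEO"))       # False

# while the AdSense crawler has no restrictions at all.
print(rp.can_fetch("Mediapartners-Google", "/search/label/SEO"))  # True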


Disallow A Particular Post


Let's say we want to keep a particular post from being indexed. To do that, we can add the line below to the file.


Disallow: /yyyy/mm/post-url.html


Here, yyyy and mm stand for the year and month the post was published. For example, if we published a blog post in April 2015, we would use the format below.


Disallow: /2015/04/post-url.html


To make this easier, copy the post's URL and remove the blog address from the beginning.


Disallow A Particular Page


If we want to block a particular page, we can use the same method: copy the page's URL and remove the blog address from it. It should look something like this:


Disallow: /p/page-url.html
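
For context, here is how those rules sit in the complete file. A custom robots.txt that hides one hypothetical post and one hypothetical page would look something like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Disallow: /2015/04/post-url.html
Disallow: /p/page-url.html
Allow: /

Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated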


Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated


This line refers to our blog's sitemap. By adding the sitemap address here, we simply help crawlers find and index the blog faster.

Whenever crawlers scan your robots.txt file, they find a path to your sitemap, which lists all of your published posts.


If you put your sitemap in the robots.txt file, web crawlers can quickly find and read your pages and articles. You can also use the blog's atom feed as the sitemap; the line below covers up to the first 500 posts:


Sitemap: https://yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500


If your blog has more than 500 posts (but fewer than 1,000), use the following two lines instead:


Sitemap: https://yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500


Sitemap: https://yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
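
If you're not sure how many lines you need, the pattern is one sitemap line per block of 500 posts. Here is a small, hypothetical Python helper that prints the lines for a given post count; it simply generalizes the two lines above and assumes the start-index pattern continues in steps of 500. The blog address and post count are placeholders.

# Print one sitemap line per block of 500 posts (start-index 1, 501, 1001, ...).
def sitemap_lines(blog_url, total_posts, page_size=500):
    lines = []
    for start in range(1, max(total_posts, 1) + 1, page_size):
        lines.append(
            f"Sitemap: {blog_url}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
    return lines

# For example, a blog with 1,200 posts needs three sitemap lines.
print("\n".join(sitemap_lines("https://yourblogurl.blogspot.com", 1200)))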


How To Find/Create Your Robots.txt File?


In a web browser, you can find your blog's robots.txt file by adding /robots.txt to the end of the blog's URL. Finally, I hope this post helps you get more organic traffic to your website. If you liked it, share it with a friend who wants to boost their site traffic by adding a robots.txt file to Blogger.




How To Look At Your Robots.txt File?


You can view this file on your blog by adding /robots.txt to the end of your blog's URL, as in the example below.


http://www.naveengfx.com/robots.txt


When you open the robots.txt URL, you'll see all the rules currently in use in your robots.txt file.


Conclusion


Thanks for reading this guide. If you found it useful, please help me spread the word by sharing the article on your social media. Happy blogging!

