If you are a newbie blogger, you may not know what robots.txt is or how to make one. Don't worry, it's okay, because every experienced blogger was once a beginner. It's not too late, because today I am going to tell you how to create a perfect robots.txt file for better SEO.
What is a robots.txt file
The robots.txt file is also known as the robots exclusion protocol. It is a tiny text file that tells web robots which pages to crawl and which pages to ignore.
Why a perfect robots.txt file is important
As I already told you, the robots.txt file tells web robots which pages to crawl and which pages to ignore.
One question might come to your mind: why should a web robot ignore a few pages? Let me explain it technically.
Well, some articles come under two labels or two categories. In this scenario, you file one blog post under two different labels: you consider it a single article, but it is distributed across two label pages, so Google can treat it as duplicate content. That hurts the SEO of your page and lowers your website's score.
But if you create the robots.txt file perfectly, this duplicate content will not be crawled twice, because you block the crawling of category, label, and tag pages.
All you need is a perfect robots.txt file.
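To see how this blocking works, here is a minimal sketch using Python's built-in urllib.robotparser module. The two label URLs are hypothetical stand-ins for a single post filed under two labels, which Blogger exposes at two different /search/label/ addresses; the Disallow: /search rule shown in the methods below keeps crawlers away from both copies:

from urllib.robotparser import RobotFileParser

# One post filed under two labels is reachable at two different
# crawlable URLs (the label names here are hypothetical examples):
label_urls = [
    "https://www.technicalbishnuji.com/search/label/SEO",
    "https://www.technicalbishnuji.com/search/label/Blogging",
]

# The blocking rule used in the robots.txt files shown below.
rules = """\
User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

for url in label_urls:
    # Both duplicate copies are off-limits to well-behaved crawlers.
    print(url, "->", rp.can_fetch("Googlebot", url))  # prints: ... -> False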
How to create a robots.txt file
There are many ways of making a robots.txt file, so today I am going to tell you two different methods. You just have to replace https://www.technicalbishnuji.com with your website URL. That's all.
Method 1:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.technicalbishnuji.com/atom.xml?redirect=false&start-index=1&max-results=500
Method 2:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.technicalbishnuji.com/sitemap.xml
Most Important Things:
The path after Disallow tells web robots not to crawl that particular section; for example, Disallow: /search blocks every URL that begins with /search, which covers the label and search pages.
The User-agent: Mediapartners-Google section lets Google's AdSense crawler read all your pages, so it can serve better ads if you are using Google advertisements.
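If you want to double-check these rules before uploading them, you can test them locally with Python's built-in urllib.robotparser module. This is just a quick sketch, and the post URL is a hypothetical example:

from urllib.robotparser import RobotFileParser

# The Method 2 rules, pasted as plain text.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Normal crawlers must skip search and label pages...
print(rp.can_fetch("Googlebot", "https://www.technicalbishnuji.com/search/label/SEO"))  # False
# ...but may crawl regular posts (hypothetical post URL).
print(rp.can_fetch("Googlebot", "https://www.technicalbishnuji.com/2020/01/my-post.html"))  # True
# The AdSense crawler is allowed everywhere because its Disallow line is empty.
print(rp.can_fetch("Mediapartners-Google", "https://www.technicalbishnuji.com/search/label/SEO"))  # True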
Conclusion:
You don't need any website to generate a robots.txt file. Simply copy the code above, replace the URL with your own website's URL, and you are done.