
Effect and Function of the Robots.txt File on Blogger SEO

What is a robots.txt file

The robots.txt file implements a set of web standards that govern web robot behavior and search engine indexing of your site or blog. It is placed in the root of the site, and it gives you a set of commands to limit which parts of your site any type of web bot may access; well-behaved bots consult the robots.txt file before crawling. With it you can easily block any content on your site that you do not want to show in search engines. The prime function of the robots.txt file is to tell Googlebot and other bots about your site's crawling limits. If you are using a Blogger blog, you will have no problem adding a robots.txt file to the root, and if you do not want to block any section of your blog, a very simple robots.txt file is all you need to add.
Also see how to add a robots.txt file in Blogger.

But if you want to block a page or section of your site, there is only a little to learn before you can adapt the file to your requirements: add a line of the form Disallow: followed by the path of the blocked page, then save the file and update it.
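For example, a minimal robots.txt that lets all bots crawl the site but blocks one page might look like this (the path /p/private-page.html is only an illustration, not a real page):

```
User-agent: *
Disallow: /p/private-page.html
```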

Function of the Robots.txt file

You have seen that User-agent: * means you have enabled your site to be crawled by all bots; any bot can access your site. If you do not want a page to show in search engines, you can simply add Disallow: followed by the page path to the robots.txt file and update it in the root of your site. This does not mean the page is never indexed; it just hides that page from search engines, so nobody can reach it from search results, but if some other site has linked to it, it can still be accessed. If you want Googlebot and all other bots to crawl your content, you must add a robots.txt file and make sure you have allowed your site to be crawled.
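The way a well-behaved bot applies these rules can be sketched with Python's standard urllib.robotparser module (the domain and paths below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to a robots.txt that blocks one page for all bots
rules = """User-agent: *
Disallow: /p/private-page.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The blocked page is not crawlable; everything else is
print(parser.can_fetch("Googlebot", "https://example.com/p/private-page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/p/public-page.html"))   # True
```

This is exactly the check a bot performs before fetching a URL: read the rules, then ask whether its user agent may fetch the given path.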
You will also see the Allow: / command in the file; be careful when using it. The file matters for search engine crawling because, before going to your site, a bot reads your robots.txt file and only then crawls the permitted content in the search engine. Use only one Disallow per URL path to block: if you have more than one URL to block, write a separate Disallow for each one on a new line, and never put more than one URL on a single line. If you host your site on a paid server, or anywhere other than Blogger, make sure the file is named robots.txt, not Robots.TXT or anything else. If you get this wrong, it may not work properly and you will never get a response from bots; a bot may think it is some other file, which leads to fetching errors, so save it as robots.txt.
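Putting these rules together, a robots.txt that blocks several URLs, with one Disallow per line, could look like the sketch below (the paths are illustrative; /search is the section Blogger blocks by default):

```
User-agent: *
Disallow: /search
Disallow: /p/private-page.html
Disallow: /2013/01/draft-post.html
Allow: /
```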
