
How to Make a Simple, SEO-Friendly Robots.txt File



What Is Robots.txt?

In simple language, robots.txt is like a door to your website for search engine crawler bots. By controlling a website's robots.txt, the owner can control which parts of the site crawler bots are allowed to crawl, and therefore which pages and posts are indexed by a particular search engine and which are not. Proper use of robots.txt can boost your website's indexing and help bring lots of traffic to your site, whereas a badly written robots.txt can block important pages from being indexed and cost you that traffic.
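For example, a very small robots.txt might look like the two lines below. This is only a sketch, and the /private directory is a made-up example: it tells every crawler bot that it may crawl the whole site except that one folder.

User-agent: *
Disallow: /private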


Creating a Robots.txt File for Your Website

Making a robots.txt file is very easy and simple; the main complexity comes from the wildcards and rules used inside it. To create a robots.txt, open a new Notepad text document and enter Code 1, Code 2, or Code 3 (as given below). When you are done with the rules, save the text document as robots.txt and upload it to the main (root) directory of your website, directly inside public_html for your domain. Once you have uploaded robots.txt, make sure it is properly in place by opening your website as www.example.com/robots.txt.
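As a rough sketch of where the file should end up on a typical host (index.html here is just a placeholder for your existing site files):

public_html/
    index.html     <- your existing pages
    robots.txt     <- upload the file here, at the site root

After uploading, opening www.example.com/robots.txt in a browser should show exactly the rules you typed.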

What Is the Use of a Robots.txt File?

The basic use of robots.txt is to control search engine crawler bots' access to a website's directories so that they are indexed properly in search results. Using the proper rules and wildcards in robots.txt, a webmaster can control the search engine crawler bots; in other words, robots.txt is like the key that crawler bots use to access any website for indexing. The most common directives are listed below, with a combined example after the list.

User-agent: *   -   All search engine crawler bots are allowed to crawl the website or blog.
User-agent: Googlebot   -   Only Google's search engine crawler bot is allowed to crawl the website.
Disallow: /   -   Search engine crawler bots will not be able to crawl the website at all.
Disallow: /search   -   Crawler bots may crawl the website except the /search directory.
Allow: /search   -   Crawler bots are allowed to crawl the /search directory.
Allow: /   -   Crawler bots are able to crawl the whole website (all website directories).
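As a sketch of how these directives combine in one file, the example below lets Googlebot crawl everything while keeping all other crawler bots out of two example folders. The /search and /admin paths here are only illustrative; replace them with the directories you actually want to protect.

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /search
Disallow: /admin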
         

If you want all of your website to be indexed by all search engines, just copy Code 2 into your robots.txt file. By doing so, search engine crawler bots will access all the content of your website and index everything from it.


Examples of Robots.txt

Code 1: Block all search engine robots from the entire website. Because of the rule Disallow: /, no pages or directories will be indexed by search engines.


User-agent: *
Disallow: /


Code 2: To give all search engine robots complete access to your website, copy the code below into your robots.txt file.

User-agent: *
Disallow:

Code 3: To exclude a particular folder of your website using robots.txt, just add a Disallow: line with that folder's path. For example, Disallow: /search blocks search engines from accessing the search directory of your website, and Disallow: /Admin blocks them from accessing the Admin directory.


User-agent: *
Disallow: /Admin
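To block more than one folder, you can stack several Disallow: lines under the same User-agent: group. As a sketch using the two example directories mentioned above:

User-agent: *
Disallow: /search
Disallow: /Admin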