Robots.txt Generator

The robots.txt file is important for controlling how search engines and other robots crawl your site. Since most people do not know how to write one, this tool will generate it for you.


Default – All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don’t have one)

Search Robots:

  • Google
  • Google Image
  • Google Mobile
  • MSN Search
  • Yahoo
  • Yahoo MM
  • Yahoo Blogs
  • Ask/Teoma
  • GigaBlast
  • DMOZ Checker
  • Nutch
  • Alexa/Wayback
  • Baidu
  • Naver
  • MSN PicSearch

Restricted Directories:

The path is relative to the root and must contain a trailing slash “/”.

Now create a ‘robots.txt’ file in your site’s root directory, copy the generated text above, and paste it into that file.
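As an illustration of what a generated file can contain (the exact output depends on the options you choose; the crawler name, delay value, and URLs below are placeholders, not the tool’s fixed output):

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/

    User-agent: Baiduspider
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml

Here every robot is asked to wait 10 seconds between requests and to skip /cgi-bin/, Baidu’s crawler is blocked entirely, and the Sitemap line points crawlers at the sitemap.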

How to use this tool

You can use our robots.txt generator to create a robots.txt file from scratch. The Robots.txt Generator is a free online tool that helps you build your site’s robots.txt file. If you have already built a robots.txt file and wish to improve it, you can also open and change it with the tool. Using the tool is straightforward: in the Disallow section you add the paths that you do not want search engines to index, while in the Allow section you add the paths you do want them to index. If you want to change an existing rule in the file, select the Remove Directive option.
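As a rough sketch of what those two sections translate to in the file itself (the directory and file names below are placeholders), a Disallow rule blocks a path while an Allow rule re-opens part of it for crawlers that support the Allow extension, such as Googlebot:

    User-agent: *
    Disallow: /private/
    Allow: /private/annual-report.html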

What is a robots.txt file and why is it necessary

Robots.txt is a plain-text file that can be served by any website hosted on a web server. Whenever a search engine starts crawling a site, robots.txt is the first file it looks for. The file normally sits in the root of the web directory. Once the spiders find it, they learn which links the administrator has blocked or disallowed. If you use a robots.txt file, search engines like Google and Yahoo can understand which links you don’t want indexed. In that sense, robots.txt can be seen as the inverse of a sitemap.
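For example (the domain below is a placeholder), the file is always requested from the root of the domain, and even a minimal file that blocks nothing can still point crawlers at the sitemap:

    # Served from https://www.example.com/robots.txt
    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml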

You can also generate rules for specific search engines by selecting them from the drop-down list.
If you are building your first robots.txt file and are wondering what you should disallow, you can exclude the items listed below (see the example after the list).

  • The login page of your website.
  • The contact page of your website.
  • Internal structure pages.
  • The privacy page.
  • Any media files that you don’t want to appear in search results.
  • Any image folders that you don’t want to appear in search results.
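The corresponding directives might look like this (the paths are placeholders; substitute the actual paths used on your site):

    User-agent: *
    Disallow: /login/
    Disallow: /contact/
    Disallow: /privacy/
    Disallow: /media/downloads/
    Disallow: /images/private/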

How to Optimize the Robots.txt File

You can follow the tips below to help you optimize your robots.txt file.

  • If you are unsure about the syntax of a directive, place it at the bottom of the file. Search engines read robots.txt from top to bottom, so if that syntax turns out to be incorrect, the instructions above it will already have been parsed rather than ignored.
  • With the help of the wildcard directive, you can write simple rules that disallow every URL matching a pattern (see the sketch after this list). It is supported by only a few search engines, so we suggest adding such rules at the bottom of the robots.txt file.
  • Do not use the robots.txt file to list the pages you want indexed. The purpose of the robots.txt file is to state what you do not want indexed by search engines, so you should use robots.txt only for Disallow directives.
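As a sketch of the wildcard idea (the patterns below are assumptions, not the tool’s output), * matches any sequence of characters and $ anchors the end of the URL in crawlers that support these extensions, such as Googlebot:

    User-agent: *
    # Block any URL containing a query string (assumed pattern)
    Disallow: /*?
    # Block all PDF files anywhere on the site (assumed pattern)
    Disallow: /*.pdf$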

We hope that the next time you create a robots.txt file, you will keep all of the above tips in mind.


ABOUT CODINGACE

My name is Nohman Habib and I am a web developer with over 10 years of experience, programming in Joomla, WordPress, WHMCS, vTiger and hybrid apps. I started codingace.com to share my experience and expertise with others. My main focus here is posting tutorials, primarily on Joomla development, HTML5, CSS3 and PHP.

Nohman Habib

CEO: codingace.com
