Robots.txt Generator


Default - All Robots are:  
    
Crawl-Delay:
    
Sitemap: (leave blank if you don't have one) 
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must end with a trailing slash "/"

Now, create a 'robots.txt' file in your root directory. Copy the text above and paste it into the file.
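
For reference, a file generated from the options above might look something like the sketch below; the crawl delay value, sitemap URL, and restricted directory are placeholders, not recommendations.

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Sitemap: https://www.example.com/sitemap.xml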


About Robots.txt Generator

Robots.txt Generator (by Ann Smarty) is an easy-to-use tool for creating the robots.txt directives appropriate for your site: easily copy and edit robots.txt files from other sites or create your own from scratch. When search engine spiders crawl a website, they usually start by requesting the robots.txt file at the root of the domain. The crawler reads the file's directives to learn which directories and files are blocked from crawling. Those blocked paths can be defined with the robots.txt generator; in a sense, they are the opposite of a website's sitemap, which lists the pages to include when a search engine scans the site.
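
As a minimal illustration (the directory names here are hypothetical), a robots.txt that blocks two directories for all crawlers can be as simple as:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/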

Our Robots.txt Generator
Creating an edited or new robots.txt file for a website is easy when you use a robots.txt generator. To pull in an existing file and pre-fill the generator tool, type or copy/paste the root domain URL into the text box provided and select 'Download'. To customize the generated robots.txt file, use the 'Allow' and 'Deny' functions. Please note that the default setting is 'Allow'. To add a new directive to your list, click 'Add Directive'. To delete or modify an existing directive, select 'Delete Directive', then create a new one using the 'Add Directive' function.
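
For example, assuming a hypothetical /checkout/ directory, a 'Deny' entry is written out as a Disallow line, while the default 'Allow' leaves the rest of the site open to crawlers:

    User-agent: *
    Disallow: /checkout/
    Allow: /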

Customize User Agent Directives
Our robots.txt generator lets you specify directives for several different search engines, including Google. To set alternative directives for one or more specific search engine crawlers, select the 'User Agent' check box, which shows (*) by default, and specify the bot. Select 'Add Directive' to add the custom section to the list, with the generic directive carried over as part of the new custom directive. To turn a generic disallow directive into an allow directive for a custom user agent, create a new allow directive covering the specific content for that user agent; the corresponding disallow directive is then removed for that custom user agent.
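
As a sketch (the image directory and the bot name are only examples), a generic disallow can be paired with an allow for a specific user agent such as Googlebot-Image:

    User-agent: *
    Disallow: /images/

    User-agent: Googlebot-Image
    Allow: /images/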

To add a link to an XML sitemap file, copy/paste or type the full URL of the sitemap into the text box provided, then select 'Update' to add this directive to the robots.txt list. When finished, select 'Export' to save the new robots.txt file, then use FTP to upload it to the root of the website's domain. Once uploaded, the robots.txt file will tell Google and other search engines which pages and directories of the website to show, and which not to show, when a search engine user enters a query. The Ultimate Guide to Blocking Your Content in Search is an excellent resource for those who want to learn more about robots.txt files and directives.
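
For reference, the sitemap directive produced this way is a single line containing the full URL of the sitemap file; the URL below is a placeholder:

    Sitemap: https://www.example.com/sitemap.xml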