How To Optimize robots.txt File For Security And SEO


The SEO industry is evolving rapidly, and SEO techniques depend on many factors. Some of these factors were important in the past, are important in the present, and will remain essential in the future, so they must be taken into consideration. One of them is crawling, which depends largely on the robots.txt file, so you must optimize your robots.txt file properly, both for security and, of course, for SEO.

What is a robots.txt file?

robots.txt is an ASCII text file containing instructions for the search engine spiders/crawlers that access your website's pages and files in order to index them.


How to create and optimize a robots.txt file?

A robots.txt file can be created with any plain-text editor, such as Notepad. However, experts recommend creating it in an adapted environment and in Unix format. The file consists of records, and each record is divided into two fields: a line naming the client application (User-agent) and one or more Disallow lines. A Disallow directive names a URL path that crawlers are forbidden to read and index. According to the robots.txt guidelines, you should write "User-agent", not "User-Agent". If the robots.txt file is left empty, robots will understand that the webmaster allows them to index every page available on the site.

If you want to hide some information from spiders/crawlers, then use the following syntax:

User-agent: Googlebot
Disallow: /Private
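You can check how a crawler interprets these rules with Python's standard `urllib.robotparser` module. This is a minimal sketch that feeds the example rules above directly to the parser (normally it would load your live robots.txt file):

```python
from urllib.robotparser import RobotFileParser

# Parse the example rules shown above.
rp = RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /Private",
])

# Googlebot is blocked from the /Private path...
print(rp.can_fetch("Googlebot", "/Private/secret.html"))  # False

# ...but a crawler with no matching record is still allowed.
print(rp.can_fetch("Bingbot", "/Private/secret.html"))    # True
```

Note that paths are case-sensitive here: `/private/secret.html` would not match the `Disallow: /Private` rule.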

If you want to hide a specific page, then use the following code:

User-agent: Googlebot
Disallow: /polls.html

If you want to hide several selected pages, then use the following code:

User-agent: Googlebot
Disallow: /users.html
Disallow: /admin

If you want to set different rules for different search engines, then use the following code:

User-agent: Googlebot
Disallow: /admin/
User-agent: *
Disallow: /admin/news/
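As a sanity check, Python's built-in `urllib.robotparser` can confirm which record applies to which crawler. A minimal sketch using the per-engine rules above (the bot names and URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /admin/",
    "",
    "User-agent: *",
    "Disallow: /admin/news/",
])

# Googlebot matches its own record: the whole /admin/ tree is blocked.
print(rp.can_fetch("Googlebot", "/admin/settings.html"))  # False

# Any other bot falls back to the "*" record: only /admin/news/ is blocked.
print(rp.can_fetch("SomeBot", "/admin/news/today.html"))  # False
print(rp.can_fetch("SomeBot", "/admin/settings.html"))    # True
```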

Insert your sitemap URL at the bottom of the robots.txt file, e.g.:

Sitemap: https://example.com/sitemap.xml

Once you have created the robots.txt file, place it in your site's root directory.

Note that you can create only one robots.txt file per domain name.
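Putting the pieces together, a complete robots.txt file might look like this (the paths and the sitemap URL are placeholders for illustration, not values from this article):

```
User-agent: Googlebot
Disallow: /admin/

User-agent: *
Disallow: /admin/news/

Sitemap: https://example.com/sitemap.xml
```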


Important instructions to optimize the robots.txt file:


1. The robots.txt filename should always be written in lowercase (small letters). Writing Robots.txt or ROBOTS.TXT is wrong.
2. In the "User-agent" field, only one special symbol can be used: "*", which means "all robots".
3. An empty robots.txt file implies that any robot may index any page.
4. One domain should not have more than one robots.txt file.
5. Only site owners with administrative rights can create a robots.txt file.
6. Each command in the file should be written on a separate line. The number of lines is not limited.
7. All directives in the robots.txt file should be written in lowercase letters.
8. The absence of a robots.txt file causes search engines to receive a 404 error when they request it.
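Point 3 above is easy to verify with Python's standard `urllib.robotparser`: an empty rule set allows everything. A small sketch (the URLs are arbitrary examples):

```python
from urllib.robotparser import RobotFileParser

# An empty robots.txt: no records at all.
rp = RobotFileParser()
rp.parse([])

# With no rules, every crawler may fetch every URL.
print(rp.can_fetch("Googlebot", "/anything.html"))   # True
print(rp.can_fetch("SomeBot", "/admin/secret.html")) # True
```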