The robots.txt File Functions And Importance For Webmasters

Webmasters create a robots.txt file, also known as the robots exclusion protocol file, to instruct the robots (crawlers) of search engines such as Google, Bing, and Yahoo which pages of their sites to index and which not to index.

How robots.txt Works

When a robot visits your website, it first checks the robots.txt file. If the file disallows that robot from visiting certain files or folders on the server, the robot will skip that part of your server.

However, some robots ignore robots.txt entirely, such as the email harvesters spammers use to collect email addresses.

Where to place robots.txt

It must be placed in the top-level (root) directory of your server. Below are examples of common instructions for robots.

If you want to block all robots from indexing any content of your site:

User-agent: *
Disallow: /
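As a quick sanity check, Python's standard urllib.robotparser shows how a compliant crawler interprets these two lines (the bot names and URLs below are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse the block-all rules shown above
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every path is disallowed for every user agent
print(rp.can_fetch("Googlebot", "https://example.com/any-page.html"))  # False
print(rp.can_fetch("SomeOtherBot", "https://example.com/"))            # False
```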

If you want to block only a specific robot from indexing a specific folder or webpage on your server:

For a folder:

User-agent: Bingbot
Disallow: /no-bing/

For a webpage:

User-agent: Bingbot
Disallow: /no-Bingbot/blocked-page.html
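To confirm that such a rule affects only the named robot, the same urllib.robotparser sketch can be run against it. The paths and bot names mirror the example above; note the page path carries no trailing slash, since Disallow values are matched as path prefixes:

```python
from urllib.robotparser import RobotFileParser

# Rules blocking only Bingbot from one page
rp = RobotFileParser()
rp.parse(["User-agent: Bingbot",
          "Disallow: /no-Bingbot/blocked-page.html"])

# Bingbot is blocked from that page...
print(rp.can_fetch("Bingbot", "https://example.com/no-Bingbot/blocked-page.html"))    # False
# ...but other robots, and other pages, remain allowed
print(rp.can_fetch("Googlebot", "https://example.com/no-Bingbot/blocked-page.html"))  # True
print(rp.can_fetch("Bingbot", "https://example.com/other-page.html"))                 # True
```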

To allow all robots:

If you want to allow all robots to index all pages of your website, create a robots.txt file and leave it empty (equivalently, use User-agent: * followed by an empty Disallow: line).
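With urllib.robotparser, an empty rule set behaves the same way (a minimal sketch; the bot name and URL are placeholders):

```python
from urllib.robotparser import RobotFileParser

# An empty robots.txt imposes no restrictions
rp = RobotFileParser()
rp.parse([])

# Every robot may fetch every page
print(rp.can_fetch("Googlebot", "https://example.com/page.html"))  # True
```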
