Friday, 22 March 2013

Using separate robots.txt files for HTTP and HTTPS

If you want to block some content on the site specifically for the SSL (HTTPS)
version, here is the solution:

Create two files: robots.txt for the general instructions, and robots_ssl.txt for the site running under SSL.
Suppose you want to block all pages on HTTPS; then your robots_ssl.txt would look like this:

User-agent: *
Disallow: /
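
For completeness, the general robots.txt (the one served over plain HTTP) is not shown above. As a minimal sketch, a permissive version that allows crawlers everywhere could look like this (an assumption here, not part of the original setup; adjust to your own crawl rules):

User-agent: *
Disallow: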


Now write the magic code in .htaccess that determines which robots file is served, based on whether the request came in over HTTP or HTTPS:


Options +FollowSymLinks
RewriteEngine On
# Serve robots_ssl.txt in place of robots.txt on the SSL port
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
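
If your SSL traffic does not run on port 443, or you would rather not hard-code the port number, a variant that tests Apache's HTTPS server variable should also work. This is a sketch assuming stock Apache with mod_rewrite and mod_ssl; if SSL is terminated by a proxy in front of Apache, neither test will see the original protocol without extra configuration:

RewriteEngine On
# Match any request that arrived over SSL, regardless of port
RewriteCond %{HTTPS} =on
RewriteRule ^robots\.txt$ robots_ssl.txt [L]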


So now HTTP and HTTPS each serve a separate robots.txt:
i.e. http://www.yoursite.com/robots.txt serves robots.txt
https://www.yoursite.com/robots.txt serves robots_ssl.txt
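
A quick way to verify the rewrite is to fetch both URLs and compare the output (assuming curl is available; www.yoursite.com is a placeholder for your own domain):

curl http://www.yoursite.com/robots.txt
curl https://www.yoursite.com/robots.txt

The first request should return your general rules, while the second should return the Disallow-everything rules from robots_ssl.txt.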
