Exclude pages from search engines

Hello



I have a few pages within my CS installation and I do not want them to be indexed by any search engine. Is there any way to block them from being accessible to robots? Should I use robots.txt or .htaccess?



Thank you.

[quote name='Noman']Hello

I have a few pages within my CS installation and I do not want them to be indexed by any search engine. Is there any way to block them from being accessible to robots? Should I use robots.txt or .htaccess?

Thank you.[/quote]



The .htaccess will block access to the page, but it won't stop it from being crawled if the URL (link) appears somewhere on the web or on your site.
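If you do want to block direct access with .htaccess as well, a minimal sketch would be something like this (this assumes Apache 2.4 and a made-up file name, so adjust it to your own setup):

# Deny direct access to one specific page (Apache 2.4 syntax, example file name)
<Files "private-page.html">
Require all denied
</Files>

For keeping pages out of the search results themselves, robots.txt is the usual place, and the format is: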



User-agent: *
Disallow: /yoursiteurl1 … [path after the root]
Disallow: /yoursiteurl2 … [path after the root]



etc. Don't use wildcards like /somefolder/ in the paths, as Google doesn't accept that.
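Putting it together, a complete robots.txt at the root of your site could look something like this (the two page paths are just placeholders, not real paths from your installation):

User-agent: *
Disallow: /pages/private-page-1.html
Disallow: /pages/private-page-2.html

Just remember that robots.txt is only a request to well-behaved crawlers and doesn't protect the pages themselves, which is why it's often combined with the .htaccess restriction above when the content really has to stay private.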

Indy



I’m in love :wink:



Thank you.