Hi,
Can anyone tell me how to stop bots crawling specific pages, i.e. certain content like the About Us and Privacy Policy pages, etc.
Tony posted something about it a while back but I can't find it.
thanks
John
[quote name='johnbol1' timestamp='1343403327' post='141638']
Hi,
Can anyone tell me how to stop bots crawling specific pages, i.e. certain content like the About Us and Privacy Policy pages, etc.
Tony posted something about it a while back but I can't find it.
thanks
John
[/quote]
Yo, Fonz,
Within your robots.txt file, add your specific pages as follows:
User-agent: *
Disallow: /store_closed.html
Disallow: /install/
So this is telling "All Bots" not to crawl my store_closed.html page, and also not to crawl the entire "install" directory.
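To apply this to the pages John asked about, just add a Disallow line for each page's URL. The paths below are examples only; check what URLs your own About Us and Privacy Policy pages actually use, since the file must match them exactly:

```
User-agent: *
Disallow: /about-us.html
Disallow: /privacy-policy.html
```

Note that robots.txt only works when it sits at the root of your domain (e.g. example.com/robots.txt).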
Diamond… thanks Struck, didn't even cross my mind, that one.
john
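If you want to sanity-check your robots.txt rules before a real bot hits them, Python's standard-library parser can evaluate them locally. A quick sketch using the rules from Struck's example (the example.com URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The same rules Struck posted above
rules = """\
User-agent: *
Disallow: /store_closed.html
Disallow: /install/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved bot ("*" matches any user agent here) should skip these:
print(rp.can_fetch("*", "https://example.com/store_closed.html"))   # False
print(rp.can_fetch("*", "https://example.com/install/index.php"))   # False
# ...but can still crawl everything else:
print(rp.can_fetch("*", "https://example.com/index.php"))           # True
```

Keep in mind this only tells you what a *compliant* crawler would do; badly behaved bots ignore robots.txt entirely.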
You can also tell a bot not to follow a link by adding a
rel="nofollow" attribute to the anchor tag. You can do a "page source view" and search for login or register to see how it's done.
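For example, a nofollow link looks like this (the href is illustrative, use your own page's URL):

```html
<!-- Bots that honour rel="nofollow" will not follow this link -->
<a href="/login" rel="nofollow">Sign in</a>
```

Note this only tells bots not to follow that particular link; it does not stop the target page being crawled if it's linked from elsewhere, which is why the robots.txt approach above is the more reliable of the two.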
[quote name='tbirnseth' timestamp='1343461458' post='141707']
You can also tell a bot not to follow a link by adding a
rel="nofollow" attribute to the anchor tag. You can do a "page source view" and search for login or register to see how it's done.
[/quote]
THAT'S what I saw you put in another thread. Thanks for the info, I can use both of these.
John