Google Crawling Features Hash Page 2

I am getting bots crawling the features pages, which will create many duplicate pages in Webmaster Tools. Is there a way to stop it? I added

Disallow: /?features_hash

in robots.txt, but they are still being crawled.

06 May 2015, 20:42

/tee-shirts/page-2/?features_hash=V7599.V7592 Tee Shirts, Custom printed, Embroidered, delivered fast - page 2 06 May 2015, 20:39

/tee-shirts/?features_hash=V7599.V7592.V7596 Tee Shirts, Custom printed, Embroidered, delivered fast 06 May 2015, 20:34

/tee-shirts/?features_hash=V7599.V7592.V7598 Tee Shirts, Custom printed, Embroidered, delivered fast 06 May 2015, 20:30

/tee-shirts/?features_hash=V7599.V7592.V7594 Tee Shirts, Cust

Please try:



Disallow: /*?features_hash=
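For context, here is a minimal robots.txt sketch using that rule (the User-agent line is an assumption; adjust to your own file). The `*` wildcard is a Google extension to the original robots.txt standard; the plain `Disallow: /?features_hash` from the first post only matches URLs that literally begin with `/?features_hash`, which is why URLs like /tee-shirts/?features_hash=… kept being crawled.

```
User-agent: *
# Block any URL whose query string contains the features_hash parameter
Disallow: /*?features_hash=
```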

Thanks, I have also added it in the Webmaster Tools crawl parameters.



John

Check which URL parameters Google is watching in Webmaster Tools, but the solution should work.

Best wishes

Had the same issue: Googlebot just crashes my site, indexing more than 20,000 pages, while I have 270 products and only a few pages.



I added the URL parameters and, in addition, added Disallow: /*?features_hash= to my robots.txt.



Thanks for the tip.

Basically, the robots.txt instruction is enough. Google will drop the URLs…

Added it to robots.txt and they are still being crawled. I have messaged WM and will get back here with the response.

Wait a couple of days…

Have you updated robots.txt in Webmaster Tools?


Hi again,

so now I'm getting loads of these, all with different page numbers:

variant-152.html?page=166

Can I just add

Disallow: /variant-

and will that stop all these pages, or should it be

Disallow: /variant-/
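The difference between the two rules can be checked with a small matcher that mimics Google's prefix-plus-wildcard semantics (a sketch for illustration, not Google's actual implementation):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Return True if a Disallow rule blocks the given URL path.

    Mimics Google-style robots.txt matching: the rule is a path
    prefix, '*' matches any run of characters, and a trailing '$'
    anchors the end of the URL.
    """
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# "Disallow: /variant-" blocks every URL starting with /variant-
print(rule_matches("/variant-", "/variant-152.html?page=166"))   # True
# "Disallow: /variant-/" requires a slash right after "variant-",
# which these URLs do not have, so it blocks nothing here
print(rule_matches("/variant-/", "/variant-152.html?page=166"))  # False
```

So the version without the trailing slash is the one that matches the variant pages.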

I am also seeing these being crawled. How do I stop them?

footwear-page-13.html?subcats=Y

page-3.html

variant-152.html?currency=USD&page=151
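Rules for those URLs could be sketched like this, with the parameter names taken from the examples above (an assumption based on this thread only). Note that page-3.html has no query string, so a parameter rule will not catch it; if those paginated pages should stay reachable, a rel="canonical" tag pointing at the main listing page may be a better fit than blocking them.

```
User-agent: *
# Parameterised duplicates seen in the crawl log
Disallow: /*?subcats=
Disallow: /*?currency=
Disallow: /*?page=
Disallow: /*&page=
```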