My customer hired an SEO company to evaluate her site.
The major complaint was the number of link combinations on the category pages. The spiders end up crawling thousands of links from a single category page because of the Sort By, # Per Page, Grid, Compact List, etc. options the cart offers for laying out the category page. She said all these links generated by the cart and seen by the spiders (which are really the same page) are hurting the ranking.
She asked me to change all the Sort By, # Per Page, View by Grid, View by List, etc. links from plain <a href> links to jQuery-driven links, to hide them from the spiders.
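If I understand her correctly, the change would look something like this. This is only a sketch of the idea, assuming the cart emits sort links like <a href="/category.asp?sort=price">Price</a> (the .asp path, the "sort" parameter, and the toJqueryLink helper are made-up examples, not the cart's real URLs or code):

```javascript
// Hypothetical helper that builds the non-anchor markup for one option.
// The URL moves into a data attribute, so there is no <a href> for a
// spider to follow in the raw HTML.
function toJqueryLink(href, label) {
  return '<span class="js-nav" data-href="' + href + '">' + label + '</span>';
}

// In the page, jQuery would wire the click back up (browser-only):
// $(document).on('click', '.js-nav', function () {
//   window.location = $(this).data('href');
// });
```

Whether this really hides the links is part of my question, since I've read that some spiders can execute JavaScript.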
Does this make sense?
Has anyone done this?
By the way, I have Disallow: /*? in the robots.txt file, and I thought this told spiders to ignore links with a ? in them. But she showed me that the spiders will still index these links; they just won't count them toward the rankings because of the robots.txt rule. But since they are still spidered, they count against us as duplicate pages and content. She said to use jQuery links to hide all these links from the search engines.
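For reference, the relevant part of my robots.txt looks like this (the User-agent: * line is just how I assume it's normally written; the Disallow rule is the one I mentioned):

```text
User-agent: *
Disallow: /*?
```

My understanding was that the /*? wildcard would keep spiders off every query-string URL, but apparently that's not how they treat it.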