Did I do something stupid?

We opened our site last week and submitted our sitemap to Google, Bing, Yahoo, and Ask. While the Bing, Yahoo, and Ask bots have not even looked at our site, Googlebot has been very active, crawling around 1,000 pages a day.

Although the crawl is happening, Webmaster Tools says Google hasn’t indexed any of our URLs yet. However, a few days ago I tried a “site:www.oursite.com” search and Google showed 140 pages. That was cool, but all of the pages were subcategory pages driven by our filters; these URLs start with “/subcat?=” on our site. In my mind, those pages aren’t specific to product searches. I’d rather have the pages in our sitemap show up, since those are the SEO-friendly URLs for products and categories.

So (and here’s the stupid part) I disallowed “/subcat?=” in robots.txt, and now when I do “site:www.oursite.com” there’s nothing listed, which I guess I should have expected. Question is, should I have done this? Looking back, I’d rather have someone come to the site via a non-specific page than not come at all.
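For reference, the rule I added was roughly this (reconstructing from memory, so the exact file may differ slightly; the path pattern just mirrors the “/subcat?=” prefix mentioned above):

# Blocks every URL whose path starts with /subcat?= for all crawlers
User-agent: *
Disallow: /subcat?=

If I undo it, I assume I just remove that Disallow line and wait for Googlebot to re-fetch robots.txt.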

Should I undo what I did?

Thanks