Blocking Dynamic URLs From Google

We have a Moz Pro account, and it has flagged a few serious SEO issues with the dynamic pages on our site, such as the testimonials page. Here are the issues it reported:

1) Duplicate Page Content
Code and content on this page looks similar or identical to code and content on other pages on your site. Search engines may not know which pages are best to include in their index and rankings. Common fixes for this issue include 301 redirects, using the rel=canonical tag, and using the Parameter handling tool in Google Webmaster Central.

2) Duplicate Page Title
Make sure to use unique titles for various pages on your site to ensure that they describe each page individually and don't compete with each other for keyword relevance.

3) Missing Meta Description Tag
Meta description tags, while not important to search engine rankings, are extremely important in gaining user click-through from search engine result pages (SERPs).

Therefore we're considering blocking dynamic URLs from search engine bots with the robots.txt file. What do you think about these issues Moz has pointed out, and would you agree that blocking the dynamic URLs via robots.txt is the best method? For example:

User-agent: *
Disallow: /*?
Disallow: /*index.php?
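To sanity-check which URLs those two rules would actually match, the wildcard matching Google applies (prefix match on the path plus query string, with `*` matching any run of characters and a trailing `$` anchoring the end) can be emulated in a few lines of Python. This is only a sketch of the matching logic for testing purposes, not Google's implementation:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Google-style robots.txt rule matching: the rule is a prefix match
    against the URL path+query, with '*' matching any character run and
    a trailing '$' anchoring the end of the URL."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Build a regex: escape literal characters, expand '*' to '.*'
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# The two Disallow rules from the robots.txt snippet above
rules = ["/*?", "/*index.php?"]
url = "/index.php?dispatch=discussion.view&thread_id=6&page=2&selected_section=discussion"

print([robots_match(rule, url) for rule in rules])  # [True, True]
print(robots_match("/*?", "/testimonials"))          # False: no query string
```

Both rules match the sample discussion URL, and the second rule is redundant here, since `/*?` already covers every URL containing a query string.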

Are there any sample links that you'd like to block?

1) index.php?dispatch=discussion.view&thread_id=6&page=2&selected_section=discussion

2) ?selected_section=discussion

Add the following line to the robots.txt file:

Disallow: /?selected_section=
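One caveat worth checking: a robots.txt rule without a wildcard is a plain prefix match on the path, so `Disallow: /?selected_section=` only blocks that query string on the site root, not on deeper pages. A quick check (the deeper path is a hypothetical example):

```python
pattern = "/?selected_section="

root_url = "/?selected_section=discussion"           # sample link 2 (site root)
deep_url = "/some-page?selected_section=discussion"  # hypothetical deeper page

# robots.txt rules without wildcards are simple prefix matches
print(root_url.startswith(pattern))  # True: blocked by this rule
print(deep_url.startswith(pattern))  # False: not blocked by this rule
```

If the same parameter can appear on other pages, the earlier `Disallow: /*?` rule would already cover those; otherwise a wildcard form such as `Disallow: /*?selected_section=` would be needed.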