Best practice robots.txt

Hi,

is there an actual best-practice robots.txt? Our store mostly serves German-speaking countries, so Google is key, of course.

I have:

User-agent: *
Disallow: /app/
Disallow: /store_closed.html
Sitemap: https://mysite.de/sitemap.xml
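
For what it's worth, I sanity-check the file with Python's urllib.robotparser before deploying it. Just a quick sketch; the URLs are only placeholders for my shop:

import urllib.robotparser

# Sketch: parse the robots.txt text above and check a few example URLs.
# "mysite.de" and the product URL are placeholders, not real paths.
robots_txt = """\
User-agent: *
Disallow: /app/
Disallow: /store_closed.html
Sitemap: https://mysite.de/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://mysite.de/app/addons/"))        # False - blocked
print(rp.can_fetch("*", "https://mysite.de/store_closed.html"))  # False - blocked
print(rp.can_fetch("*", "https://mysite.de/some-product.html"))  # True - crawlable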

But Simtech, for example, recommends:

User-agent: *
Disallow: /app/
Disallow: /design/
Disallow: /js/
Disallow: /var/
Disallow: /images/
Disallow: /images/thumbnails/
Disallow: /store_closed.html
Disallow: /comparers
Disallow: /items_per_page
Disallow: /dispatch=products.search
Disallow: /dispatch=products.newest
Disallow: /dispatch=searchanise.async
Disallow: /dispatch=profiles.update
Disallow: /dispatch=auth.recover_password
Disallow: /dispatch=products.quick_view
Disallow: /dispatch=orders.search
Disallow: /dispatch=auth
Disallow: /dispatch=wishlist
Disallow: /dispatch=profiles
Disallow: /dispatch=checkout
Disallow: /dispatch=debugger

Sitemap: https://[EXAMPLE.COM]/sitemap.xml
Host: https://[EXAMPLE.COM]

Crawl-delay: 1
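
The same kind of check works against the deployed file too (again only a sketch, with example.com standing in for the real domain); crawl_delay() also shows whether the Crawl-delay line gets picked up:

import urllib.robotparser

# Sketch: read the live robots.txt and test a few paths from the Simtech list.
# example.com is a placeholder for the real store domain.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

print(rp.can_fetch("*", "https://example.com/design/themes/"))    # expect False - blocked
print(rp.can_fetch("*", "https://example.com/category/product"))  # expect True - crawlable
print(rp.crawl_delay("*"))                                         # expect 1 if the Crawl-delay line is parsed, else None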

Best,

Bernhard