Confused as to the problem.
We can have multiple robots.txt files from the DB only in the Multi-Storefronts edition of CS-Cart, not in the Multi-Vendor editions. Our store is MV Plus; technically we can have only ONE storefront.
BUT, there are addons allowing vendors to have their own frontend on different subdomains. Crippled addons, because all they do is add a subdomain in cPanel and redirect it to the vendor store in the MVE, mod_rewritten or not. When you click on a product, you are transported back to the MVE.
We have developed a method to generate different frontends for the different languages; as long as you don't change the language, you stay on one domain.
Here comes the problem: the language-specific domains are essentially parked domains and share one document root with the main domain.
In order to do what you suggest, CS-Cart would have to develop a Multi-Vendor Multi-Store Multi-Frontend Edition...
Judging by CS-Cart's multiplicative pricing logic, this edition would probably cost $10,000.
My solution cost me $100.
What remains to be done is creating a virtual robots.txt for each domain. When I manually create a robots.txt and put it in the root, it is valid for all domains, because there is no way to give contradictory or domain-specific instructions, e.g.:
User-agent: *
Disallow: /*sl=en$
Disallow: /sachgebiet/
Disallow: /предмет/
Host: https://www.mysite.en/

User-agent: *
Disallow: /*sl=de$
Disallow: /subject/
Disallow: /предмет/
Host: https://www.mysite.de/

User-agent: *
Disallow: /*sl=ru$
Disallow: /subject/
Disallow: /sachgebiet/
Host: https://www.mysite.ru/
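One workaround that doesn't depend on CS-Cart at all: since the parked domains share one document root, the web server itself can serve a different physical file depending on the Host header. A sketch for an Apache .htaccess, assuming the domain names from the example and hypothetical files robots_en.txt, robots_de.txt, robots_ru.txt placed next to the regular robots.txt (these rules would have to come before CS-Cart's own rewrite rules):

```apache
RewriteEngine On

# Serve a per-domain robots.txt from the shared document root.
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.en$ [NC]
RewriteRule ^robots\.txt$ robots_en.txt [L]

RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.de$ [NC]
RewriteRule ^robots\.txt$ robots_de.txt [L]

RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.ru$ [NC]
RewriteRule ^robots\.txt$ robots_ru.txt [L]
```

Each crawler then sees only the rules meant for the domain it is crawling, and no Host directive tricks are needed.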
Or is this possible? We don't know the logic of the Google or Yandex bots. We only know that robots.txt is valid for the domain in whose root it is placed... Will the Host and Sitemap directives really differentiate the domains?
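For what it's worth, at least Python's standard robots.txt parser ignores directives it doesn't know (Host among them) and matches Disallow rules purely by path, so the host part of the URL makes no difference to the answer. A small sketch, using the placeholder domain names from the example above:

```python
import urllib.robotparser

# One shared robots.txt for all parked domains, including a Host line.
ROBOTS = """\
User-agent: *
Disallow: /sachgebiet/
Host: https://www.mysite.en/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# The parser has no notion of which domain served the file; the
# Disallow rule matches by path alone, so every parked domain gets
# the same answer and the Host line changes nothing.
for host in ("www.mysite.en", "www.mysite.de", "www.mysite.ru"):
    print(host, rp.can_fetch("*", f"https://{host}/sachgebiet/page"))
# prints False for all three hosts
```

Whether Googlebot or the Yandex bot behave the same way is their secret, but at least in this parser a single shared file cannot be made domain-specific.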
Only Brin knows.
Or probably he doesn't.
What we know for sure is that the generation of robots.txt from the DB must be disabled. Which function or PHP file is responsible for this?