Unique implementation of robots.txt and .htaccess on sub-blog

Topics: Business Logic Layer
Jul 23, 2013 at 10:36 PM
Edited Jul 23, 2013 at 10:43 PM
How can I have a unique robots.txt file for different domains running on one instance of BlogEngine as sub-blogs?

Currently, all of the following are served from the same file:

is the same as


Is there a way for the sub-blogs to each have their own copies of these files?
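To illustrate the kind of behavior I'm after: in principle, the IIS URL Rewrite module could map requests for robots.txt to a different physical file per host. The sketch below is only an idea, not something I have working; the host name and target file name are placeholders, and it assumes the URL Rewrite module is installed on the server.

```xml
<!-- web.config sketch: serve a per-host robots.txt via IIS URL Rewrite.
     "subblog1.example" and "robots-subblog1.txt" are placeholders. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Per-host robots.txt" stopProcessing="true">
        <match url="^robots\.txt$" />
        <conditions>
          <!-- Only fire for requests arriving on this sub-blog's domain -->
          <add input="{HTTP_HOST}" pattern="^(www\.)?subblog1\.example$" />
        </conditions>
        <!-- Rewrite (not redirect) to a file maintained per sub-blog -->
        <action type="Rewrite" url="robots-subblog1.txt" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Something equivalent inside BlogEngine itself (a handler that keys off the host header) would work just as well, if it exists.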

BTW: This has a few SEO implications, such as being able to remap pages that are already indexed to new content, and controlling crawler access to the individual sub-blogs as needed.