Unique implementation of robots.txt and .htaccess on sub-blog

Topics: Business Logic Layer
Jul 23, 2013 at 11:36 PM
Edited Jul 23, 2013 at 11:43 PM
How can I have a unique robots.txt file for different domains running on one instance of BlogEngine.NET as sub-blogs?

Currently all of the sub-blogs are served the same files, i.e.:
www.site1.com/robots.txt
www.site1.com/.htaccess

is the same as

www.site2.com/robots.txt
www.site2.com/.htaccess

Is there a way to make the sub-blogs serve their own copies of these files?
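
Something along these lines is what I'm imagining for robots.txt: a rough sketch of an ASP.NET IHttpHandler keyed on the request's host name. The per-host file names (robots.www.site1.com.txt etc.) are just made up, and the handler would still need to be mapped to the robots.txt path in web.config's system.webServer/handlers section so IIS hands the request to it instead of serving the static file.

using System.IO;
using System.Web;

public class RobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Pick a per-host file, e.g. robots.www.site1.com.txt (made-up naming).
        string host = context.Request.Url.Host.ToLowerInvariant();
        string path = context.Server.MapPath("~/robots." + host + ".txt");

        // Fall back to the shared robots.txt when no per-host file exists.
        if (!File.Exists(path))
            path = context.Server.MapPath("~/robots.txt");

        context.Response.ContentType = "text/plain";
        context.Response.WriteFile(path);
    }
}

The same host-keyed lookup would presumably cover .htaccess as well, assuming something in the pipeline is actually serving it.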

BTW: This has a few SEO implications, like being able to remap pages that are already indexed to new content and to control crawler access to the individual sub-blogs as needed.
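
On the remapping point specifically, a host-aware 301 could probably be done with an HttpModule along these lines (the host and paths below are placeholders):

using System;
using System.Web;

public class RemapModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpRequest request = app.Context.Request;

            // Placeholder host/paths: remap an already-indexed URL on one
            // sub-blog only, leaving the other domains untouched.
            if (request.Url.Host.Equals("www.site1.com", StringComparison.OrdinalIgnoreCase) &&
                request.Path.Equals("/old-post.aspx", StringComparison.OrdinalIgnoreCase))
            {
                // RedirectPermanent issues a 301 and needs .NET 4; on older
                // versions, set Response.StatusCode = 301 and write the
                // Location header by hand.
                app.Context.Response.RedirectPermanent("/new-post.aspx");
            }
        };
    }

    public void Dispose() { }
}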