There is an easy solution to this :)
Just go to your default.aspx.cs page
and put this code in there:
else if (Request.RawUrl.ToLowerInvariant().Contains("?page="))
{
    base.AddMetaTag("ROBOTS", "NOINDEX, NOFOLLOW");
}
By changing the ToLowerInvariant().Contains("?page=") check you can also filter other URLs, such as tag and category pages, and add noindex, nofollow to those too (see the sketch below).
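Here is a rough sketch of how those checks could sit together in Page_Load in default.aspx.cs (the "?tag=" and "/category/" patterns are just examples I am assuming; swap in whatever URLs your blog actually generates):

protected void Page_Load(object sender, EventArgs e)
{
    // Lower-case the URL once so the Contains checks are case-insensitive.
    string url = Request.RawUrl.ToLowerInvariant();

    // Paged lists, tag pages, and category pages: tell crawlers to skip them.
    // "?tag=" and "/category/" are assumed patterns - adjust them to your blog.
    if (url.Contains("?page=") || url.Contains("?tag=") || url.Contains("/category/"))
    {
        base.AddMetaTag("ROBOTS", "NOINDEX, NOFOLLOW");
    }
}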
See it in action live at:
In my opinion, pages like ?page=2, tag pages, and category pages are really meant just for humans to use as navigation tools, not for search engines to index.
The only things that really should be indexed are your /pages and /posts.
The rest have no need to be indexed.
So in your robots.txt file you can also allow /pages and /posts and then disallow all of the other folders.
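As a rough robots.txt sketch of that idea (the /posts/ and /pages/ folder names are assumptions, so rename them to whatever your blog actually uses; the Allow and $ rules rely on the extended syntax that Google and Bing support, where the longest matching rule wins):

User-agent: *
# Assumed folder names - change these to match your blog's URLs
Allow: /posts/
Allow: /pages/
# Keep the home page itself crawlable ($ marks the end of the URL)
Allow: /$
# Block crawling of everything else
Disallow: /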
This will solve the issue of duplicate content because only the actual pages and posts are being indexed, and that is it.
Having post lists, paged lists of posts, keywords, and categories indexed will just dilute the value of your actual posts and pages.
If visitors want those, they can use the widgets on your website to get to them.
When a search engine directs them to one of these pages, like ?page=3, there is a 90% chance they are not being sent to what they were looking for anyway, or they have to dig through the list of posts or pages to get to it.
It is much easier to direct them to the actual page or post they are looking for in the first place instead of having them play Easter egg hunt to find it. :)
Have a great day!