Turn off paging?

Topics: ASP.NET 2.0
Feb 7, 2014 at 8:26 PM
Is there a way to turn off paging at the bottom of the blog? The paging is causing some SEO issues for us, and I wanted to turn this off, if possible.
Feb 7, 2014 at 10:16 PM
I guess setting posts per page in Settings to something large, like 10000, would be the same as turning it off.
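If you would rather set that value in code, it is exposed through the BlogSettings singleton in BlogEngine.Core. A minimal sketch, assuming the PostsPerPage property and Save() method match your BE version (double-check against your source):

using BlogEngine.Core;

// Effectively disables paging by making a single "page" hold everything.
// Same value as Settings -> "Posts per page" in the admin UI.
BlogSettings.Instance.PostsPerPage = 10000;

// Persist the change.
BlogSettings.Instance.Save();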
Feb 9, 2014 at 10:07 PM
nwebster wrote:
Is there a way to turn off paging at the bottom of the blog?

The paging is causing some SEO issues for us, and I wanted to turn this off, if possible.
Hi Nwebster,

How is the paging causing SEO issues for you?

Just wondering what they are, to see if there is a way to fix them instead of just turning paging off :)
Feb 9, 2014 at 10:59 PM
Our posts tend to be cyclical year over year. Because Google picks up the old posts from previous years through the paging, it dings us for repeating content. We just really want to have the last handful of posts visible.
Feb 9, 2014 at 11:22 PM
Edited Feb 10, 2014 at 12:02 AM
Hi Nwebster,

"We just really want to have the last handful of posts visible."

Do you mean you don't want your older posts displayed at all?

Or do you mean you only want the last handful visible in the post list, leaving the rest of the "older" posts on the website but not showing up in the paging or post list?

In other words, "humans" would not reach the older posts through the paging, but could still reach them through Google and other search engines?

To stop Google and other search engines from indexing the paged views, you can add this code to the

default.aspx.cs page in your root directory:

else if (Request.RawUrl.ToLowerInvariant().Contains("?page="))
    base.AddMetaTag("robots", "noindex, nofollow");
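For context, that else if is meant to slot into the URL-dispatch chain already in default.aspx.cs's Page_Load. A rough sketch of where it would sit (the surrounding branches are illustrative, not the exact stock file):

protected void Page_Load(object sender, EventArgs e)
{
    string url = Request.RawUrl.ToLowerInvariant();

    if (url.Contains("/category/"))
    {
        // ... existing category handling from the stock page ...
    }
    else if (url.Contains("?page="))
    {
        // Paged view of the post list: tell crawlers not to index it.
        base.AddMetaTag("robots", "noindex, nofollow");
    }

    // ... rest of the stock Page_Load ...
}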

Then in your robots.txt (Disallow rules need to sit under a User-agent group; note the * wildcard is an extension that Google honors but not every crawler does):

User-agent: *
Disallow: /?page=*

These would stop Google from indexing your posts from the paging.

To "enforce" that all of the previous pages index from Google gets re-index to your default page put this in instead:

else if (Request.Browser.Crawler && Request.RawUrl.ToLowerInvariant().Contains("?page="))
{
    base.AddMetaTag("robots", "noindex, nofollow");

    // Permanent redirect: send the crawler to the default page instead.
    Response.StatusCode = 301;
    Response.AppendHeader("Location", Utils.RelativeWebRoot + "default.aspx");
    Response.End();
}

(Request.Browser returns the HttpBrowserCapabilities for the current request, which is where the Crawler flag from HttpCapabilitiesBase is exposed.)

Source: http://msdn.microsoft.com/en-us/library/system.web.configuration.httpcapabilitiesbase.crawler%28v=vs.110%29.aspx

The code above is just to show what can be done :)

You would only want to apply the noindex and redirect when it is a bot crawling the website; for humans you don't need to do anything.

This way, if Google has already indexed one of your paged pages --> YourWebsite.Com?page=1

on its next visit it is told not to index or follow that page, and to use your default page in its place as a permanent move (that is what the 301 status code means).
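If you want to sanity-check that behavior, here is a quick console sketch using HttpWebRequest with a crawler-style User-Agent (the URL is a placeholder, and whether ASP.NET flags a given User-Agent as a crawler depends on your browser definition files):

using System;
using System.Net;

class CheckRedirect
{
    static void Main()
    {
        // Placeholder URL; substitute one of your own paged URLs.
        var req = (HttpWebRequest)WebRequest.Create("http://yourwebsite.com/?page=2");
        req.UserAgent = "Googlebot/2.1 (+http://www.google.com/bot.html)";
        req.AllowAutoRedirect = false; // so the 301 itself is visible

        using (var resp = (HttpWebResponse)req.GetResponse())
        {
            Console.WriteLine((int)resp.StatusCode);     // expect 301 for crawlers
            Console.WriteLine(resp.Headers["Location"]); // expect the default page
        }
    }
}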

So doing the above will fix your SEO issues and lets you set whatever amount you need in your paging instead of 10,000 :)

That would be far too many posts to load on a visit to the main website.

So that should do it!

If you have any questions please ask.

Hope this helps

Have a great day!

Brian Davis

Feb 10, 2014 at 8:03 PM

Thanks. That will work perfectly for our needs!


Feb 16, 2014 at 7:15 AM
Edited Feb 18, 2014 at 5:06 AM
Hi Nwebster and anyone else who is interested,

Update: Tue 18 Feb 2014

Round 1 of SEO improvements got merged into the local build of BE.

The current fork is below:

I am working on some more SEO improvements, starting with the default.aspx.cs page and then moving on to different areas of BlogEngine.net.

If you have any additional ideas for SEO improvements that you would like to see added to BlogEngine.Net, please let me know by replying to this forum post.

I am going to be spending some time working on SEO improvements for BE and updating the fork with those changes.


So if you have a "Wish List" of SEO features you would like to see in BE let me know here.

The changes can be tested on a live server at:


Stay tuned for more updates.

Have a great day!

Brian Davis