
10k pages crawled on a site with 30 - Is there a bug?

Topics: Business Logic Layer
Feb 27, 2014 at 6:50 PM

I'm using BE 2.7, and various tools are reporting a ton of pages that should not be there. ...

Why are these pages being generated by BE? It's causing havoc with SEO and so on ...

Feb 27, 2014 at 6:59 PM
Edited Feb 27, 2014 at 11:00 PM
One step closer: it seems the cause of this is in the creation of sub-blogs. Somewhere, pages from other sub-blogs are being cross-connected, causing bogus URLs.

BTW: after further checks, this is not related to the sub-blogs.
Feb 27, 2014 at 10:59 PM
Edited Feb 27, 2014 at 11:00 PM
Just checked the 2.9 version and it seems to be addressed in that release. Would love some input on this.

The issue can be replicated on BE 2.7 and BE 2.8
Feb 28, 2014 at 1:15 AM
Feb 28, 2014 at 4:54 PM
Hey rtur,

Thanks for the suggestion, I'm testing it now.

Even with this implementation, the issue in the software remains. Why is the system generating access to phantom pages that do not exist?


This also messes up Google Analytics, as it reports traffic to pages that are not real. Is this a bug in version 2.8 and below, and is it something that has been addressed in BE 2.9? It seems that 2.9 reports the page as not existing.
Feb 28, 2014 at 6:34 PM
Is the solution as simple as redirecting to an error page when the page= parameter exceeds the available pages?

Updated PostList.ascx.cs BindPost() function with

            //jk 2014-02-28 -- added to hide paging links when page= exceeds available post count
            if (stop < 0 || stop + index > visiblePosts.Count)
            {
                this.hlPrev.Visible = false;
                this.hlNext.Visible = false;
            }
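Hiding the links only cleans up the UI; if the goal is to make out-of-range page= values stop registering as real pages, the same bounds check could send crawlers away instead. A minimal sketch, assuming the site has a 404 page at "~/error404.aspx" (substitute the actual error page path) and reusing the stop/index/visiblePosts variables from the snippet above:

            // Sketch only: when the requested page is beyond the available posts,
            // redirect to the error page instead of rendering an empty post list.
            // Redirect(url, false) plus CompleteRequest avoids the
            // ThreadAbortException that Response.Redirect(url) would throw.
            if (stop < 0 || stop + index > visiblePosts.Count)
            {
                Response.Redirect("~/error404.aspx", false);
                Context.ApplicationInstance.CompleteRequest();
                return;
            }

For SEO specifically, serving a real 404 status (Response.StatusCode = 404) may be preferable to a redirect, since a redirect tells crawlers and Analytics that the phantom URL exists and points somewhere.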