SEO Recommendation for META tags and URL structure

Topics: Business Logic Layer
Jul 23, 2013 at 11:59 PM
Edited Jul 24, 2013 at 12:02 AM
I'd like to suggest an extension to the page and post admin section. The title tag and the description meta tag play a large role in SEO, so having the ability to control their values independently of the post title and excerpt fields would be a big win.

Both tags have length limitations and are usually customized for search engines.
  • The <title> tag is usually quite different from the title of the post, which is currently used for that purpose. It is also recommended to keep it to roughly 70 characters.
  • The description meta tag likewise differs from the blog excerpt and is limited to about 155 characters, whereas the excerpt serves as a reader-facing summary of the post content, depending on its setup.
  • The keywords meta tag is no longer used and is treated as spam, but it is connected to the keywords field, which is also used to display tags.
  • One last thing I noticed is that the system generates duplicate content depending on how pages are accessed, e.g. through a category or a tag. This should not be the case. I added code that marks those pages as "nofollow", but there may be a better implementation of that,
or better yet, perhaps implementing the rel="canonical" tag?
https://support.google.com/webmasters/answer/139394?hl=en

Just some thoughts that would solve a few key issues for SEO purposes.
Jul 26, 2013 at 9:38 AM
Hi Vanadiumtech,

I made a custom version of BE that includes the SEO features you mention above, and more.

You can test drive it yourself

Login: admin/admin

http://demo.bloggersonline.com/

After you log in, try creating a new post.

You will notice some new text boxes that are not in the default BE.

There is a meta post title field, so you can have two titles per post:
  1. The meta title shown at the top of the user's browser
  2. The post/page title shown on the page itself
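A rough sketch of how the two titles could be wired up in ASP.NET (the field values and the `PostHeading` control name here are hypothetical illustrations, not the actual implementation):

```csharp
// Hypothetical sketch: a separate meta title feeds the <title> tag,
// while the regular post title is rendered on the page itself.
protected void Page_Load(object sender, EventArgs e)
{
    string metaTitle = "Short, ~70-char SEO title";   // from a custom field (assumed)
    string postTitle = "The full on-page post title"; // the normal post title

    // Browser tab / <title> tag
    Page.Title = metaTitle;

    // On-page heading (assumes a Literal control named PostHeading exists)
    PostHeading.Text = postTitle;
}
```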
There are also other SEO features like

<meta name="robots" content="noindex, nofollow" />


This is added to all non-post/non-page URLs, such as keyword, category, and similar listing pages.

Those listing pages also get custom descriptions and wording of their own instead of the default ones.

You can see those in action here:

http://kbdavis07.info
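The robots meta tag above could be injected along these lines (a minimal sketch; the URL checks are assumptions for illustration, not the actual implementation):

```csharp
// Hypothetical sketch: add a robots meta tag to listing-style pages
// so search engines do not index them.
protected void Page_Load(object sender, EventArgs e)
{
    string raw = Request.RawUrl.ToLowerInvariant();

    // Listing URLs we don't want indexed (patterns assumed for illustration)
    if (raw.Contains("?tag=") || raw.Contains("?category="))
    {
        HtmlMeta robots = new HtmlMeta();
        robots.Name = "robots";
        robots.Content = "noindex, nofollow";
        Page.Header.Controls.Add(robots);
    }
}
```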

Another feature to add in the future is to detect search engines and serve them 301 redirects when they try to access
non-post/non-page URLs such as ?tag, ?category, and paged list URLs like page=1.

This will enforce the non-indexing of those URLs.

For now, we have to rely on the search engines honoring the robots.txt rules and the on-page robots meta rules.
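The planned 301 redirect could look something like this (a hypothetical sketch; the URL patterns and the redirect target are assumptions, not the actual implementation):

```csharp
// Hypothetical sketch: permanently redirect listing URLs so search
// engines consolidate on the real posts and pages.
protected void Page_Load(object sender, EventArgs e)
{
    string raw = Request.RawUrl.ToLowerInvariant();

    // Listing-style URLs we don't want indexed (patterns assumed)
    if (raw.Contains("?tag=") || raw.Contains("?category=") || raw.Contains("?page="))
    {
        // RedirectPermanent issues an HTTP 301 (available since .NET 4.0)
        Response.RedirectPermanent("http://demo.bloggersonline.com/");
    }
}
```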
Jul 26, 2013 at 4:43 PM
Regarding the canonical tag, I would also implement it for default.aspx:

demo.bloggersonline.com
demo.bloggersonline.com/default.aspx

Both URLs deliver the same content, so "default.aspx.cs" should add the canonical tag inside its Page_Load() method:
        if (Request.RawUrl.ToLowerInvariant().Contains("default.aspx"))
        {
            // The canonical href must be an absolute URL, including the scheme
            HtmlLink canonical = new HtmlLink();
            canonical.Href = "http://demo.bloggersonline.com/";
            canonical.Attributes["rel"] = "canonical";
            Page.Header.Controls.Add(canonical);
        }
Just some experience I had:
While I support (and implemented a while ago) your suggestions, I still ranked behind a crappy site that gets all of the above (and much more) wrong. The reason I ranked lower was a bad "dwell time". The crappy site got it right even though they didn't do it on purpose.
Right now (mid-2013) it doesn't really matter whether you have a crappy site or not, as long as Google thinks the user engages with your content and doesn't come back to the search results.

The end of the story is 7 YouTube results ranking 1-7 :D
Jul 26, 2013 at 6:12 PM
Hi Moddie,

Thanks for the canonical code you provided.

When it comes to SEO, there are hundreds of different factors that affect website ranking.

As for Google, I worked for them as a Search Engine Evaluator, and I know for a fact that they use humans to review some of the websites they index.

The #1 thing Google is looking for now is the "usefulness" and "utility" a website provides to its users.

That said, you can have your on-page SEO 100% correct, with proper use of meta tags, the right keyword density per page, and so on.

Even with correct on-page SEO, your website can still rank lower than websites that get it wrong.

That can happen when those websites have faster server response times, unique authoritative content, or provide more usefulness and utility than the "properly" SEO-optimized sites.

Google collects many different "signals" about websites, and they come from many different sources.

They are all weighted and factored together to produce the final website rank.

You also need off-page SEO optimization.

This is where, I'd bet with 95% confidence, the "crappy" website you are referring to has more off-page SEO.

Off-page SEO covers other websites that link to yours: not only how many, but specifically which types of websites link to you.

Say you have a website purely about ASP.NET, C#, and BlogEngine.NET.

In Google's eyes, links from websites whose content is only about PHP, with nothing about BlogEngine, would be valued less than
links from websites whose content is also about ASP.NET, C#, and BlogEngine.NET.

The further a linking website's content strays from your website's content, the lower the value its link adds to your overall ranking score.

If that same website has links from shopping websites, or other sites that have nothing to do with programming, technology, or the internet,
it is possible for those links to have a negative effect on your overall ranking score.

This is why it is very important to check exactly who is linking to you, which link they are using, and which anchor text they use to link to
your site, because all of those factors also have an effect.

The takeaways for SEO are the following:
  1. Think like the user/reader of your website
If your website is unique and provides usefulness and utility to your users, Google and others are going to like it :)
  2. Content freshness
You must have unique, original content that is fresh and added periodically.

You simply cannot post 20 or 30 posts at once and then never publish new ones or update the ones you already have.
  3. Social media
You have to share your content on social websites and so on.
  4. On-page SEO
  5. Off-page SEO
If you follow these steps, in that order of priority, you will do well.

With recent changes and current trends, on-page and off-page SEO are starting to be valued less in the overall ranking calculation.
Even though they are now valued less than before, they are still important.

Steps #4 and #5 can be viewed as squeezing out the remaining SEO value points left in your website.

Nowadays you cannot do just parts of SEO optimization; you have to cover the entire picture.
Jul 29, 2013 at 2:44 PM
@kbdavis07: Wow! Thank you for all the useful insights :)