8 Essential Search Engine Optimization Features For Any CMS
Title & Meta Tags
Yeah, this one is pretty obvious, but it is also highly overlooked. On a number of occasions I have come across CMS setups without this basic control. At the very least you need control over the Title tag and the Description and Keywords Meta tags for static pages. For dynamic pages, there should be some level of programming in place which will auto-generate these critical tags without duplicating the tag content found on other pages.
Optionally, you will also have a Robots Meta tag, which can be useful if you want to manually exclude certain pages from search indexes. If your CMS supports it, you should install character limiters on the form fields for the Title tag and the Description Meta tag. This will ensure that your titles and descriptions are not too long.
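In practice, the tags in question sit in the page's head and look something like this (all values here are placeholders, not recommendations):

```html
<head>
  <!-- Title tag: arguably the most important on-page element -->
  <title>Blue Widgets | Example Store</title>

  <!-- Description Meta tag: often shown as the snippet in search results -->
  <meta name="description" content="Shop our full range of blue widgets with free shipping.">

  <!-- Keywords Meta tag: carries little weight today, but most CMSes still expose it -->
  <meta name="keywords" content="blue widgets, widget store">

  <!-- Robots Meta tag: here manually excluding the page from search indexes -->
  <meta name="robots" content="noindex, follow">
</head>
```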
Canonical Meta Tag
In February 2009, Google introduced the Canonical Meta tag (technically a link element, rel="canonical"). This important tag protects us from "duplicate content" pages. Duplicate content issues occur when the same content is accessible from multiple URLs. (Google has made it very clear that it doesn't like duplicate content on websites!)
Two theoretical pages, for example the same article reachable both with and without a tracking parameter in the URL, may return identical content. By adding a Canonical Meta tag to one of the pages, webmasters can make clear to the search engines which page should be credited as the original. According to Google, the canonical link element is not considered a directive, but a hint that the web crawler will "honour strongly". By using canonical tags, Google knows that any reference to a duplicate URL should be credited to the canonical one, and you keep Google from indexing the two different URLs as duplicate content.
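The tag itself is a one-liner placed in the head of each duplicate page (the URL below is a placeholder):

```html
<!-- On the duplicate page(s), pointing the engines at the original -->
<link rel="canonical" href="http://www.example.com/blue-widgets">
```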
Preferred Domain
The preferred domain is the one that you would like used to index your site's pages (sometimes this is referred to as the canonical domain). Links may point to your site using both the www and non-www versions of the URL.
The preferred domain is the version that you want used for your site in the search results. A CMS should have the ability to set the preferred domain in the event that this cannot be controlled at the hosting/server level.
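When the CMS can't set this, the preferred domain is often enforced with a server-level 301 redirect instead. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled and example.com stands in for your domain:

```apache
RewriteEngine On
# Permanently redirect non-www requests to the www (preferred) domain
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```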
Robots.txt
Your Robots.txt file is what tells the search engines which pages to access and index on your website and which pages not to. For example, if you specify in your Robots.txt file that you don't want the search engines to be able to access your thank-you page, that page won't show up in the search results and web users won't be able to find it through search. Keeping the search engines from accessing certain pages on your site is essential for both the privacy of your site and for your SEO. A CMS may simply grant access to the Robots.txt file as a whole, or allow individual pages to be added to the file via a checkbox on a given node, for example.
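A simple Robots.txt along those lines might look like this (paths and domain are placeholders):

```txt
# robots.txt - applies to all crawlers
User-agent: *
Disallow: /thank-you/
Disallow: /admin/

# Optionally point crawlers at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```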
XML Sitemap
The great thing about most CMS offerings out there is their ability to create XML feeds from page nodes. An SEO-friendly CMS will automatically create a sitemap.xml file and keep it up to date as new content is added. XML sitemaps are supported by all the major search engines (Google, Yahoo! and MSN), so they can more intelligently crawl the site through both top- and deep-level links.
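For reference, a minimal sitemap.xml entry follows the sitemaps.org protocol (URL and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blue-widgets</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```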
REL Attribute
The REL attribute specifies the relationship between the current document and the linked document. Browsers do not use this attribute in any way. However, search engines can use it to get more information about a link. An SEO-friendly CMS should have the ability to set the REL attribute value on your links.
When it comes to SEO, the nofollow value is by far the most useful. It is used by Google to specify that the Google search spider should not follow that link. There are other values that may be useful to your website:
|Value|Description|
|---|---|
|alternate|An alternate version of the document (e.g. a print page, translation or mirror)|
|stylesheet|An external style sheet for the document|
|start|The first document in a selection|
|next|The next document in a selection|
|prev|The previous document in a selection|
|contents|A table of contents for the document|
|index|An index for the document|
|glossary|A glossary (explanation) of words used in the document|
|copyright|A document containing copyright information|
|chapter|A chapter of the document|
|section|A section of the document|
|subsection|A subsection of the document|
|appendix|An appendix for the document|
|help|A help document|
|bookmark|A related document|
|nofollow|Used by Google to specify that the Google search spider should not follow the link (mostly used for paid links)|
|license|Link to copyright information for the document|
|tag|A tag (keyword) for the current document|
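In markup, the nofollow value is simply set on the anchor itself (the URL is a placeholder):

```html
<!-- The spider will not follow this link or pass it credit - typical for paid links -->
<a href="http://www.example.com/sponsor" rel="nofollow">Our sponsor</a>
```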
Custom URLs
When a page is created, it's important that the URL (website address) file name contains relevant keywords. Your SEO-friendly CMS needs to have a configurable field to allow the URL to be customized. Why? Search engine algorithms place weight on the URL of the page when determining relevance. Your site stands a much better chance of ranking for a key phrase when you use small, tightly "themed" groups of words that are incorporated throughout the page.
301 Redirects
It is important that your CMS allow for the creation and administration of 301 redirects. In the event that you rename or delete pages, you will want to avoid losing all the link juice you may have earned. Your CMS may automatically create a 301 redirect when pages are renamed or deleted. If not, at a basic level, you need to be able to create them manually.
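If the CMS can't manage this for you, a manual 301 is a single line of Apache configuration (paths and domain below are placeholders):

```apache
# Permanently redirect a renamed page to its new URL
Redirect 301 /old-page.html http://www.example.com/new-page
```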
Why? Search engines like the keywords in the URL, plus people tend to prefer friendlier URLs in search results, and these types of URLs tend to get higher click-through rates.
Now For YOUR Feedback
These are very important features if you want your website to be search engine friendly.
Please add your feedback! Are you familiar with the features noted above? If not, do these make sense? If yes, have we left any features off the list?