Updated December 1st, 2021 — version 307

The UnFair Advantage Book
Winning the Search Engine Wars


Chapter Six
On-page, Internal Ranking Factors

Back in Chapter Three you learned about some of the ranking factors. We used hypothetical dials to illustrate their relative importance. In some cases the dial-maximums were set high, in other cases the dial-maximums were set low. We even showed you dials that could register a negative ranking score. In this chapter you'll learn about all of the important internal (on-page) ranking factors and their relative importance on the current algorithm dial.

Let's start by defining Internal Ranking Factors. These are variable page elements found within your site's web pages. You have total control over all of these elements since they exist completely within the realm of your website. The internal factors covered in this chapter should be regarded as essential elements of your site's optimized web presence. Any internal factor NOT covered here can be considered insignificant.

  • The Title Tag

The title tag has always been, and still is, the single most important internal ranking element. Within the source code of your web page the title tag looks like this: <title>Your title tag keywords go here</title>.

The title tag is intended to tell the search engine what the page is about. That's why you should put your most important keywords in the title tag. If your page topic is about steel rebar, then the keywords steel rebar should be included in the title tag.

Below is an example of the title tags used by a real-world company that managed to rank 5 of their web pages in the Top 10 search results for the keyword steel rebar:

<title> Steel Rebar - Steel Reinforcing Bar | Harris Supply Solutions </title>
<title> Steel Rebar Sizes - Steel Rebar Stock | Harris Supply Solutions </title>
<title> #4 Rebar - #4 Reinforcing Bar | Harris Supply Solutions </title>
<title> Steel Rebar Supplier - Steel Reinforcement Supplier | Harris Supply Solutions </title>
<title> #6 Rebar - #6 Reinforcing Bar | Harris Supply Solutions </title>

Notice that NONE of the title tags are identical. This is important. You should never have any duplicate title tags anywhere on your website! Duplicate title tags are confusing to search engines and viewed as an error which can negatively affect your search rankings.
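Because duplicates are easy to introduce on a large site, it's worth checking for them programmatically. Below is a minimal sketch; the `find_duplicates` helper is our own, and the titles are the Harris Supply Solutions examples above:

```python
# Sketch: detect duplicate title tags across a site's pages.
from collections import Counter

def find_duplicates(titles):
    """Return the title strings that appear more than once (case-insensitive)."""
    counts = Counter(t.strip().lower() for t in titles)
    return [title for title, n in counts.items() if n > 1]

titles = [
    "Steel Rebar - Steel Reinforcing Bar | Harris Supply Solutions",
    "Steel Rebar Sizes - Steel Rebar Stock | Harris Supply Solutions",
    "#4 Rebar - #4 Reinforcing Bar | Harris Supply Solutions",
    "Steel Rebar Supplier - Steel Reinforcement Supplier | Harris Supply Solutions",
    "#6 Rebar - #6 Reinforcing Bar | Harris Supply Solutions",
]

print(find_duplicates(titles))  # → [] (all five are unique)
```

In practice you'd feed this the titles crawled from your own site rather than a hand-typed list.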

Generally speaking, you should limit your title tags to 51 characters. This is intended as a guideline, not a hard rule. You might notice that the fourth title tag listed above exceeds the 51-character limit. But it should also be noted that the most important keywords are arranged toward the beginning of each title tag, just as they should be.

Always remember that the content within your title tag is frequently the text that appears as the link in the search results.

Title Tag Example

This screenshot illustrates how Google pulls content from the title tag to provide your descriptive link. If you compare these links to the title tags listed above, you'll see how Google adapts them in whatever manner they deem appropriate. In many cases they'll simply shorten them, while in other cases they'll delete a portion. Sometimes they'll pull content from within your page that matches the search query and add it to the descriptive link. In this screenshot you may notice that Google truncated the link description in the second result and deleted the middle portion of the title tag in the fourth result, which also happens to exceed the 51-character guideline.

By the way, it isn't actually necessary to put your company name in your title tags, although Google tends to like it there. In some cases Google will add it for you if it isn't already there. We mention this because it can be puzzling to see content that isn't in your title tag appear in descriptive links as though it is. That's not Google messing with your source code. It's simply their way of making your descriptive link more relevant to whatever search query is generating the results. So don't tear your hair out trying to figure out what's wrong with your title tag. Nothing is wrong. Google routinely makes adjustments that it deems necessary to "provide a good user experience" for its site visitors. We've even seen cases where they'll pull content from a headline tag or an inbound link's anchor text if the page's title tag is lacking content relevant to the search query. Their goal is to make the descriptive link match the search query as closely as possible. And fortunately in most cases this works to your benefit because people are more likely to click links that match their search queries.

  • The Meta Description Tag

While not nearly as important to ranking as the title tag, the Meta Description tag should never be overlooked. It's frequently the source from which Google pulls the text that's displayed directly below your descriptive link. Think of it as the enticement for clicking if your descriptive link isn't already compelling enough.

Within the source code, it looks like this:

<meta name="description" content="A good meta description tag entices the searcher to click the link by describing what they'll find when they view the page."/>

Therefore, the Meta Description tag is important because it can affect click-through rates once your pages are actually found in the search engines. Remember, it doesn't do any good to rank well if your links don't get clicked.

Meta Description tag snippets

On the right we see the Meta Description tag was used fully or partially in three out of five of the page descriptions.

Take note that Google also pulled snippets of text from the body content located at the beginning of two of the pages and near the end of one of the pages. This tells us that a web page's opening and closing text content matters with regard to SERP descriptions and can be used to entice clicks.

Using your keyword in the Meta Description tag is unimportant with respect to ranking. However, it IS important for click-throughs. Since search engines like to display the keywords that were searched, they will often grab a snippet of text from the page if the Meta Description tag doesn't contain the keyword that was searched for.

  • Keywords in the Domain Name as a Ranking Factor

The importance dial has been turned down considerably on keywords in the domain name. This is especially true of domains composed of highly popular generic keywords like buycheapairlinetickets.com. These look spammy to Google so they are no longer favored in terms of ranking well. Even if the domain name is an exact match with the keyword in the search, Google takes into account the content quality as well as the quality of the incoming links as they evaluate the relevance of the page. If either the links or the page content are of low quality, then having an exact match domain (EMD) will not help very much, if at all, in terms of ranking well.

Instead, more and more favor is being lavished on brands. It helps considerably to have your unique brand name as your keyword in the domain name. And, if you can manage to get people searching for what you're selling by using your brand name, then you have the ultimate advantage. That's what you should strive for in the long run.

  • Headline Tags

Having your best keywords in your <H1> (headline) tag is important. Google looks for keywords in the H1 tag and oftentimes pulls your snippet from this area of your page. We recommend using the H1 tag only once per page and to place it somewhere near the beginning of the body text rather than in the left or right rail content where we often see the navigation portion of the page.

Ideally your keyword should appear early in the headline but keep in mind the headline must read well to site visitors or else it will hurt sales. So the rule is to use a headline intended to attract attention to your product first, and then cater to Google's algorithm second.

By the way, as you probably know, an H1 headline can appear on a page as disturbingly HUGE. In most cases, the font size is too large in terms of creating aesthetically pleasing web page design. The work-around involves CSS (cascading style sheets). By using CSS to adjust the font size of the headline to align with the design goals of the page, you can address both design and SEO concerns at the same time. And, in case you are wondering, using CSS to reduce your H1 font size is perfectly ok with Google.
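As a sketch, a CSS rule like the following (the sizes are illustrative, not a recommendation) brings the H1 down to a design-friendly scale while the markup remains an H1 for the search engines:

```css
/* Shrink the default H1 so it fits the page design; the tag itself stays
   an H1 in the markup, which is what the search engines index. */
h1 {
  font-size: 1.4em;   /* instead of the browser default of roughly 2em */
  margin: 0.5em 0;
  font-weight: bold;
}
```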

While the H1 tag can help your ranking, H2 and H3 tags are less effective. Regardless, it's ok to use them, they might help a little and they certainly won't hurt. But you shouldn't expect a boost of any significance from keywords in headline tags other than the H1.

  • On-page Anchor Text

The keyword text within your on-page links — the anchor text — provides a bit of help, ranking-wise. But, since Google knows this is easily manipulated, they don't turn the dial up very high on the algorithm. Regardless, it will usually help and not hurt your ranking efforts provided that you do not abuse the strategy. If Google thinks your on-page anchor text is there to manipulate rankings, they can penalize you. Therefore, we recommend that you limit your keywords within your anchor text to only a few per page. Any more than that could be counterproductive.

  • Keywords in Body Text

As you might imagine, having your keywords in your body text (the page content) is also important. This is what Google indexes and uses to determine whether a page is relevant to the search query. Among on-page ranking indicators, the keywords found within page content are typically a medium to strong factor on the algorithm dial.

However, it's a bad idea to stuff or repeat an excessive number of keywords. Doing so will get you penalized. It's best to sprinkle in your keywords naturally in ways that sound comfortably conversational when you read it out loud. Otherwise your page's quality score will suffer and its ranking will be hurt.

It's best to place your keywords toward the beginning of the body copy. It can also be beneficial to place them toward the end of the text as well. If it seems natural to use them in other locations, then do so. Just be sure to avoid using them in the way that a used car salesman might overuse your name when trying to sell you something. If it sounds a little creepy when you read it out loud, then you've probably repeated your keywords too frequently.

  • Images

Images are an often overlooked ranking factor. While it's true that search engines can't "see" images, they can read the filenames and the alt attribute (commonly called the Alt tag). Therefore you should name your image files using applicable keywords, like keyword.jpg.

Remember that some people use Image Search as their primary search vehicle. In such cases you'll want your images to rank well because top ranking images are another great way to drive traffic to your site. The Alt tag provides an opportunity to associate keywords with your images.

<img src="/images/BMWZ4M.jpg" alt="BMW Z4 M Series Roadster Convertible">

However, you should NOT overuse this strategy. Using keywords to name images is acceptable to Google. Adding a brief description of the image in the Alt tag is fine. But keyword stuffing your image Alt tags will get you penalized.

You can help the engines index your images more completely by using an XML image site map. For in-depth information on this topic, take a look at these articles (requires SEN Membership):

  • Keyword Density

Keyword Density is a ratio that is calculated by dividing the number of times your target keyword appears on the page by the number of total words on the page. For instance, if your keyword appears 10 times in a page with a total of 500 words, the keyword density is 2% (10/500=0.02).
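The calculation is trivial to script. This sketch (our own naive tokenizer, single-word keywords only) reproduces the 10-in-500 example:

```python
# Minimal keyword-density calculator matching the formula above:
# occurrences of the keyword divided by total words on the page.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# The chapter's example: 10 occurrences in 500 words = 2%
sample = ("rebar " + "filler " * 49) * 10   # 500 words, 10 of them "rebar"
print(round(keyword_density(sample, "rebar") * 100, 1))  # → 2.0
```

A real page would need its HTML stripped first; this sketch works on plain text.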

Although a perfect keyword density ratio was, in the past, an important ranking factor, today that is not the case. Our best advice regarding the ideal keyword density is to simply make it higher than any other word that appears on your page. If you're selling rebar, then the keyword rebar should have the highest keyword density ratio. This ensures the search engines will accurately determine the topic of the web page.

Remember to keep it natural. Your page content should make sense to humans when they read it. Avoid hammering any keyword too much and don't stress over trying to get the perfect keyword density ratio because there is no such thing as a perfect ratio anymore.

  • URL Structure

Using a simple URL that includes your targeted keywords will typically provide a slight to medium boost in ranking. Looking at our rebar example, we see the keyword is used in all five of the 'Harris Supply Solutions' URLs we found ranked in the top 10 as such:

  • http://www.harrissupplysolutions.com/steel-rebar.html
  • http://www.harrissupplysolutions.com/steel-rebar-sizes-stock.html
  • http://www.harrissupplysolutions.com/4-rebar.html
  • http://www.harrissupplysolutions.com/steel-rebar-supplier.html
  • http://www.harrissupplysolutions.com/6-rebar.html

Don't get carried away trying to stuff too many keywords into the URL. Remember that when the URL becomes too long, it's difficult to tell someone over the phone where to go. In addition, long URLs can be difficult to use in an email because the link tends to break at a hyphen if it wraps onto more than one line. So, with these considerations in mind, it's a good idea to include your keywords in the URL provided that you take the conservative approach.

  • Uniqueness of Content

Having unique content is critically important. That's because search engines tend to view duplicate content as a waste of their indexing resources and counter to what their searchers are looking for. They correctly reason that nobody wants to search for a red widget and find hundreds of red widget pages that are all alike. The engines want to provide searchers with a variety of unique pages. This might include red widget product, red widget specifications, red widget reviews, red widget discounts, red widget videos and so forth.

Of course if you're writing a blog and producing original content of your own, then uniqueness is easy. But if you're one of a thousand sites selling a name-brand product, then creating uniqueness is going to be more challenging.

In such cases, the key to overcoming the challenge is in the product descriptions. While it's true that many websites have hundreds of merchants listing the same products in their shopping carts, you will find that...

only pages with unique product descriptions will typically rank well

...while those that use the brand-name-suggested description are either filtered out or buried in the rankings. And, by the way, this applies to images as well.

So, if you are selling something that a lot of others are also selling then you must rewrite the product descriptions and rename the product image files so that your product pages and images are not filtered out of the search results as duplicate content.

In cases where you aren't allowed to change the product description, you can add content to make the page unique. For instance, some sites add user reviews and product demonstration videos. By enriching the manufacturer's content you can make your page unique and more deserving of a good ranking.

True, this requires a bit more work. But if you don't do it, then you can't expect to rank well because you're probably competing with the likes of Amazon and perhaps also the name brand company that manufactures the product.

  • Mobile Compatibility as a Ranking Factor

If there's one ranking dial that's trending up, it's Mobile Compatibility. It's becoming increasingly important that web pages display properly when viewed on ALL devices, and especially on mobile devices.

As far back as 2013 we began seeing a ranking preference given to sites that are fully mobile compatible. Google is committed to providing searchers a quality experience when using their smartphones and tablets. They've come out and stated that companies need to design their sites to be responsive to all devices if they want to be successful long term.

This has launched a trend toward what's known as Responsive Web Design — arguably the most fluid sector of SEO/SEM strategy right now.

Concurrent with the emphasis on mobile compatibility, is page-loading speed. Mobile compatibility and page-loading speed are elements of a website that Google is increasingly factoring into their ranking algorithm. Their thinking is that quality websites load fast and are responsive to display properly on all devices. If your site isn't responsive and fast loading then it's just not going to be able to compete online.

Google has gone so far as to change the way their algorithm works completely. They're calling it the Mobile-first index, which means that (as of early 2018) instead of evaluating how your site works on a desktop computer for rankings, they're evaluating your site from a mobile device perspective. If your site has a responsive design, then it's the same content on both desktop and mobile and you're fine. However, if you don't have a responsive design, or you have separate sites for mobile and desktop, then you could have some work to do. You can learn all the details on our resource:

Check the site using a smartphone such as an Android or iPhone: does it have at least minimal functionality? Test with 10" and 7" tablets as well, and note any problem areas you find. If you do not have access to these devices, at least test with a smartphone emulator such as "https://www.mobilephoneemulator.com" and/or "https://www.BrowserStack.com", or the Chrome DevTools mobile emulator (look for the smartphone icon at the top left of the DevTools panel).

Page speed is also such a hot topic that we've expounded on it considerably. We suggest carefully studying the following article-tutorials (requires SEN Membership):

  • Broken Videos and Faulty Redirects as Negative Ranking Factors

As the emphasis on Mobile continues to grow, Google is now penalizing sites with page elements that do not display properly or break when viewed on a mobile or tablet device. Most commonly, we're talking about Flash videos and faulty redirects.

As you may already know, Flash does not work on smartphones or tablets. And for that reason, Google "suggests" that you not use Flash on your website. And, since websites and pages that are 100% smartphone and tablet compatible gain ranking favor over those that aren't, it means that incompatibility translates to a ranking penalty.

The same is true if your redirects are faulty or if your site returns 404 Page Not Found errors. You can learn more about this by studying the aforementioned tutorials, The Practical Guide to Mobile SEO, Parts 1 & 2.

  • Spelling, Grammar and Readability (i.e. quality) issues as Negative Ranking Factors

Spelling and grammar usage are also factored into Google's ranking algorithm as quality signals. However, comments that are posted on a web page are not factored into the quality scores.

In addition there is evidence indicating that rankings are also affected by readability level. Since mathematical formulas like the Flesch-Kincaid test can analyze a document for this, there is every reason to believe that Google calculates a score based on the average number of syllables used per word and the number of words used per sentence. We already know that Google's Advanced Search provides the option to filter the search results by reading level, so we have hard evidence that Google has the capability to factor readability into their algorithm. The higher the readability level, the better, ranking-wise.
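For reference, the published Flesch-Kincaid grade-level formula is built from exactly those two ratios. A quick sketch follows; the counts are supplied by hand here, since a real tool would also need a syllable counter:

```python
# The published Flesch-Kincaid grade-level formula, driven by the two
# ratios discussed above: words per sentence and syllables per word.
def fk_grade(total_words, total_sentences, total_syllables):
    return (0.39 * (total_words / total_sentences)
            + 11.8 * (total_syllables / total_words)
            - 15.59)

# 100 words across 5 sentences with 150 syllables scores close to a
# 10th-grade reading level.
print(round(fk_grade(100, 5, 150), 2))  # → 9.91
```

Shorter sentences and simpler words drive the grade down; longer, denser prose drives it up.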

  • Page Freshness as a Ranking Factor

Earlier you learned about query deserves freshness (QDF) search results related to hot topics. Since the search engines love new content, it only stands to reason that newer (i.e., fresh) content will have a ranking advantage over older content.

Therefore it is always to your advantage to update your best pages as often as is practical based on the type of content you're presenting. The more up-to-date your web pages, the better you can expect them to rank. On the flip side, the reverse is true. You should do everything in your power to avoid having stale, out-of-date content on your site because that will definitely hurt your rankings.

Pay very close attention to dates. Copyright, articles, reviews, and product pages that reference dates can be a problem if they indicate anything other than the current year or recent months. Your credibility and your rankings will suffer if you're touting the best widget for 2017 if we're already in 2018. You get the idea.

  • Geolocation Signals

In most cases, especially for businesses that attract customers locally, it's important to include geolocation signals like your address and phone in addition to your company name. This is typically referred to as your NAP (name, address, phone).

Your NAP should be displayed in several locations on your site.

Be sure to make it consistent!

The search engines do NOT like multiple phone numbers or locations. It's confusing to their database and your rankings can suffer. If a location keyword is an element of your customers' searches, then be sure to also include it in your title tags. And, if your geographic location has a nickname or slang term, be sure to work that into your content as well. For instance, if you're a dentist in the upper peninsula of Michigan then you know the term Yooper refers to residents of the local region. As such, a search for Yooper dentist produces the following top result:

Yooper dentist search results listing

Notice how they've worked the slang term for the geographic location neatly into their content.

Take note that Local Search is a specialty within search. That's why we've dedicated an entire eBook to the topic. For an in-depth study of local search strategies, get a copy of our Local Search Marketing Book.

  • Spider Friendly Website Architecture

Your site's layout, aka its architecture, is important. While it's obvious you should make it easy for site visitors to navigate, it's critical that you make it easy for search engine spiders to find all of your pages as they crawl and index your site. You do not want obstacles such as Flash menus or dynamically generated web pages to prevent spiders from finding and following your links.

Spiders can find, follow, and interpret text links best. These are links with normal anchor text. Drop down links, created using CSS, are also easily found, followed and interpreted. If ever you're in doubt about this, simply disable CSS in your browser and take a look at the page. (Below you see the path for disabling CSS in the Firefox browser. Select No Style.)

How to disable CSS

Once you've disabled CSS, you should see the links as normal looking anchor text links when the page is viewed. If you can see them, then so can the search engine spiders and you're good to go.

Image-links are also easily found and followed. However, spiders can't always tell what the image-link is about unless there are clues like keyword.jpg type file names and a description added to the Alt image attribute.

Static Links vs. Dynamic Links - A static link looks like this:

http://www.yoursite.com/file.html

Of course the web page in that link is: file.html. Static links offer many advantages over dynamic links. For starters, they make more sense to humans and therefore are more likely to get clicked. They help eliminate the problem of duplicate content, which is good because search engines hate duplicate content. Static links display better in print and other media advertising. And static links typically get broken less often than dynamic links because they are shorter and less likely to contain a lot of hyphens.

A dynamic link is generated on the fly by using a database to name the page on request and as needed. The link might look something like this:

http://www.yoursite.com/s/ref=nb_sb_noss/183-1484986-3953124?url=search-alias%3Daps&field-keywords=keyword1%20keyword2

The web page in that link is: 183-1484986-3953124?url=search-alias%3Daps&field-keywords=keyword1%20keyword2. As you can see, the link is much longer and it contains a lot of hyphens and other characters that are likely to break the link when spread across two lines. It's nearly impossible to remember and difficult to convey over the phone. It wouldn't display well in print or any other form of advertising media, and it could become duplicate content if the same search generated the same page but with a different assigned "dynamic" serial number. You get the idea.

But the worst disadvantage to dynamic links is when a spider gets caught in a loop. This happens when the spider finds a product link, indexes the dynamic URL, and then finds another link to the same product and indexes a different dynamic URL even though it's the same product page but with a different serial number. And when this process happens again and again, the spider is said to be caught in a spider loop. This is bad in terms of getting your site properly indexed. Most spiders will leave your site to avoid such loops and therefore avoid indexing the rest of your site. Again, if your site isn't getting properly indexed then your web pages will NOT show up in the search results.

There are times, however, when dynamic links are desirable in terms of integrating product databases with web page display. But the good news is there are workarounds. Many Content Management Systems (CMS), WordPress for example, can be set up to display static-looking URLs that are actually dynamic. If your system demands dynamic URLs, then we suggest you look into making them as simple, unique and people-friendly as possible.

Your goal should be to design your site architecture so that spiders can find every page on your site by starting at the home page. That does not necessarily mean that your home page must link to all of your pages. It does mean that, by starting at the home page and following links to secondary pages, these secondary pages allow the spiders to eventually find all of your pages. And, once again, remember to avoid links that depend on Flash even though you may hear that spiders are getting "pretty good" at finding such links.

For more information on setting up the ideal site architecture, take a look at our in-depth tutorial:

  • XML sitemaps

XML (Extensible Markup Language) is a type of markup language where tags are created to share information. An XML Sitemap tells the search engines what content you want indexed. Theoretically the search engines should find all of your content by following links. But an XML sitemap can help speed up the process and reduce the chance of spiders missing some content that isn't easily indexed. This is especially true for getting content like images, videos, and product pages indexed.

Most experts agree that an XML Sitemap is essential for keeping the engines up-to-date with your website changes. It helps to ensure that all of your important content is indexed and provides supplemental information (metadata) about your content.
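For reference, a minimal XML Sitemap has the following shape; the URL, date, and values below are placeholders, and the schema is the standard sitemaps.org one:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/steel-rebar.html</loc>
    <!-- optional metadata hints for the search engines -->
    <lastmod>2021-12-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Each page you want indexed gets its own <url> entry; the optional tags are hints, not commands, and the engines may ignore them.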

By the way, you should not confuse a navigation site map with an XML Sitemap. The former is simply a page of links for your site visitors to use while navigating your site. The latter is a list-feed intended just for search engines and is not at all seen or used by site visitors.

Although XML sitemaps are not technically required, they are highly recommended. They provide useful metadata for the search engines and they're especially useful for content other than web pages. They're fairly easy to generate and there are plug-ins available for WordPress and other CMS systems to help you do this.

Additional Reading: The article-tutorial listed below is a few years old but the step-by-step process is still the same today. To learn how to create and feed your XML sitemaps, take a look at:

  • Robots.txt

One of the earliest names given to search engine spiders, crawlers, and bots was robots. Thus, the function of a robots.txt file is to tell spiders what to do with regard to crawling and indexing pages on your site. You might picture your robots.txt file as the tour guide to your site for the search engines. It provides a map that tells search engines where to find the content you want indexed. It also tells them to skip the content you don't want indexed. The end result is a faster and more complete indexing of your site.

If you do not have a robots.txt file, then the spiders will index everything. But regardless of whether this is what you want, we recommend that you have a robots.txt file anyway because the search engine spiders are looking for it.
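As a sketch, a simple robots.txt might look like this; the disallowed paths are placeholders for whatever content you don't want crawled:

```
# Applies to all spiders
User-agent: *

# Keep the spiders out of these (hypothetical) sections
Disallow: /cart/
Disallow: /admin/

# Point the spiders at your XML sitemap
Sitemap: http://www.yoursite.com/sitemap.xml
```

The file lives at the root of your domain (yoursite.com/robots.txt); an empty Disallow rule under `User-agent: *` means everything may be crawled.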

There is a lot you can do with a robots.txt file to improve the efficiency of getting your site indexed. We highly recommend that you study and bookmark the following tutorial so that when the time comes to implement the various functions of robots.txt you'll be able to easily create the perfect file that will give you the results you're looking for.

  • URL Redirection

Redirects, sometimes referred to as URL forwarding, make a web page available under more than one URL address. When a visitor requests a URL that's been redirected, a page with a different URL opens instead. For example, www.yourolddomain.com is redirected to www.yournewdomain.com.

Redirects can be used to forward incoming links to a correct new location whenever they're pointed at an outdated URL. Such links might be coming from external sites that are unaware of the URL change. They may also be coming from bookmarks that users have saved in their browsers. Sometimes they're used to tell search engines that a page has permanently moved.

There are two kinds of redirects that you need to know about.

  1. Browser based redirects
  2. Server side redirects

Browser based redirects have fallen out of favor with the search engines due to their frequent use in manipulating the search rankings. For that reason, they can often do more harm than good. That's why we recommend that, if you use them, you'd better know what you're doing. Otherwise, you should avoid browser based redirects if your site is dependent on good rankings.

Server side redirects are safer and necessary to use in specific instances, like when a URL has moved. The two most common redirects are the 301 redirect and the 302 redirect. Both of these are highly useful. We recommend that you study the tutorial below in order to gain a full working knowledge of how these valuable webmaster tools can be safely applied. Consider it essential reading.
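As a sketch, on an Apache server each of these server-side redirects can be set up with one line in your .htaccess file; the paths and domain are placeholders:

```
# Permanent (301) redirect: the page has moved for good, and the
# search engines should transfer its ranking credit to the new URL.
Redirect 301 /old-page.html http://www.yoursite.com/new-page.html

# Temporary (302) redirect: the page will return, so the engines
# should keep the original URL indexed.
Redirect 302 /sale.html http://www.yoursite.com/holding-page.html
```

Which status code you choose matters: use 301 only when the move is truly permanent.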

  • Duplicate Content

As previously mentioned, search engines hate duplicate content. Their thinking is that it wastes their resources and provides a bad user experience. That's why they tend to filter out and sometimes penalize sites that clog their index with duplicate content.

The biggest offenders are product pages that all carry the same product description. Google doesn't care where you buy the product. They only care that you aren't served the same product page coming from multiple websites. So they look for the content originator, tend to favor the name-brand company that produces the product or else prominently rank a large site like Amazon.com. The rest of the pages selling the same item tend to get filtered out of the rankings.

We also mentioned how duplicate content issues might arise whenever a site uses a dynamic database system to create product pages on the fly. This is to be avoided as well.

And, of course, any other content that duplicates what is already on another site should also be avoided. The bottom line is that Google is looking for original content. Anything that isn't original reflects badly on the overall site quality. So, you should see to it that your site contains only original content and not something that can be found elsewhere.

  • Canonical URL

Your Canonical URL is your preferred URL. This is relevant because http://www.yourdomain.com and http://yourdomain.com are NOT the same URL even though they both land site visitors on your home page.

As you can see, the first includes the www, the second does not. The fact that these two URLs are not the same is important because it means that some websites might link to one and some might link to the other. In such cases your PageRank (see next chapter) is divided instead of combined. This will hurt your rankings.

Furthermore, Google sees two different URLs with the same content. This creates a potential duplicate content problem. They can't know which URL you want indexed and this puts your site at a disadvantage in the rankings.

The solution is to choose one "Canonical" URL over the other. It doesn't matter which one you choose. Google doesn't care whether you use the www or not. But you must consistently choose one over the other. And then simply redirect the traffic from the one you're not using to the one you've chosen to use. In addition, you must see to it that your incoming links are pointed to your Canonical (i.e., preferred version) URL.
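As a sketch, on an Apache server the non-www to www redirect can be handled in .htaccess like this; yourdomain.com is a placeholder, and if you prefer the non-www version, simply reverse the condition and target:

```
# Send all bare-domain traffic to the www (canonical) version
# with a permanent 301 redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
```

Once this is in place, both visitors and spiders arriving at either version end up on the single URL you've chosen, so your incoming link credit is consolidated instead of divided.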