Updated January 1st, 2022 — version 308

The UnFair Advantage Book
Winning the Search Engine Wars

Chapter One -
Search Engine Strategies - A Brief History

Everyone knows a Search Engine is the vehicle people use to find things on the Internet. But many are oblivious to the hyper-competitive behind-the-scenes strategies used to secure the highly coveted top-ranking positions.

Back in 1996, when the first version of this book was originally written 300 updates ago, the leading search engines included the all-but-forgotten likes of WebCrawler, AltaVista, Infoseek, Excite, Open Text, Lycos, Inktomi, and the directory site, Yahoo.

[Image: Universe of Search Engines]

Those engines all responded to search queries with results based solely on matching keywords on web pages to keywords (aka, search words) being used in the search queries. The search engine strategy was very simple back in those days. The top search results consisted of whichever pages contained the most keywords that matched the search query. Due to this fact, online marketers began "stuffing" extraordinary numbers of targeted keywords into web pages for the sole purpose of manipulating the search rankings. They went so far as to design entire web pages specifically to rank well for each of their targeted keywords. In many cases, this meant flooding the search engine indexes with hundreds, or even thousands, of superfluous web pages in order to dominate the rankings. And that is how the arms race to 'Winning The Search Engine Wars' began.
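The keyword-count ranking described above can be sketched in a few lines of Python. This is an illustrative toy, not any engine's actual code; the function name and scoring are my own assumptions:

```python
def naive_rank(pages, query):
    """Rank pages by how many query keywords appear in them,
    mimicking the mid-1990s keyword-matching engines (toy example)."""
    query_words = query.lower().split()

    def score(text):
        words = text.lower().split()
        # More matching keywords = higher rank; nothing else matters.
        return sum(words.count(q) for q in query_words)

    return sorted(pages, key=score, reverse=True)
```

Under a scheme this crude, a page that repeats the target keyword a thousand times beats every honest page, which is exactly why "stuffing" worked.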

In those early days, whoever knew how to "stuff" the right mix of keywords into a web page literally gained an unfair advantage! And, frankly, it was pretty easy to quickly score a whole bunch of top-ranking pages on most any search engine under any topic. You see, back then the search engine formulas, or algorithms, for responding to a search query were pretty basic. And the data compiler programs — aka, robots, bots, spiders, and crawlers — that "crawled" the web simply indexed whatever they saw wherever they found it. Neither the bots nor the algorithms passed judgment on the "quality" of the site. Nor did they evaluate the trustworthiness of the brand, the credibility of the links, the quality or originality of the content, the popularity of the page, the reputation of the site, or its speed. While it's true today that ALL of these elements and more are factored into the ranking algorithm, in the early days of the commercial web, all that mattered were keywords.

For instance, if you were an online seafood outlet that sold soft-shelled crabs via mail order, then you would stuff the keyphrase soft shelled crabs into as many places on your web page as mattered. You would place the keyphrase multiple times in the web page Title and Meta tags, the header, the headlines, and the body copy. You would also repeat it a hundred times or more in the footer of the page, in a minuscule white font that matched the background, rendering these keywords invisible to site visitors even though the spiders would "see" them in the source code and therefore index them. From 1995 through 1997, this was the number one strategy for scoring pages at the top of the search engines.

But today's search engines know all of the tricks. So, it's not only a waste of time to use them; it's counter-productive, because your site will be penalized in the rankings if you get caught — and you will get caught!

Regardless, to gain the insight necessary to build today's top-ranking websites, it helps considerably to know the basic history of the arms race for top-ranking pages. As one would expect, the engines eventually countered the keyword stuffing strategy by programming their algorithms to recognize it as search engine spam. And from that time forward the arms race to manipulate the engines has been repeatedly stymied by the search engines' all-out-war on whatever they consider to be search engine spam.

Eventually, getting caught spamming the engines became highly detrimental to a website's search rankings. Penalties now range from mild to wild. Mild might equate to being dropped a hundred pages back in the rankings until you've admitted the error of your ways, corrected the problem and promised in writing never to do it again. Wild would be something like never being able to gain traction in the rankings again due to flagrant and repeated violation of the rules. Neither is good and both tend to cost the website owners dearly due to significant losses of traffic.

Spamming the Engines — a Moving Target

To combat the strategy of keyword stuffing, the engines switched to an algorithm based on keyword positioning and keyword density (the number of keywords relative to the total number of words on a page). When online marketing experts got ahold of software to crack that formula, the engines countered by adding link popularity to the algorithm. At that point, the ranking formula combined strategically placed keywords at a particular density on the page with a heavy weighting toward the number of external, off-site links that pointed at the page using keyword-rich anchor text.
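Keyword density, as defined above, is simple arithmetic: occurrences of the target phrase divided by the total word count. A minimal sketch in Python (the function name and phrase-matching details are illustrative assumptions, not any engine's actual formula):

```python
def keyword_density(text, phrase):
    """Fraction of the page's words accounted for by the target phrase,
    approximating the density metric described above (toy example)."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count non-overlapping-style exact matches of the phrase in sequence.
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    return (hits * n) / len(words)
```

A density cutoff like this is why marketers began tuning pages to a "specific number of keywords" rather than stuffing them without limit.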

Anchor text: the words that appear in a clickable link.

Their thinking was that it would be difficult-to-impossible to fake keyword-specific incoming links from external web pages that were (supposedly) under the control of a separate entity.

But, no surprise, online marketers are a creative and persistent lot. They quickly figured out all kinds of fabricated link systems designed to manipulate the search algorithms and score top rankings. They created link exchanges and links pages — pages that were nothing more than a collection of links. And, for a while, these so-called "link farms" were tolerated by the engines until all of a sudden, they weren't. Then the effort shifted to covertly buying links. Link brokers sprang up for a few years and were successful and even profitable for a while until the engines dropped the hammer by wiping the offending sites all-at-once from their index.

During these years, the engines began reclassifying mainstay and widely accepted search engine strategies like reciprocal link exchanges as artificial link structures and then, later, as spam. So, reciprocal links, link exchanges, link farms, buying links, brokering links, and keyword-laden anchor-text links — strategies that were at one time widely used and mostly acceptable — all got tagged as "link schemes" and relegated to the list of forbidden strategies. The strategies themselves were derided as "black hat" (as opposed to the Google-endorsed "white hat" strategies) in an effort to polarize online marketers into either good-guy or bad-guy categories. And then Google set out to teach lessons to the nasty "spammers" by severely penalizing their websites in the rankings or, in some cases, completely kicking them out of its index.

It's important to note that, by this time (circa 2005), ALL of the original search engines (except Yahoo) had been rendered irrelevant by the overwhelming popularity of Google — which didn't even exist back in 1996 when the search engine wars began heating up. In fact, it wasn't until 1999 that the fledgling Google first appeared as an almost-unheard-of panelist at the Search Engine Strategies Conference in San Francisco, sharing the stage with all of the aforementioned "leading" search engines. I was there when Google co-founder Sergey Brin proudly exclaimed to the attendees that 'Google doesn't worry about spam; you can't spam Google.' But online marketers did indeed figure out ways to spam Google. And a few years later, once Google had acquired the majority market share and crowded out the competition, the company changed its mind. Now the message is: you had better not even think about spamming Google. And they really, really mean it!

And now that Google is, effectively, the only search engine, it has become the tail that wags the dog. Google dictates (via recommendations and suggestions) almost everything a website can and cannot do — all the way down to the design of the website itself and, especially, how fast it should be. Sure, there's Microsoft's Bing, and the privacy-focused DuckDuckGo is coming on strong. There are also the social media sites Facebook, Twitter, Pinterest, and Instagram, along with review sites like Yelp and the encyclopedic Wikipedia, all of which influence rankings. These other sources of search traffic DEFINITELY DO MATTER — but none of them matters nearly as much as Google, and if you defy Google, you probably aren't going to do very well in any of them either. So, search strategies today center on strategic compliance with Google's terms of service and Webmaster 'best practices' guidelines.

To state it simply, today's search engine strategies focus on constantly adapting your site to comply with whatever Google currently thinks is a "good" website. In other words, if Google likes you, then all of the others are likely to like you too, and your online web presence will flourish. But if you fly outside of Google's "recommendations," which you'll find published in their quality guidelines, then you probably won't do very well ranking-wise anywhere. And if you piss Google off, it's safe to expect that your website's listing will be buried deeper than page 100 in the search results and your efforts to "fix" the problem will be ignored. So pay close attention to the following chapters, because your primary goal is to keep your website in Google's good graces! And remember, even the basics can be a constantly moving target, as Google keeps raising the bar by honing its requirements and refining its suggestions.