Before we actually get started on the history of SEO, let’s have a look at some chronological milestones. Just to put things into perspective.
The Earth formed roughly 4.5 billion years ago.
The first signs of life date back 3.8 billion years.
Life has seen five major extinction events so far.
“The Great Leap Forward” for humanity happened 80,000 years ago.
The Industrial Revolution happened roughly two and a half centuries ago.
The Digital Revolution took place less than 60 years ago.
The Internet appeared roughly 35 years ago.
The first search engine – Archie – was created in 1990.
The history of SEO starts roughly in 1991, some say.
Almost no other term has caught so much traction and importance.
You can see and hear SEO everywhere. Blog posts, videos, podcasts, infographics – you name it.
That’s why it’s a great idea to invest in tools that help you stay on top of it. Try out the Squirrly SEO plugin to stay on top of all the SEO updates throughout history.
In the following lines, you won’t find another set of tips & tricks to master the SEO niche. However, the information provided will help you better understand the industry.
Let’s have a look at how SEO changed over the years.
You can’t say you know the history of SEO without knowing that the first website was launched in 1991. It’s still live today! Granted, that’s not when the idea was born, but it’s strongly related to that moment.
You can’t have SEO without SE – search engines.
You can’t have search engines without websites.
However, Archie was the first search engine. It was started on September 10, 1990. You’re probably guessing that it was as primitive as possible, and you’re right. Archie was a tool for indexing FTP archives, allowing people to find specific files.
Indeed, it’s weird that the first search engine[-ish] was created before the first website. Some consider it a pre-web search engine.
But as more and more websites flooded the World Wide Web, things had to change.
Mosaic was the first web browser widely available to people outside CERN. Some say it is the tool that popularized the World Wide Web. Its official launch took place at the beginning of 1993.
And it makes sense. The number of websites launched after Mosaic is impressive for that time. And this move has created an additional need.
When you have large quantities of information available, you need a way to index it. This index then needs to have search options. The easier it is for people to find information, the more they will search for it.
Thus, the big search engines were born.
February ’93 saw the birth of Architext. It was developed by six Stanford students and it later became the search engine Excite.
The same year witnessed the launch of five more search engines. Some of them worked differently than others, but they were all designed with the same purpose in mind.
Yet, Excite is believed to be the true revolutionary. The search engine sorted results based on keywords found within content and on backend optimization.
In the following year, 1994, more search engines were introduced:
- AltaVista
- Lycos
Slowly, more people and institutions got access to the Internet.
As it grew more and more popular, a shift occurred.
The first search engines weren’t backed by complex algorithms. Artificial intelligence and multiple factors did not determine the search results. Most of them just matched the words from a user query.
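To make that concrete, here is a minimal, purely hypothetical sketch of word-matching search – the function name and example documents are invented for illustration, and no real engine’s code looked like this:

```python
def naive_search(query, documents):
    """Rank page names by how many times the query words appear in them."""
    words = query.lower().split()
    scores = {}
    for name, text in documents.items():
        tokens = text.lower().split()
        # Score = raw count of query-word occurrences, nothing more
        scores[name] = sum(tokens.count(w) for w in words)
    return sorted(scores, key=scores.get, reverse=True)

docs = {
    "stuffed": "cheap flights cheap flights cheap flights",
    "honest": "a practical guide to booking flights",
}
results = naive_search("cheap flights", docs)  # "stuffed" ranks first
```

Note how a page that simply repeats the query words wins. That is exactly the loophole website owners were about to exploit.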
You may see where this is going. Website owners realized this pretty fast. They came to the conclusion that search results could be manipulated.
It was all about the money.
This way of working resulted in three things:
- Keyword stuffing became a thing. Repeating keywords over and over again brought in solid amounts of traffic, which was then used for advertising opportunities.
- Spammy backlinks appeared. These were used to increase a website’s authority, and there was no way to check for quality.
- Excessive tagging became popular. Website owners stuffed every keyword they wanted to rank for into their meta tags.
There were no guidelines or ranking criteria at the time. To make things even worse, search engines changing their algorithms to punish black hat SEO wasn’t enough. Other techniques were already in place, and the updates didn’t address them.
But then two brilliant young minds from Stanford came along.
Larry Page and Sergey Brin first addressed the issues they wanted to fix in 1998. They published a paper at Stanford where they wrote:
...the predominant business model for commercial search engines is advertising. The goals of the advertising business model do not always correspond to providing quality search to users.
The same paper introduced a term that revolutionized the industry. It made the first steps towards real SEO: PageRank.
The technology was a way to filter search results based on quality, not just keywords alone.
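The core idea – a page matters if pages that matter link to it – can be sketched with the classic iterative formula. This is a toy illustration under assumed names and values (the function, damping factor and example graph are all invented), not Google’s actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: links maps each page to the pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                # Each page passes its rank to the pages it links to
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(graph)
```

In this tiny graph, "a" ends up ranked highest because both other pages link to it. The ranks always sum to 1, so they behave like a probability of where a random surfer ends up.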
It took the pair a while longer to make it happen, but the concept came to life.
At first, Google wasn’t available to the public. It wasn’t even called Google – but BackRub. It ran on Stanford’s servers, and the search engine was accessible locally at google.stanford.edu.
Here’s the timeline of their beginnings:
September 15, 1997: Google.com was registered.
1998: The company was born in Menlo Park, California.
It received $100,000 from Andy Bechtolsheim, the co-founder of Sun Microsystems.
They slowly started implementing what they wrote in the white paper mentioned before.
Early 2000s: Google started providing guidelines for website owners who wanted to shift from black hat to white hat SEO.
For the following two years, nobody took these rules seriously.
Nobody wanted to be a part of the good guys’ team. Today, though, it’s easier than ever to join it: with the Squirrly SEO plugin, you can make your site SEO-friendly and keep it that way.
This happened partially because PageRank was ranking pages based on inbound links. The more, the merrier. Back then, there was no way to measure the authenticity of those links.
So publishers stuck with black hat SEO since it was giving them solid results.
However, things have changed for the better. Here’s an interview where Google’s founders talk about how the search engine worked back then:
It's all about getting information to more people.
Then, September 2002 came around. And the algorithm updates started rolling out. The first one was called “Boston”.
In April 2003, “Cassandra” addressed some of the issues of backlink quality. It also threw a nice punch at hidden text and hidden links.
The following months saw one Google update after another being rolled out, changing the way the search engine – which was already getting massive – determined rankings.
“Dominic”, “Esmeralda” and “Fritz” only addressed small issues – some of which are still unclear. These updates focused mostly on backlink quality, and many websites saw their rankings bounce.
However, for some website owners, the “Florida” update felt like a car accident happening in their own backyard. Massive penalties came along with it, targeted at black hat SEO practices.
Many websites lost ranks dramatically!
This made it clear that Google was no longer going to play nice. You had to follow the rules if you wanted to rank high.
Some believe this is the birth point of the entire SEO industry. Granted, there was far less complexity than what we know today.
Google came under heavy attack after this update.
For each page that lost its rank, another rose to take the place. But that didn’t matter.
The ones that were penalized were determined to get “justice”. They even accused Google of favoring paid advertising clients over non-AdWords users.
Google’s “Florida” update was also accused of being an evil tactic to boost advertising sales.
But the website owners didn’t see “Austin” coming. Released in January 2004, this update came to address what “Florida” missed.
Another round of accusations emerged. Google was even accused of being controlled by the CIA.
As you may imagine, that wasn’t the case. Google was trying to provide more relevant results to people. Its algorithm updates weren’t targeted at the best-optimized sites, but the most relevant ones.
Penalties were applied to:
- those who relied on “on-page” SEO;
- those who exchanged links with off-topic sites to rank well;
- location-specific sites.
Google continued to put a lot of effort into improving the relevancy of search results. That meant it needed additional indexing capabilities and algorithms.
This meant the introduction of three new terms, among them Latent Semantic Indexing (LSI).
They would change the history of SEO and make the first steps towards modern days.
As you may have noticed, Google slowly started its domination, eclipsing all the other search engines’ authority. By 2005, it was already processing hundreds of millions of search queries every day.
Almost all websites had to start playing by the rules. The ones that refused had to suffer.
This became even more obvious in January 2005, when some of the biggest search engines united their efforts to introduce the nofollow attribute.
The change had one simple objective: decreasing the number of spammy links and comments – especially on blogs. Adding rel="nofollow" to a link tells search engines not to count it as an endorsement of the linked page.
2005 brought other changes as well, directly from Google.
A new algorithm used each user’s search and browsing history to personalize search results and make them more relevant. This was an important milestone in the history of SEO, and it once again paved the road to the modern age.
2005 also saw the launch of Google Analytics. Advertised as 'Sophisticated. Easy. Free', the tool is still widely used today to determine key metrics.
'Big Daddy' update
The 'Big Daddy' update took several months to complete. It changed the way Google handled URL canonicalization, 301/302 redirects and other technical aspects.
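To give a feel for what URL canonicalization means, here is a hedged sketch that collapses a few common duplicate-URL variants into one canonical form. The specific rules shown are illustrative assumptions, not Google’s actual logic:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse common duplicate-URL variants into one canonical form."""
    scheme, netloc, path, query, _ = urlsplit(url)  # drop the #fragment
    netloc = netloc.lower()          # hostnames are case-insensitive
    if netloc.startswith("www."):    # treat www and bare domain as one site
        netloc = netloc[4:]
    if path in ("", "/index.html"):  # homepage variants map to "/"
        path = "/"
    return urlunsplit((scheme, netloc, path, query, ""))
```

With rules like these, `http://WWW.Example.com/index.html` and `http://example.com` resolve to the same page, so a search engine can merge their link signals instead of splitting them across duplicates.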
In 2009, Microsoft introduced what they hoped would be Google’s killer: Bing. They advertised it as producing noticeably better results than the competition.
However, that didn’t prove to be quite true. Experts observed a tendency to favor:
- keywords in URLs,
- capitalized words,
- and “pages from large websites”.
The same year brought a preview of the “Caffeine” update.
Google asked users for help testing some features. Even though the update wasn’t released for another full year, it did bring one more important aspect to SEO.
At the end of 2009, Google introduced a tweak for its algorithm that changed the meaning of SEO. Search results now included tweets and breaking news.
This meant that SEO was no longer just for webmasters. Journalists had to start learning how to optimize their content as well.
Here’s Google’s head of webspam talking about the “Caffeine” update in 2009:
As you can see in the interview, things were already pretty advanced. All that keyword stuffing and black hat SEO that was happening in the old days was gone.
The bad part was that webmasters already had years of experience in the field, while the rest of us didn’t. The good thing is that today there’s the Squirrly SEO plugin, which can help non-SEO experts become SEO superstars.
Now, as you may expect, the changes kept on rolling. From here on out, we can talk about the modern era. Google has reached a certain maturity and nobody can trick it anymore.
For some of you, typing something into the Google search box and waiting for a suggestion might be second nature. But not all of us could always enjoy the benefit of suggestions.
The change came in 2010, and its given name was Google Instant. The technology made many SEOs angry once it rolled out. But then they noticed... nothing.
Rankings weren’t affected at all.
Social media posts started being indexed. Facebook posts started appearing in search results. Certain Twitter profiles got their own PageRank.
Here’s more about that period of 2010:
From here on out, cutely named animals would slowly strike terror into SEOs.
The “Panda” update, released in 2011, sought to punish websites that relied on a form of non-white-hat SEO to boost their rankings:
- Irrelevant backlinks from .edu domains
- Websites with a high ad-to-content ratio
- Content farms – websites with a lot of low-quality content, constantly updated to trick the algorithm
Up until 2015, “Panda” had many tweaks and updates.
Then along came the “Penguin” update, in 2012, addressing issues the previous update didn’t. Specifically, it targeted websites with informative content that was sprinkled with unrelated, spammy hyperlinks.
One more notable feature was implemented in 2012: the Google Knowledge Graph. The system understands facts about people, places and things, and how these entities are all connected.
You’ve seen the boxes that give you straight answers without sending you to a website.
Then, in 2013, we saw more of Google’s efforts to keep spammy results away. That year’s update targeted queries for payday loans and porn, for example.
This is the time when content became king. Google had in place so many algorithms to punish black hat SEO, that the only way to success was proper white hat SEO.
Another cute animal – a “Pigeon” – shook the SEO scene in 2014. It was a massive update targeted at local SEO.
It actually empowered the bond between the local search algorithm and the web search one. Thus, local results were receiving more web based signals. Here’s what Search Engine Land wrote about it at the time:
Google told us that the new local search algorithm ties deeper into their web search capabilities, including the hundreds of ranking signals they use in web search along with search features such as Knowledge Graph, spelling correction, synonyms and more.
However, this particular update affected only the U.S.
The United Kingdom, Canada and Australia were targeted later on, at the end of 2014.
22nd of April, 2015. Google released a new mobile-friendly ranking algorithm. Its purpose was simple: mobile-friendly pages received a ranking boost in Google’s mobile search results.
Maybe that was the most exciting thing about Google in 2015 – or maybe not, because 2015 also brought RankBrain, one of the most exciting developments to date. Not exactly exciting in the history of SEO itself, but in terms of progress and thinking for the future. Yes, it is mostly about Google.
RankBrain is complex. But to break it down into simple facts, you basically need to know the following:
- It uses artificial intelligence to filter results
- In tests, that artificial intelligence beats Google’s experts at page selection
That means it’s smart and it has Google’s data that enables it to keep on learning.
2016, like the previous few years, was exciting in terms of SEO and updates.
We saw an AdWords shakeup at the beginning of the year. The changes were sweet, and most advertisers were happy with the new features and extra copy space.
May brought another update to the mobile-friendly algorithm that had “given” us “Mobilegeddon”. The impact was low, since only a small number of websites don’t support mobile.
Most web owners already know that, in order to get people to their site, they need to adapt their content – even to the device visitors are using. Thankfully, this wasn’t a lesson they had to learn from Google.
Then a “Possum” scared SEOs a bit. Implemented at the very beginning of fall, it filtered many Google My Business listings.
Search Engine Land notes that the main purpose of the update was to diversify local results and prevent spam from ranking. It was one of the biggest updates impacting local SEO since “Pigeon”.
And the last update we’ll note in this article: “Penguin” 4.0. It was implemented a few weeks after the previous one.
It’s a filter designed to capture sites that spam Google’s search results in ways Google’s regular spam-detection systems might not catch.
We can only imagine what the future will bring. There are some speculations, but you don’t need to guess to reach a conclusion.
One thing you can be sure of is that Squirrly will keep you updated. The Squirrly SEO plugin is getting better every day, and we like to make sure our clients are up to date with their SEO.
Just look at each update since 2010 – absolutely every algorithm change focused on one aspect: removing low-quality, spammy, black hat SEO content from the SERPs.
The phrase “Content is King” should be adapted to “Quality Content is King”. That’s what all of us will have to keep on doing.
Of course, without social proof and authentic backlinks, you won’t be able to reach the top 3 in organic results.
However, Google has confirmed which SEO factors are the most important.
Granted, each of them has many sub-factors and categories. Here’s the AMA where Google confirmed this aspect:
If you’ve made it through over 25 years of digital updates, shifts and changes, give yourself a pat on the back. I also thank you for reading this huge compilation of important facts about the history of SEO.
The good thing is that now you know. The sole purpose of this article was to outline the need for quality content. That’s the most important thing you have to master to be future-proof.
What’s next? If you’ve really enjoyed this piece, I encourage you to join Squirrly’s Education Cloud.
It’s quite a big club, with over 49,000 members – everyone from managers and small business owners to marketers. It’s also free, in case you didn’t know. Go ahead and register.
Let me know in the comments section below if you feel like there’s anything else worth mentioning.