You’ve probably heard a lot of talk in recent months about animals and paranormal beings affecting WordPress SEO, or even the web in general.
Let me start by clarifying that these are not the fluffy, cute (or, as the case may be, spooky) beings you’re probably thinking of. They’re actually algorithms Google implemented in its search engine to provide more organic, quality results.
Before getting our hands dirty
Let’s have a look at how a search engine works. I’ve met numerous people from all walks of life, some even mildly tech-savvy, who lived by the misconception that when you search for something, you’re searching the internet. What you’re actually searching is the search engine’s index of the internet.
You’ve most likely heard about spiders (no, I’m not referring to the 8-legged creatures of Tartarus who infest your nightmares and feed on your fear and anguish). They’re bots deployed by Google to constantly scan the internet. How they work is pretty straightforward. They start by fetching a few web pages back to the server for indexing. Afterwards they follow the links on those pages and fetch the pages they point to as well, and so on, and so forth.
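That fetch-then-follow-links loop is essentially a breadth-first crawl. Here is a minimal sketch of the idea (not Google's actual spider, of course; the `fetch` callback and the in-memory `index` dict are stand-ins for a real fetcher and a real search index):

```python
from collections import deque
from urllib.parse import urljoin
import re

def crawl(seed_urls, fetch, max_pages=100):
    """Breadth-first crawl: fetch each page, index it, follow its links.
    `fetch(url)` returns the page's HTML (or raises on failure)."""
    queue = deque(seed_urls)
    seen = set(seed_urls)
    index = {}  # url -> raw HTML; a toy stand-in for the real search index
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = fetch(url)
        except Exception:
            continue  # unreachable pages are simply skipped
        index[url] = html  # "fetch the page back to the server for indexing"
        # Follow every link on the page and queue the pages it points to.
        for href in re.findall(r'href="([^"]+)"', html):
            link = urljoin(url, href)
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index
```

Passing `fetch` in as a callback keeps the sketch testable without network access; a real spider would also respect robots.txt, rate limits and a far more robust HTML parser.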
After the server finishes the indexing process, those pages are available for search. But there’s more than one page containing the information you are looking for, and by more than one I mean millions of pages with that specific keyword. For you to get the most relevant results, Google applies a series of filters and conditions to every webpage it indexes in order to rank it. According to Google, analyzing a page takes more than 200 different aspects into consideration, each one of them affecting the final rank of the page.
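Conceptually, that ranking step boils down to combining many per-page signals into a single score and sorting by it. A toy illustration, with entirely made-up signal names and weights (Google’s real ~200 signals and their weights are undisclosed):

```python
# Hypothetical signal names and weights, purely for illustration;
# Google's actual ranking signals are not public.
WEIGHTS = {"keyword_relevance": 0.5, "backlink_quality": 0.3, "freshness": 0.2}

def rank(pages):
    """Order pages by a weighted sum of their signals, best first."""
    def score(signals):
        return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
    return sorted(pages, key=lambda p: score(p["signals"]), reverse=True)
```

The real system is vastly more complex (and the weights themselves are learned and constantly revised), but the shape of the problem is the same: many noisy signals in, one ordering out.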
Seems pretty straightforward
It’s quite simple then, no? You’ve got a list of ~200 objectives to accomplish, and you’re surely going to hit the front page before you know it.
What’s the point of WordPress SEO then? Well, for a searcher to get organic, quality results, Google has to resort to black magic (complex algorithms).
Google has a fine taste for quality. It’s something they’ve perpetuated throughout their search software. Panda, Penguin and Phantom are 3 of the more popular algorithms implemented in the last 2 years, and their purpose, besides causing many sleepless nights, was to reward quality and originality while penalizing poor, thin, low-quality content.
The Panda that actually went to Kung Fu school
Google’s Panda algorithm was first introduced in February 2011. The same year, Kung Fu Panda 2, the animation that brought the 2 great Jacks together, Jackie Chan and Jack Black, was released. Coincidence? I think not!
Tin foil hats aside, Panda hit the web big time. And by big I mean it affected the rankings of almost 12% of all search results. That’s 12% of the 45 billion pages Google has indexed. Regardless of whether your WordPress SEO back then was White Hat, Black Hat or even Stylish Fedora SEO (disclaimer: Stylish Fedora SEO is not a commonly accepted search engine optimization technique), you were probably affected by it. Some got hit hard while others were rewarded. News websites and social media saw an increase in rankings, while sites with a lot of advertising and a lack of quality content experienced huge decreases. Reports from around the web state that some sites were hit by a staggering 70% traffic drop overnight. That’s about the same percentage of the Earth’s surface that is covered by water. You wouldn’t be very happy either if you woke up one morning and all the water was gone.
The reason it hit so hard was that Panda was built around new and improved artificial intelligence technology, making it more scalable and sophisticated than ever before. It was used in conjunction with human quality testers who rated tens of thousands of websites based on design, speed, trustworthiness, quality of content and other undisclosed factors. The algorithm was then used to find similarities between what people rated as high quality and low quality.
Matt Cutts, the head of Google’s webspam team, explained the reasoning behind Panda in one of his blog posts: “This update is designed to reduce rankings for low quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
Panda’s quest is a noble one. As it should be. And although it has gone, and is still going, through a lot of updates and revisions (introducing new factors, changing the priority of old ones), it still produces false positives on certain types of websites.
Below is a list of 10 of the more common factors that make your website vulnerable to Panda:
• A low amount of original content on a page or site
• High bounce rate (percentage of visitors that land on your website and leave before visiting other pages from the one they landed on)
• Low percentage of returning visitors
• Low amount of links to a page from your website in Social Media and on other quality sites
• A high number of pages with a low amount of original content
• A high percentage of duplicate content
• Unnatural language on a page, over-optimization of SEO, keyword spam
• Spelling, stylistic, or factual errors
• A high amount of ads, especially ads that do not match the keywords the page has been optimized for
• A dirty sitemap
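The bounce rate mentioned above is simple arithmetic: the share of sessions that viewed exactly one page. A minimal sketch (the session representation, a list of pages-viewed counts, is my own simplification; analytics tools typically also factor in time on page):

```python
def bounce_rate(sessions):
    """Percentage of sessions that left after viewing a single page.
    `sessions` is a list of pages-viewed counts, one entry per visit."""
    bounces = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return 100.0 * bounces / len(sessions)
```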
These factors are mostly speculative, deduced from what Google has released about its algorithms and from observations of how sites ranked before and after Panda updates. Google doesn’t disclose the actual ranking signals used in Panda or any of its other algorithms, in order to promote quality content and fair play.
What’s more, it only takes a few low-quality or duplicate-content pages to get the entire website penalized.
The web was hit so hard because it was broken. What Panda did was improve the quality of the internet. It yet again shifted the way WordPress SEO had to work to be effective, making it cleaner and more inclined towards originality and quality.
A wild Penguin appears
The Google Penguin algorithm was first announced on the 24th of April 2012. That’s 2 months before Nintendo’s Pokémon Black 2 and White 2 were released in Japan. You know what’s black and white? A penguin! Coincidence? Or the one truly secret society Dan Brown didn’t write about? I for one can’t wait for my new Google “Gotta catch ’em all” home screen.
When initially released, Penguin 1.0 targeted sites that displayed low-quality backlinks, anchor text over-optimized for one single term, and shady link profiles. Basically, if you were using black hat SEO at the time of release, you got hit. And if you weren’t, you might have been hit as well, but to a lesser extent. Around 3% of English search queries were affected by Penguin.
Penguin was yet another statement that Google wants only the highest quality posts at the top of its search results. And it wants everyone to have a fair fighting ground. It was, even more, an attempt to shift the mentality further away from “ranking is why my website has traffic” towards “people visit my website, and actually come back, because of the quality of my content”.
But Penguin is just an algorithm. At the end of the day it’s all just a bunch of instructions and checkboxes. Below is a list of the most common factors Penguin takes into account before deciding to penalize your website:
• Keyword stuffing: having the keyword in meta-tags or hidden behind images, basically optimizing your article using “invisible keywords”
• Duplicate content: Goes against Google’s dogma of delivering original content.
• Cloaking: The content presented to Google’s spider bots is different from the actual content of the page.
• Low-Quality backlinks: This is something perpetuated from the Panda era.
Most of the above are technical factors, shady technical factors, which is what Penguin was developed to focus on. It also penalized results for keywords that were the same as the domain name, something we took into account in Squirrly’s LIVE SEO assistant.
Googly the friendly ectoplasm
May 8 2013. It was a Wednesday. It was a stormy day in Waynoka, Oklahoma. There was chatter of a tornado threat. In London it was your typical chilly, partially-clouded day. It was the day Alex Ferguson announced his retirement as the manager of Manchester United.
You probably didn’t know that. Just like you didn’t know that Google released a Phantom update on the same day. It was called a Phantom update because no one knew what it was. And the name stood the test of time, because to this day no one knows whether it was a localized Panda 2.0 release or a standalone algorithm. No conspiracy theories or secret societies here. Just your run-of-the-mill friendly ectoplasm.
It was yet another Googly ectoplasm focusing on content quality. It gave a mild slap on the wrist to websites with thin, scraped, affiliate or otherwise low-quality content. It also penalized link issues, like too many cross-links between sites and too many links going to a single destination. Viral wasn’t quite as viral anymore from that day forth.
A few years back there was a saying around the internet about eyesight, bowel movements and bricks. Well, the pattern is quite obvious. Google promotes quality. It ranks for quality and penalizes the lack of it. If I were to use one word to encompass the Panda, Penguin and friendly-ectoplasm algorithms, it would be quality. Together, they raised the bar for quality content.
The trio that changed WordPress SEO
What these algorithms did, besides supporting quality content, was change the way SEO works. They made it cleaner, more inclined towards creativity. That’s not to say the technical part isn’t there anymore. There will always be a Yin and a Yang when it comes to SEO. We’ll let you call dibs.
Latest posts by Alexandru Coroiu