White Hat and Black Hat SEO Explained

Igor Eisenberg
Web Developer
 

Mobfox web developer Igor Eisenberg is also an SEO expert and explains here the relationship of content to SEO and the dangers of going dark.

Black hat SEO, the evil twin of white hat SEO (sans goatee), is still in use, and decision-makers sometimes even consider it a viable strategy, even though it can cause irreparable damage to a project. But before we get into the details, we need to look at the root difference between the two – how do they answer the question “What is SEO?”

I mean – we all know what SEO stands for (Search Engine Optimization, just in case), but what does that really mean?

As Wikipedia tells us, “Search Engine Optimization is the process of affecting the online visibility of a website or a web page in a web search engine’s unpaid results”. ‘Affecting’ is a very well-put verb. White hat (WH) SEO treats search engines, well, as search engines: it delivers quality content to them by removing obstructions. Black hat (BH) SEO treats search engines as a stepping stone to be analyzed and exploited for better rankings. Both approaches are ‘affecting’, but BH neglects the purpose of search engines themselves and intentionally harms their performance. Where WH SEO plays by the rules, BH SEO cuts corners and tries to cheat.

Long-term vs. short-term results

For now, let’s put aside the relative terms ‘good’ and ‘bad’ and focus solely on the effectiveness of the two approaches. Because, after all, the goal of SEO is to see our website in the first position of the search engine results page (SERP). So why shouldn’t we cut corners?

Let’s look at it from the position of the search engines. Their primary purpose is to show the user the most relevant content and filter out less relevant content, while making money on advertising in the process. Search engines ‘know’ that they are imperfect and that some people try to trick them into showing irrelevant or low-quality content. So search engines actively patch up vulnerabilities that could be exploited and punish the offenders. That sounds scary, especially if you consider the size of some of them. So, under the threat of punishment, why would people use BH SEO in the first place? The answer is simple – because it works. Or, to be precise, it works in the short term.

The boost BH SEO gives to rankings is nice, noticeable, fast and short-lived. It’s that simple. It’s good while it lasts, but when it ends, you are punished and out of the search results.

Because a search engine position changes with a delay and depends on a number of factors (some of them unrelated to the website in question), it is not always clear what exactly affects the SERP ranking. But there is an easily accessible analogy that helps us understand SEO results without going deep: the email spam filter.

The purpose of the email spam filter is to decide which emails are good (‘position #1’, or the inbox) and which ones are spam (‘position #2’, or the junk folder). In essence, it delivers quality content to users. How does it do that? By analyzing the content of the mail and applying various filters that decide whether the email is spam or not. As the algorithms grew more sophisticated and precise, the efficiency of spam shrank, and now spammers may receive 1 reply for every 12,500,000 emails sent. (Sadly, though, that 0.000008% response rate still makes them enough money to keep going.)
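
To make the analogy concrete, here is a toy sketch of content-based filtering – nothing like a production spam filter, just an illustration of scoring an email’s text against known ‘spam signals’ (the phrases and weights below are made up):

```python
# A toy content filter (nothing like a real spam filter) showing the basic
# idea: score an email's text against a list of known spam signals.
SPAM_SIGNALS = {"free money": 3.0, "click here": 2.0, "winner": 1.5}  # made-up weights

def spam_score(text: str) -> float:
    text = text.lower()
    return sum(weight for phrase, weight in SPAM_SIGNALS.items() if phrase in text)

print(spam_score("You are a WINNER! Click here for free money"))  # 6.5 -> junk folder
print(spam_score("Meeting notes from Tuesday"))                   # 0.0 -> inbox
```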

[Image: Panda, Penguin and Hummingbird – a bear and two birds, or how search engines are getting smarter]

The same rules apply to BH SEO – it works, but as search engines evolve, it keeps getting harder. BH SEO always needs to be a step ahead, which requires constant awareness and loads of work. To trick search engines into thinking that mediocre, spammy pages are good, the pages need to change as fast as the search engine algorithms do. It is a hard job that requires constant attention.

If you’ve been around for some time, you are familiar with the sheer horror that words as innocent as “Panda,” “Penguin” and “Hummingbird” can produce. These three changed the world of SEO – from the times when black hats looked down on white hats, to black hat SEO becoming a ticking time bomb.
Let me explain.
Panda, Penguin, and Hummingbird are the names of Google search algorithms.

In short: Panda tries to surface higher-quality content and punishes duplicate content, errors, hidden text, etc. Penguin’s goal is to reduce backlink abuse; it punishes links from thematically unrelated resources – for example, if an article like this linked out to a cooking website. Hummingbird was created to get users the results they are actually looking for with their query. For example, when you ask ‘What did Jimi Hendrix drive,’ it will interpret the question so that the results show you Jimi’s cars, as opposed to just articles that contain the words ‘Jimi’, ‘Hendrix’ and ‘drove’.

Backed by the weight of Google, these new algorithms delivered a serious blow to the niche of businesses focused on providing black hat SEO services. Before the rise of Panda, Penguin and Hummingbird, black hat could be a long-term, ‘not-so-honest’ solution. Since their arrival, getting punished is just a matter of time.

Now, let’s talk about white hat SEO – and to do so, we should take a closer look at how the user interacts with search engines. Our ‘test user’ wants to find a page that answers their query. The search engine provides the pages that best meet these (and more) conditions:
– have the fewest errors,
– contain the fewest known black hat SEO tricks,
– have the most informative text with the least amount of meaningless sentences,
– and, of course, give a satisfying answer to the query.

And while there are rules that define errors, and rules that define whether a sentence is meaningful, the degree to which a page matches the query is subjective. To analyze it, search engines keep track of user behavior. For example, let’s look at the following scenario from a search engine’s point of view: a user types in a query and gets 10 results. The user clicks on the first link, but 10 seconds later clicks on the second link – and only returns to the results page an hour later.

So what does this mean for the search engine? The user opened the first link, found it lacking, closed it and opened the second one, which was good, so the user kept reading. For the search engine, the user just demonstrated that the second link has higher-quality content than the first one.
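
Here is a toy illustration of that inference – not any search engine’s actual algorithm, just the scenario above written out in a few lines of Python with made-up timestamps:

```python
# A toy illustration of interpreting click behavior: the gap between clicks
# ("dwell time") hints at how satisfying each result was. Timestamps are made up.
from datetime import datetime

clicks = [
    ("result_1", datetime(2018, 5, 1, 12, 0, 0)),
    ("result_2", datetime(2018, 5, 1, 12, 0, 10)),  # clicked 10 seconds later
]
# the user only comes back to the results page an hour after the second click
session_end = datetime(2018, 5, 1, 13, 0, 10)

next_events = [t for _, t in clicks[1:]] + [session_end]
for (url, clicked_at), next_event in zip(clicks, next_events):
    dwell = (next_event - clicked_at).total_seconds()
    print(f"{url}: ~{dwell:.0f} s on page")
# result_1: ~10 s on page   -> probably not what the user wanted
# result_2: ~3600 s on page -> likely a satisfying answer
```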

It’s worthwhile to be the good guy

Tracking is actually really simple – try this to see just how simple it is:
1. Open virtually any search engine (except for google.com when used in Chrome or Chromium). Make sure you log out from it or use incognito mode.
2. Search for anything.
3. Hover over the result link – in the bottom left corner you will see the actual URL of the result.
4. Right-click on the link and see how the URL changes in the bottom corner. This is an intermediate link: it will send you to your destination after creating a record in the search engine’s database that the click was made. When you click on something on a search engine results page, your action is recorded – intermediate links are just one of the ways to do it (a minimal sketch of one follows below).
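
To show what such an intermediate link does behind the scenes, here is a hypothetical click-tracking redirect – a sketch, not any real search engine’s code: it records the click (here, just a log line) and then forwards the browser to the real destination.

```python
# A hypothetical click-tracking redirect: record the click, then send the
# browser on to the page the user actually clicked. Standard library only.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

class ClickTracker(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        target = params.get("url", ["https://example.com"])[0]
        rank = params.get("rank", ["?"])[0]
        # "create a record in the database" -- here, just a log line
        print(f"click recorded: result #{rank} -> {target}")
        # 302 redirect: the user still ends up on the page they clicked
        self.send_response(302)
        self.send_header("Location", target)
        self.end_headers()

if __name__ == "__main__":
    # Visiting http://localhost:8000/r?rank=1&url=https://example.com
    # logs the click and then forwards the browser to the target URL.
    HTTPServer(("localhost", 8000), ClickTracker).serve_forever()
```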

[Image: When first hovering over the link, one sees the actual target URL]

[Image: After a right-click on the search engine result, the redirecting URL shows up]

White hat SEO accepts all this and does its best to provide search engines with the best content:
– with the fewest errors,
– with a clear page design and structure,
– with informative content that catches the reader’s attention.

The pages that are considered good by search engines are commonly regarded as good by users and have a higher chance of being shared. And that is a huge win! Not just because it is free advertising, but because it creates ‘ripples’ on the internet, increasing page visibility, subjective quality, and traffic.

As a nice bonus, an interesting, well-written page does not need any tricks to be considered good by search engines – thanks to the algorithms above, that is exactly the kind of page they are looking for. All you need to please the search engine is thorough research (depending on the topic, of course) and good copywriting. And as long as you’re not using BH SEO, the site will not get punished. There are other actions you can take to make quality content more visible to search engines – a clear layout, a sitemap, and so on – but again, most of these also enhance the user experience.
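
As an example of one of those ‘other actions’: a sitemap is just an XML file listing your pages for crawlers (the sitemaps.org protocol). A minimal sketch of generating one – with placeholder URLs and dates – could look like this:

```python
# A minimal sketch of generating a sitemap.xml (sitemaps.org protocol).
# The URLs and dates below are placeholders, not real pages.
pages = [
    ("https://example.com/", "2018-05-01"),
    ("https://example.com/blog/white-hat-vs-black-hat-seo", "2018-05-01"),
]

entries = "\n".join(
    f"  <url>\n    <loc>{url}</loc>\n    <lastmod>{lastmod}</lastmod>\n  </url>"
    for url, lastmod in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

# Write the file to the site root so crawlers can find it.
with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```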

To wrap up, both black hat and white hat SEO are investments of resources into promoting content, and they will continue to give results for as long as your content is online (which won’t be for long if you are using BH solutions). The key difference, from the content owner’s point of view, is that BH SEO is less focused on the content itself and more focused on pushing it to the user. It is also a temporary and constantly shifting solution that requires heavy maintenance, whereas WH maintenance is negligible. A solid piece of content will always be favored by search engines – no matter how many new algorithms pop up.

If we consider the cost-effectiveness of the two approaches, we need to keep in mind that as time passes, the operating costs of BH SEO solutions accumulate (if they haven’t already gotten you banned), while the results of WH SEO solutions keep growing.

And that is the practical reason why black hat SEO should never be used as a long-term solution.
