From White Hat to Black Hat: How Old SEO Techniques Can Hurt Your Site

Making a website easy to navigate and understand for both the user and the search engine is the foundation of SEO work. The argument can be made that search engine optimization was a lot easier before search engines updated their algorithms to prioritize user experience and quality content. Techniques that were once effective and permitted, i.e., “white hat SEO,” can, with one major update, come to violate a search engine’s guidelines, i.e., “black hat SEO,” and hurt your rankings, indexed pages, and traffic.

So no one uses old black hat SEO techniques anymore, right?

These practices can give you temporary results, but they’re considered shady and deceptive. Sometimes offending sites are de-indexed so they no longer appear on Google at all. Be wary of SEO services that claim they can get your site to #1 within a certain time frame. They may just be black hatters in disguise.


Anyone around for the early days of the internet remembers scrolling through ancient GeoCities sites with their busy backgrounds and, more often than not, MIDI music playing in the background. Back then, Yahoo! was the biggest search engine, and a search for popular keywords would surface quite a few of these websites. It wasn’t uncommon to see spammy lists of unrelated keywords and links built into the footer. Believe it or not, this kind of keyword stuffing was an effective SEO technique. When Google eventually updated its algorithm to penalize the practice, some SEO providers worked around the update by matching the keyword font color to the background and shrinking the font size until the text was almost invisible. Needless to say, they weren’t fooling Google’s crawlers.

Black hat techniques are too numerous to list, but the most important thing to understand is that techniques that are effective today may be penalized tomorrow.

Undoing the work of a previous SEO company is one of the first steps in a white hat SEO process. Oftentimes, the old content is auto-generated, and entering a passage of it word for word into a search engine will turn up any number of near-identical pages. Nowadays, pages with duplicated content have a hard time ranking. Search engines only want to rank original, unique content that answers the searcher’s query directly. Good white hat practice, then, is to keep your content relevant, organized, and accurate.

Unless you have unlimited time to probe search engine algorithms for holes, your trust should be placed in experts in the field who do regular testing. Follow their lead. It’s important for an SEO specialist to stay up to date on algorithm updates and on new techniques that are proven to work. In 2014 there were fourteen updates to Google’s algorithm, five in 2015, and as of the date of this post there have been four so far this year. The specific details are rarely disclosed in full, but we know that user experience is valued more and more. As long as you keep that in mind, your current campaigns shouldn’t suffer major setbacks when new updates roll out.

It’s not worth getting kicked out of a search engine’s index for a temporary traffic boost. Why pay for a service that will have the opposite effect of its purpose? Take the time to research your SEO provider and what they consider most important to your campaign. Remember that promises of rankings and of optimizing for search engines sound good on the surface, but if newer factors like overall user interest aren’t taken into consideration, you’re not getting the most out of your SEO budget.
