When Google emerged in the late 1990s, the sites that ranked best on the results page got there through the keywords on their pages and the links pointing to them.
Some developers quickly identified SEO tricks to fool the search engine and push their sites to the top of the results.
These techniques became known as black hat SEO, an allusion to western films, in which the bad guys usually wore black hats.
At that time, sites using black hat techniques stuffed their pages with keywords and links and thus reached the best positions in the ranking. But all that attention also turned Google's spotlight onto these sites and their techniques.
Since the 2000s, Google has constantly updated its algorithm to identify techniques that try to manipulate its results, punish the sites that use them, and prioritize sites that offer quality content relevant to the user's search, making the race for the top positions fairer each time. You can follow the entire history of Google algorithm updates on this Moz page.
In addition, Google publishes its Webmaster Guidelines, with tips to make it easier to index your site in the search engine, practices to follow, and a list of techniques to avoid. In essence, the guidelines say that you need to create quality, user-focused content.
Thus, we can define black hat as:
Aggressive SEO techniques that do not follow search engine guidelines and try to manipulate their rules, aiming for big results in a short time at the risk of being punished.
What are the main black hat SEO techniques?
Because of the frequent changes in Google's algorithm and its ranking factors, new black hat techniques keep emerging while older ones become practically extinct. The best-known techniques today are:
Keyword stuffing
One of the first black hat techniques used to try to manipulate Google: the strategy consists of including a keyword on a given page as many times as possible, in an attempt to increase the page's keyword density and signal relevance to search engines.
This includes the content itself, its title, meta tags, and even the alternative text of images.
In the example analyzed, the term “e-commerce India” appears several times, likely in an attempt to rank the page first for this search. Calculating the keyword density of the page (the percentage of times the term is used) shows that “e-commerce India” has a very high density, above 7%.
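As an illustration, keyword density can be computed with a short script. This is a minimal sketch; the phrase and sample text below are hypothetical, and real SEO tools normalize text more carefully.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the percentage of words in `text` accounted for by `phrase`."""
    words = re.findall(r"[\w'-]+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count occurrences of the phrase as a sliding word sequence.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100 * hits * n / len(words) if words else 0.0

# A toy page that repeats "e-commerce India" far too often:
stuffed = ("e-commerce India is the best e-commerce India site "
           "for e-commerce India shoppers")
print(keyword_density(stuffed, "e-commerce India"))  # → 50.0
```

A density that high is an obvious red flag; natural text about a topic rarely exceeds a few percent for any single phrase.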
The Google team realized that many sites were abusing keywords on their pages with the sole purpose of reaching the top positions, and also that the number of times a keyword appears in a text is no guarantee of relevance.
Since the mid-2000s, Google has been reducing the weight of keyword density on the page and even penalizing its overuse. With the Panda update, many sites lost positions and traffic because of keyword stuffing and had to scramble to recover.
To steer clear of this technique and avoid a penalty, the recommendation is to write naturally, keeping the page's keyword density at 2% or less and prioritizing synonyms. Before publishing, read the page and ask others to read it to ensure a higher-quality text.
Hidden content
Another old black hat technique, once very successful as a way of inserting extra keywords, related terms, and links purely for search engines, without showing the user anything.
The most common ways of using this technique are:
- Apply the text in the same color as the website background;
- Reposition the text off the page via CSS;
- Change the font size to 0.
Google mapped all these ways of hiding content long ago and can easily identify and penalize sites that use them. The rule here is simple: never include hidden content in your SEO strategy.
Duplicate content
Do you know the phrase “nothing is created, everything is copied”? In the case of the content of the pages of your site, forget it. That’s because, for Google, good content is original content. Any page that has identical content to another already published is considered duplicate content.
There is much debate among webmasters as to whether there really is a penalty for this, but the fact is that when similar content exists online, Google will prioritize just one version to display to the user, usually the one published first, and hide the copies from the search results.
If you copy content excessively, or use automation to copy content from other sites, you will be considered spam and will certainly be penalized for it.
For these reasons, avoid copying content from other sites, especially competitors: most of the time you will only be doing them a favor, since Google will choose to display the original content on the subject, and it will not be yours.
Choose other content relevant to your business as references to do something better and more complete.
In addition to not copying content from other sites, some configuration is needed on your own site to correct common problems that can be treated as duplicate content, such as repeated page layout elements, similar pages, and the mobile version of the site.
Cloaking
Cloaking (camouflage, in literal translation) is the technique in which the developer sets up a page to be displayed one way to search engine robots, which are responsible for crawling and indexing pages, and another way to users, differentiating them through the User-Agent header.
The version served to robots is built solely to gain ranking positions and uses every possible technique for that, with no concern for usability.
The version served to the user is totally different, generally of low relevance and quality. Sites caught applying this technique are penalized by Google.
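Stripped to its essence, server-side cloaking is just a branch on the User-Agent header. The sketch below shows only what the technique looks like so it can be recognized; the crawler signatures and page contents are illustrative, and this is obviously not something to deploy.

```python
def serve_page(user_agent: str) -> str:
    """Illustrates cloaking: crawlers and humans get different content."""
    crawler_signatures = ("googlebot", "bingbot")
    if any(sig in user_agent.lower() for sig in crawler_signatures):
        # Keyword-optimized page shown only to search engine robots.
        return "<html>keyword-rich page built for ranking</html>"
    # Low-quality page shown to real visitors.
    return "<html>unrelated, low-quality page</html>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```

One reason this is detectable: Google can fetch pages with a non-crawler User-Agent and compare the responses, so the branch itself becomes the evidence.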
Doorway page (or gateway page)
This technique also exploits cloaking, but here several pages are created for robots to access, each optimized for a specific keyword. When users access one of them, they find totally generic content, sometimes with no connection to what they searched for.
In 2015, Google launched a specific algorithm update to identify and penalize sites that exploit doorway pages.
Link farm
A link farm is a scheme in which all participants generate links among themselves, in an attempt to improve the PageRank of every site involved.
It is not very hard to identify sites that still use link farms: their content looks like any other site's, but they are full of links to irrelevant sites.
This practice made a big difference at a time when only PageRank mattered, taking low-quality sites to the top of Google. Now, all it will generate is a penalty from Google.
Private Blog Networks (PBN)
PBN is a network composed of several blogs and websites that generate links to the website that needs to rise in the Google ranking.
The sites in the network generally have good online authority, as they are old domains that expired, returned to the market, and were acquired for this purpose. There are websites and even entire companies focused only on that!
As Google's algorithm becomes more intelligent and strict, website developers and owners keep refining this technique to avoid punishment (which can even mean exclusion from Google).
To do so, they take certain precautions when creating or contracting a PBN, such as using different CMSs on each site and registering the domains in the names of different people or companies.
In early February 2017, a major update to Google Penguin aimed at further improving PBN detection caused sudden traffic drops for several sites that used this strategy.
Paid links
Basically, the goal is to pay a website to link to you. And this does not only apply to cash: offering a discount on a product or some other advantage from the company also counts. Any link generated only in exchange for a reward is considered a paid link.
As Matt Cutts, former coordinator of Google's webspam team and one of the biggest SEO references in the world, has discussed at length, paid links violate the guidelines. Of all the techniques, this is the hardest for Google's algorithm to identify: there is no way for Google to find out that you met a friend at a café and paid him to include a link to your site.
To curb this technique as much as possible, Google also analyzes the topical relevance between sites. It is not natural for a shoe e-commerce site to exchange links with a butcher's website, do you agree?
Another signal Google's algorithm uses to identify this and other schemes is a website that starts receiving multiple links overnight. That is a strong sign of black hat, and such a site will be closely monitored.
The recommendation is not to adopt this strategy: since it is another way of circumventing search engines, it is also subject to punishment. If you already do things like this, do not be alarmed if your organic traffic suddenly plummets; that is the risk you are taking.
Remember that advertising content also counts as a paid link, except when the nofollow attribute is included in the links, telling search engine robots that no authority should be passed to the linked site.
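Whether a link passes authority can be checked from its rel attribute. The sketch below uses Python's standard html.parser to list links lacking nofollow; the markup and domains are hypothetical.

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collects links that lack rel="nofollow", i.e. links that pass authority."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel:
            self.followed.append(attrs.get("href"))

html = (
    '<a href="https://partner.example" rel="nofollow sponsored">ad</a>'
    '<a href="https://friend.example">editorial link</a>'
)
audit = NofollowAudit()
audit.feed(html)
print(audit.followed)  # → ['https://friend.example']
```

Running a check like this over your sponsored content is a quick way to confirm that paid placements are marked correctly.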
Blog spam
Have you seen totally irrelevant comments on blogs and forums whose only purpose is to include a link? This is yet another attempt to gain authority and traffic the easy way. There are even tools developed just to distribute links in those places.
It was once a very common practice for improving PageRank, but it fell out of use after the nofollow attribute began to be applied in these spaces, telling search engine robots that links inserted there should not pass any authority.
Today, the only advantage left in including a link in comments and forums is generating traffic, and possibly even leads. If the link is relevant to the forum discussion or to the comments on the content, including it can bring positive results.
Negative SEO
Within the dark side of SEO, some webmasters choose not to apply black hat techniques to their own website, but rather to harm competitors with what is known as negative SEO.
This technique consists of posting negative comments and reviews on competitors' websites and Google My Business profiles, and even generating several low-quality links to their sites (so-called toxic links), in order to trigger a punishment from the search engines.
Fortunately, Google offers ways to counter these attacks. Irregular reviews on Google My Business can be reported to Google and have a good chance of being removed.
Google Search Console (formerly Google Webmaster Tools) also provides a disavow tool, with which you can submit a list of toxic links pointing to your site, indicating that you do not want them to count (the links will still exist, but your site will no longer be harmed by them).
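A disavow file is a plain-text list with one entry per line: a `domain:` prefix disavows every link from a host, while a bare URL disavows a single page, and lines starting with `#` are comments. The domains below are placeholders.

```text
# Toxic links identified by a link audit (hypothetical example)
# Disavow everything from a spammy domain:
domain:spammy-links.example
# Disavow one specific page:
https://low-quality.example/paid-links-page.html
```

Use the tool sparingly: Google itself recommends disavowing only when you have a clear pattern of spammy links you cannot get removed at the source.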
What are Google’s punishments?
If you do not follow Google's guidelines, your site may receive a penalty that varies according to the severity of the violation, determined by the volume of irregularities and their influence on the site's performance.
Possible punishments are:
- Drop of 30 positions in the ranking;
- Drop of 50 positions;
- Drop of 950 positions;
- Google ban.
So if your organic traffic drops overnight, chances are high that you have received some punishment.
How to test whether your site has been punished by Google?
- Check Google Search Console to see whether you received a message or whether there is any information in the “Manual Actions” area;
- Go to Google and search for “site:www.yoursite.com”. Did any results appear? If your site didn't show up (or only the homepage appeared), you were probably banned from Google;
- When searching for your company name on Google, is your site still well positioned? If not, it is likely you have received a punishment;
- Search Google for specific terms that pages of yours used to rank well for and check whether they are still on the first or second page. If not, chances are good that you have been punished.
If none of these checks points to a punishment, you probably lost relevance after some Google update or to increased competition. In that case, the only thing to do is improve your content and SEO strategy to recover your website.
Did Google punish you? The first step is to identify and remove whatever caused the punishment (especially if you used a black hat technique). You can then submit a reconsideration request to Google to try to reverse it.
Conclusion
Now that you know the main black hat techniques and all the problems that can come from practicing them, I believe you want to stay away from them, correct?
Just for the record: in March 2017, a new update called Google Fred took down several sites that had thin content and were well positioned thanks to other factors (many of them black hat).
Using white hat techniques (those that follow Google's guidelines) certainly takes more work and more time to show results, but it guarantees you will not be punished down the road and see all your work thrown away.
To start, how about a link building strategy that really works and poses no risks? Check out the Link Building Tips to get started now.
And, to go further, explore The Complete SEO Guide, an article with everything you need to know to stay on the first page of Google and attract more qualified visitors.
Have you ever done black hat, do you know another technique, or have you received a punishment? Share your experience in the comments.