Any novice SEO specialist who starts building websites, promoting them in search rankings, and attracting so-called search traffic quickly runs into a list of assorted “Google filters” that supposedly make webmasters’ lives difficult.
One of these is the filter known in SEO circles as “Googlebombing.”
The essence of this filter is simple: if a site receives numerous links from other sites that all use the same anchor text (the clickable text of a link), Google will “filter” the site and penalize it. As a result, the site will supposedly never rank high enough in the search results for users to find it.
At first glance, this filter looks like a formidable weapon for search engines in their fight against spammers, i.e., webmasters who post “unauthorized” links on abandoned forums. For those unaware, forums differ significantly from guestbooks or blogs, where you can only place a “bare” link, i.e., a link without an anchor. On forums, links are inserted via BBCode, which means they carry an anchor.
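For illustration, the difference between the two kinds of links looks roughly like this (the URL and anchor phrase below are placeholders, not taken from any real campaign):

```
Forum post (BBCode) - a link with anchor text:
[url=http://example.com/]promoted keyword phrase[/url]

Guestbook or blog comment - a "bare" link, no anchor:
http://example.com/
```

It is precisely the anchored form that matters for the “Googlebombing” scenario, since the filter is said to trigger on many links sharing the same anchor text.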
Modern spam technology has advanced considerably, and today there are spamming programs that vary the anchor text from forum to forum, creating the impression that forum owners are voluntarily “citing” a site they happen to like. Professionals in this field are still rare, however, and such programs are expensive, which practically rules out their use by the thousands of novice spammers who usually have no money when they start out. These beginners prefer free software, which typically posts the same link with the same anchor everywhere.
Given the above, one might think Google uses this interesting filter to catch all these inexperienced spammers. But consider the bigger picture. Google is not a “library policeman,” and it has no intention of catching anyone. Google’s policy is much broader and loftier: it aims to demote low-quality sites, i.e., web resources whose content does not match the interests of the visitors who land on them. It will not judge a site merely on the basis of some “incorrect anchors.” All of this is just a scary tale for novices, a semi-scientific hypothesis designed to intimidate.
Imagine you have a wonderful site filled with high-quality content that is genuinely interesting and useful to people. Google recognizes this quality, places your site in a good position in its search results, and provides it with a steady flow of grateful visitors. Suddenly competitors appear and point hundreds, thousands, tens of thousands (or even millions) of links with identical anchor text at your site. According to the “filter” scenario, Google should then push your excellent site down the search results or even exclude it entirely (i.e., ban it). Do you think Google would do this? Of course not. It won’t hunt for culprits or ponder whether they are competitors; it will simply ignore those links. More precisely, it will examine them, count only the highest-quality ones (if any exist), and if there are no quality links at all, the site’s ranking will be unaffected.
It’s important to remember that Google, unlike Yandex (with all due respect), strictly adheres to the principle of presumption of innocence, so no tricks by competitors can harm good and useful sites.
To sum up, it’s worth noting that Google does not use any “filters” at all today.
What is a “filter”? It’s a situation where the innocent and the guilty are treated the same. Google cannot allow this: if a site is good, it should not suffer even for a minute. That’s why today, alongside good sites, we see plenty of “junk sites” in Google’s search results. All this junk quickly disappears from the results, while useful resources remain. This may not be ideal from the perspective of clean search results (since junk sites constantly replace one another, persistently and successfully crowding good sites out), but it is perfect for fair business conduct. Democracy, what can you do. From this perspective, maybe Yandex is right when it operates on the principle of “better that ten innocent sites suffer than one junk site gets into the results.” Its search results are indeed clean, but experienced webmasters know exactly what price is paid for that.