Google Spam Update: FAQ & Everything You Need to Know [November 2022]


Google is constantly updating its search algorithm to improve the quality of search results and prevent spam. In 2021, Google released an update specifically targeting spammy content. Here’s what you need to know:

Q: What is the Google Spam Update? A: The Google Spam Update is an update to Google’s search algorithm that aims to improve the quality of search results by targeting spammy and low-quality content.

Q: When was the update released? A: The update was first released in July 2021, with a second update released in December 2021.

Q: What does the update target? A: The update targets several types of spammy and low-quality content, including:

  • Thin content: pages with little or no original content, or pages that are essentially just copies of other pages on the web
  • Duplicate content: pages that are identical or very similar to other pages on the web
  • Keyword stuffing: the excessive use of keywords in an attempt to manipulate search rankings
  • Cloaking: showing different content to search engines than to human users
  • Hidden text or links: text or links that are hidden from users but visible to search engines
  • Affiliate or doorway pages: pages designed solely to funnel traffic to other pages
  • Misleading or deceptive content: content that attempts to deceive users, such as fake news or phishing sites

Q: How will the update affect my website’s ranking? A: If your website contains spammy or low-quality content, it’s likely that your ranking will be negatively impacted by the update. However, if your website contains high-quality, original content that provides value to users, your ranking may improve.

Q: What can I do to avoid being affected by the update? A: To avoid being negatively affected by the update, you should focus on creating high-quality, original content that provides value to users. Avoid keyword stuffing, duplicate content, and other spammy tactics. You should also make sure your website is mobile-friendly, loads quickly, and has a good user experience.

Q: What should I do if my website is negatively affected by the update? A: If your website is negatively affected by the update, you should focus on improving the quality of your content and fixing any spammy tactics you may have used in the past. You should also make sure your website meets Google’s guidelines for webmasters.

Q: Where can I find more information about the Google Spam Update? A: You can find more information about the Google Spam Update on Google’s webmaster blog and other SEO websites.

What is a Google Spam Update?

A Google Spam Update is an update to Google’s search algorithm that targets spammy and low-quality content in search results. The update aims to improve the quality of search results by penalizing websites that use spammy tactics such as keyword stuffing, duplicate content, cloaking, hidden text or links, and other deceptive or misleading practices. The Google Spam Update is part of Google’s ongoing effort to provide users with the most relevant and trustworthy search results. Websites that engage in spammy tactics are likely to see a decrease in their search ranking and traffic as a result of the update.

Google refers to this type of update as a “Spam Update” and announces it through its usual communication channels (e.g., Twitter).

What is Google SpamBrain?

SpamBrain is Google’s automated, AI-based spam-prevention system.

From time to time, Google improves this system so that it detects spam more effectively and can recognize new types of spam.

What are the main SEO techniques (black hat / gray hat) targeted by these Google SPAM Updates?

Here is an overview of the main black-hat SEO techniques penalized by Google via spam updates and its SpamBrain system.

1- Cloaking

Cloaking is a spammy SEO technique in which a website shows different content to search engine crawlers than it does to human visitors. The purpose of cloaking is to manipulate search engine rankings by presenting optimized content to crawlers while showing different content to humans.

Cloaking is often used to hide spammy content or to present content that violates search engine guidelines. For example, a website might present text that is the same color as the background of the page in order to hide it from human visitors, while still making it visible to search engine crawlers.

Cloaking is considered a violation of Google’s Webmaster Guidelines and can result in penalties, including the removal of the website from search results. In general, it’s best to avoid cloaking and to focus on providing valuable and relevant content to both search engine crawlers and human visitors.

Here are some examples of cloaking penalized by Google and the SPAM update:

  • Showing a page about travel destinations to search engines while showing a page about discount drugs to users;
  • Inserting text or keywords into a page only when the user agent requesting the page is a search engine rather than a human visitor;
  • If your site uses technologies that search engines struggle to access, such as JavaScript or images, see Google’s recommendations for making that content accessible to both search engines and users without cloaking.
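Site owners can get a rough signal of user-agent cloaking on their own pages by fetching the same URL with a crawler-like and a browser-like User-Agent and comparing the two responses. Below is a minimal sketch using only Python’s standard library; the user-agent strings and the 0.9 similarity threshold are illustrative assumptions, not values published by Google, and a real audit would also compare rendered pages:

```python
import urllib.request
from difflib import SequenceMatcher

# Illustrative user-agent strings; real crawler and browser strings vary.
CRAWLER_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL with a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(crawler_html: str, browser_html: str,
                  threshold: float = 0.9) -> bool:
    """Flag the page if the two views differ markedly.

    The 0.9 threshold is an arbitrary illustrative cutoff."""
    similarity = SequenceMatcher(None, crawler_html, browser_html).ratio()
    return similarity < threshold
```

In practice you would call `looks_cloaked(fetch(url, CRAWLER_UA), fetch(url, BROWSER_UA))` and manually inspect any page that gets flagged, since dynamic content (ads, timestamps) can legitimately differ between requests.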

It’s good to know:

  • If you use a paywall or a content-gating mechanism, Google does not consider it cloaking as long as Google can see the full content behind the paywall, just like anyone with access to the gated content, and you follow Google’s recommendations on flexible sampling.

2- Doorways

In the context of search engine optimization (SEO), doorways are pages or websites created solely to rank highly in search results for specific keywords or phrases. These pages are typically built using spammy tactics, such as keyword stuffing, and are designed to funnel traffic to another website or page.

Doorway pages are not created for human users, but rather for search engines, and often contain little or no original content or value. They are considered a form of spam by search engines like Google and can result in penalties or even the removal of the website from search results.

It’s important to note that not all landing pages or gateway pages are considered doorways. Landing pages that provide valuable and relevant information to users, and that are optimized for specific search terms, can be beneficial for both users and search engines. However, doorway pages are a violation of Google’s Webmaster Guidelines and should be avoided.

Here are some examples of doorway-type practices that are penalized by spam updates:

  • Having multiple websites with slight variations in URL and homepage to maximize reach for any specific query;
  • Having multiple domain names or pages targeted at specific regions or cities that all funnel users to a single page;
  • Pages generated solely to direct visitors to the actually useful or relevant part of your site(s);
  • Very similar pages that resemble search results more than a clearly defined, browsable hierarchy.

3- Hacked content

Hacked content is any content placed on a site without permission, through vulnerabilities in the site’s security. Hacked content degrades search results for Google users and can potentially install malicious content on their machines.

Here are some examples of website hacking practices that are targeted by Google spam updates:

  • Code injection : When hackers access your website, they may try to inject malicious code into existing pages on your site. This often takes the form of malicious JavaScript injected directly into the site or into iframes.
  • Page injection : Sometimes, due to security vulnerabilities, hackers are able to add new pages to your site, which contain spam or malicious content. These pages are often intended to manipulate search engines or attempt phishing. Your existing pages may not show signs of hacking, but these newly created pages may harm your site’s visitors or its performance in search results.
  • Content Injection : Hackers can also try to subtly manipulate existing pages on your site. Their goal is to add content to your site that search engines can see, but may be harder for you and your users to spot. This can range from adding hidden links or hidden text to a page using CSS or HTML, to more complex modifications like cloaking.
  • Redirects : Hackers can inject malicious code on your website that redirects some users to dangerous pages or spammy pages. The type of redirect sometimes depends on the referrer, user agent or device. For example, clicking on a URL in Google search results may redirect you to a suspicious page, but there is no redirect when you visit the same URL directly from a browser.

4- Hidden text and links

Hidden text or links means placing content on a page for the sole purpose of manipulating search engines, in a way that is not easily seen by human visitors.

Here are some examples of hidden text or links that violate Google’s anti-spam policies:

  • Using white text on a white background;
  • Hiding text behind an image;
  • Using CSS to position text off screen;
  • Setting the font size or opacity to 0;
  • Hiding a link by linking only a small character (for example, a hyphen in the middle of a paragraph).
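A first-pass audit for the patterns above can be done by scanning a page’s inline styles. The following sketch uses illustrative regular expressions only; a real audit needs a rendering engine, since hiding can also come from external stylesheets or JavaScript:

```python
import re

# Illustrative patterns matching the hidden-text examples above.
# Note that display:none is often legitimate (tabs, accordions),
# so a match is a hint to review manually, not proof of spam.
HIDDEN_TEXT_PATTERNS = [
    r"font-size\s*:\s*0",          # zero font size
    r"opacity\s*:\s*0(?![.1-9])",  # fully transparent text
    r"text-indent\s*:\s*-\d{4,}",  # text pushed far off screen
    r"display\s*:\s*none",         # hidden blocks (may be legitimate)
]

def find_hidden_text_hints(html: str) -> list[str]:
    """Return the patterns that match inline styles in the HTML."""
    return [p for p in HIDDEN_TEXT_PATTERNS
            if re.search(p, html, re.IGNORECASE)]
```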

Many modern web-design elements dynamically show and hide content to improve the user experience; these elements do not violate Google’s policies. Here are some concrete examples:

  • Accordions or tabs that hide or show additional content;
  • Slideshows or sliders that alternate between multiple images or paragraphs of text;
  • Tooltips or similar text that displays additional content when users interact with an element;
  • Text accessible only to screen readers, intended to improve the experience of screen-reader users (e.g., alt text).

5- Keyword stuffing

Keyword stuffing is a spammy SEO technique in which a website attempts to manipulate search engine rankings by excessively using a particular keyword or phrase on a page or website. The practice of keyword stuffing involves overusing the targeted keywords in various parts of the website, such as the meta tags, headings, content, and alt tags, in an effort to make the website rank higher for those keywords.

Keyword stuffing is considered a violation of Google’s Webmaster Guidelines and can result in penalties, including a decrease in search rankings or the removal of the website from search results. Instead of stuffing keywords, it is recommended to use natural language in website content and include keywords only where they are relevant and appropriate. Well-written, informative, and relevant content that satisfies user intent is the key to achieving high search engine rankings.
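A simple way to sanity-check your own pages is to measure keyword density, i.e. what share of the words on a page are a given keyword. The sketch below is illustrative; the 3% threshold is an arbitrary assumption for demonstration, not a limit published by Google:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of words in the text that are exactly the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.03) -> bool:
    # 3% is an arbitrary illustrative cutoff, not a published limit;
    # natural density varies widely by topic and page length.
    return keyword_density(text, keyword) > threshold
```

Density alone cannot distinguish stuffing from a page that is legitimately about one narrow topic, so treat a high value as a prompt to reread the page, not as a verdict.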

6- Link spam (abusive link building)

Google uses links as an important factor in determining the relevance of web pages. Any link intended to manipulate rankings in Google’s search results may be considered spam. This includes any behavior aimed at manipulating links to your site or outgoing links from your site.

The following are examples of link spam :

  • Buying or selling links for ranking purposes, including exchanging money for links or for articles containing links;
  • Exchanging goods or services for links (e.g., sending someone a product in exchange for an article about that product with a link);
  • Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages created solely for cross-linking;
  • Using automated programs or services to create links to your site;
  • Requiring a link as part of a terms of service, contract, or similar agreement, without giving the third-party content owner the choice to qualify the outbound link;
  • Text ads or text links that are not marked nofollow;
  • Advertorials or native advertising in which payment is received for articles that include links passing PageRank, or links with optimized anchor text in articles, guest posts, or press releases distributed on other sites;
  • Links from low-quality directories or bookmark sites;
  • Keyword-rich, hidden, or low-quality links embedded in widgets distributed across various sites;
  • Sitewide links widely distributed in the footers or templates of various sites;
  • Forum comments with optimized links in the post or signature, for example.

Good to know for publishers:

Google understands that buying and selling links is a normal part of the web economy for advertising and sponsorship purposes. The presence of such links does not constitute a violation of its rules as long as they are qualified with a rel="nofollow" or rel="sponsored" attribute value.
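Publishers can check the paid or advertorial sections of their own pages for anchors missing those rel values with a short script. This sketch uses Python’s standard-library HTML parser; it flags every link lacking nofollow/sponsored, so it should only be run over markup you already know contains paid links:

```python
from html.parser import HTMLParser

class PaidLinkAuditor(HTMLParser):
    """Collect <a> hrefs whose rel contains neither nofollow nor sponsored."""

    def __init__(self) -> None:
        super().__init__()
        self.unqualified: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rel_values = set((attrs.get("rel") or "").lower().split())
        if href and not ({"nofollow", "sponsored"} & rel_values):
            self.unqualified.append(href)

def audit_links(html: str) -> list[str]:
    """Return hrefs of links that are not qualified as paid/sponsored."""
    auditor = PaidLinkAuditor()
    auditor.feed(html)
    return auditor.unqualified
```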

7- Traffic generated by machines / robots

Fake traffic, meaning traffic generated artificially by machines, consumes resources and interferes with Google’s ability to best serve users.

Here are some examples of automated traffic that is penalized by Google SpamUpdates:

  • Sending automated queries to Google in an attempt to manipulate Google Suggest recommendations;
  • Scraping search results for rank-checking purposes, or any other type of automated access to Google Search without express authorization (in other words, not all SEO analysis software respects Google’s anti-spam guidelines).

8- Sites hosting malicious software and behavior

Google checks websites to see if they host malware or unwanted software that harms user experience.

Malware is software or a mobile application specifically designed to harm a computer, mobile device, the software it runs, or its users. Malware has malicious behavior that can include installing software without user consent and installing harmful software such as viruses. Site owners sometimes don’t realize that their downloadable files are considered malware, so these binaries can be inadvertently hosted.

Unwanted software is an executable file or mobile application that behaves deceptively or unexpectedly, or that negatively affects the user’s browsing or computing experience. Examples include software that changes your homepage or other browser settings against your wishes, or applications that leak private and personal information without disclosure.

9- Misleading features

Site owners should create sites with high-quality content and features that are useful to users. However, some site owners try to manipulate search rankings by intentionally building sites with deceptive features and services that trick users into believing they can access certain content or services when they actually cannot.

Here are some examples of deceptive features:

  • A site with a fake promo code generator that claims to provide a promo code from an online store, but actually does not.
  • A site that claims to provide certain functionality (e.g., PDF merging, a countdown timer, an online dictionary service) but intentionally directs users to misleading advertisements instead of providing the claimed services.

10- Scraped, duplicated, stolen, or dynamically fetched content

Some site owners base their sites on content scraped from other, often more reputable, sites. Scraped content, even from high-quality sources, may not add value to users if your site provides no additional useful services or content, and it may also constitute copyright infringement. A site may also be demoted or penalized by Google if a significant number of valid legal takedown requests have been received against it.

Here are some examples of abusive scraping:

  • Sites that copy and republish content from other sites without adding original or valuable content or even citing the original source.
  • Sites that copy content from other sites, modify it only slightly (for example, by substituting synonyms or using automated techniques) and republish it.
  • Sites that replicate content streams from other sites without providing a unique benefit to the user.
  • Sites devoted to the integration or compilation of content, such as videos, images or other media from other sites, without substantial added value for the user.

11- Sneaky redirects

Redirection is the act of sending a visitor to a different URL from the one they originally requested. Sneaky redirection is a malicious action aimed at showing users and search engines different content or showing users unexpected content that does not meet their initial needs.

Here are some examples of sneaky redirects penalized by Google:

  • Showing search engines one type of content while redirecting users to something significantly different;
  • Showing desktop users a normal page while redirecting mobile users to a completely different spam domain.

Although sneaky redirection is a type of spam, there are many legitimate, spam-free reasons to redirect one URL to another.

Here are some examples of legitimate redirects:

  • Moving your site to a new address;
  • Consolidating several pages into one;
  • Redirecting users to an internal page once they are logged in.

When considering whether a redirect is sneaky, one must consider whether it is intended to mislead users or search engines.
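The distinguishing property of a legitimate redirect is that it is transparent and identical for every visitor, crawler or human. As a minimal sketch, a permanent site-move redirect can be expressed as a tiny WSGI application (the domain name is a placeholder):

```python
def site_move_redirect(environ, start_response):
    """WSGI app: permanently redirect every path to the new domain.

    The redirect is the same for all users and crawlers, regardless of
    User-Agent or referrer, which is what distinguishes it from a
    sneaky redirect."""
    new_location = "https://new.example.com" + environ.get("PATH_INFO", "/")
    start_response("301 Moved Permanently", [("Location", new_location)])
    return [b""]
```

Served for the whole old domain (e.g., via `wsgiref.simple_server` or any WSGI host), this preserves each requested path while signaling with a 301 status that the move is permanent.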

12- Content automatically generated by spammers

Auto-generated content is content that has been programmatically produced without creating anything original or adding sufficient value; instead, it is generated primarily to manipulate search rankings rather than to help users.

Here are some examples of automatically generated content for spamming purposes:

  • Text that makes no sense to the reader, but contains search keywords;
  • Text translated by an automated tool without human revision or curation before publication;
  • Text generated by automated processes without regard to quality or user experience;
  • Text generated using automated synonymization, content-spinning, paraphrasing, or obfuscation techniques;
  • Text generated from RSS feeds or search results;
  • Assembling or combining content from different web pages without adding sufficient value.

It’s good to know:

  • If you host such content on your site but believe it brings real added value to your users, you can use noindex methods to exclude it from search results and thus protect the rest of your site.

13- Affiliate pages with “thin” content

Thin Affiliate Pages are pages with product affiliate links where product descriptions and reviews are copied directly from the original merchant with no original content or added value.

Affiliate pages can be considered thin if they are part of a program that distributes its content through an affiliate network without providing added value.

These often look like low-quality sites with poor design and with the same or similar content reproduced across the same site or across several domains or languages.

If a Google search results page displayed multiple such sites, all with the same content, these thin-content affiliate pages would create a frustrating user experience.

It’s good to know:

  • Not all sites that participate in an affiliate program have low-quality content. Good affiliate sites add value by offering helpful content or features. Examples of good affiliate pages include additional pricing information, original product reviews, rigorous testing and reviews, product or category navigation, and product comparisons.

14- User Generated Spam

User-generated spam is a type of unwanted content added to a site by users through a channel intended for user content.

Often site owners are unaware of this unwanted content.

Here are some examples of unwanted user-generated content:

  • Spammer accounts on web hosting services that anyone can sign up for free;
  • Spam messages in forum threads;
  • Blog comment spam (“spamco”);
  • Junk files uploaded to file hosting platforms.

Why does Google have to roll out these anti-SPAM updates?

Google’s spam policies help protect users and improve the quality of search results.

To be eligible to appear in Google’s web search results (web pages, images, videos, news content, or other material Google finds on the web), content must not violate Google Search’s general guidelines or the spam policies listed in this article.

These policies apply to all web search results, including those from Google properties.

Google increasingly detects policy-violating content and behavior through automated systems, supplemented where appropriate by human review, which can lead to manual action.

Sites that violate these rules may rank lower in the results or may not appear in the results at all.

If you think a site violates Google’s spam policies, you can let Google know by completing a Search Results Quality Report.

Has your site been negatively impacted by a Google SPAM Update? What does this mean?

Sites that see a negative change in SEO KPIs after a spam update should review Google’s spam policies to ensure they are in compliance.

Sites that violate Google policies may rank lower in results or may not appear in results at all.

How long does it take to get out of a penalty for SPAM by Google?

It may take several months.

If changes are made to improve a site, Google’s automated systems will learn over the following months that the site complies with its spam policies, and the site may then regain better positions, for example after the next major update.