Web Spam

Web spam covers tactics used by webmasters to manipulate search engines so that a site performs better in search results. This may take the form of spam links, hacked content, or malware from third-party tags, among many other unnatural methods. Our SEO Office Hours notes and recaps cover how Google handles what it deems spam, along with advice for avoiding it.

Spammy backlinks to 404 pages are ignored by default

May 13, 2022 Source

When asked how to deal with thousands of spammy backlinks, John was keen to reassure site owners that low-quality sites linking to 404 pages won’t impact a site negatively. Because the target is a 404, Google essentially reads that link as not connecting to anything and ignores it (for the same reason, it’s important to review 404 pages that receive links from valuable sources and redirect them so that value isn’t lost). If it’s just a handful of sites providing spammy backlinks to live, non-404 pages, the recommendation is to disavow them at the domain level.
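For reference, a domain-level disavow is submitted as a plain-text .txt file through Google’s disavow links tool, one entry per line; the domains below are hypothetical placeholders. A minimal example:

# Disavow file example (hypothetical domains).
# Lines beginning with "#" are comments and are ignored by the tool.
# Disavow all links from an entire domain:
domain:spammy-link-farm.example
domain:low-quality-directory.example
# Individual URLs can also be listed:
http://directory.example/listing?id=123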


Google Wants to Automatically Ignore Unnatural Content

March 20, 2020 Source

Instead of applying manual penalties for unnatural content, Google wants to develop automatic solutions that simply ignore anything unnatural, as it already does for unnatural linking, so the content won’t harm your site and you won’t have to take any action. In any situation where a penalty would still be applied, the reviewer would likely take the time to look at the site first.


Reconsideration Requests Can Take a Month to Process

March 20, 2020 Source

It can take Google up to a month to respond to reconsideration requests, particularly for link-related issues. Google doesn’t send warnings first because it wants to be able to take immediate action when it finds problematic content.


Reconsideration Requests Are Reviewed in Batches & Grouped by Issue Type & Country

January 7, 2020 Source

The team reviewing reconsideration requests works through them in batches, and may group requests by issue type, country, and other factors. Once they have finished one batch, they move on to the next, and so on.


Submitting Another Reconsideration Request Won’t Affect Site’s Existing Place in Queue

November 12, 2019 Source

Submitting a second reconsideration request for a website while the original is still waiting to be reviewed won’t move your site up in the queue, nor will it move your site to the bottom of the queue. The best course of action is to wait for the original request to be reviewed, which can take time as queues often form.


Use Testing Tools to Identify if Your Site Has Hacked Content That is Being Cloaked

October 18, 2019 Source

If you suspect your site contains hacked content that is being cloaked (shown to search engines but hidden from regular visitors), John recommends using testing tools, including the URL Inspection tool in GSC, to identify whether Google is finding this content.
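As a quick first check outside of GSC, you can compare what a page serves to a browser-like User-Agent against a Googlebot-like one. The sketch below (using a hypothetical URL) does this in Python; note that it cannot catch cloaking keyed to Googlebot’s IP addresses, which is why the URL Inspection tool remains the authoritative view of what Google actually sees.

import difflib
import urllib.request

URL = "https://www.example.com/"  # hypothetical page to test

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch(url: str, user_agent: str) -> str:
    # Request the page with a specific User-Agent header.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

browser_html = fetch(URL, USER_AGENTS["browser"])
googlebot_html = fetch(URL, USER_AGENTS["googlebot"])

# Injected spam often shows up in the diff as added links or hidden text.
for line in difflib.unified_diff(
    browser_html.splitlines(),
    googlebot_html.splitlines(),
    fromfile="browser",
    tofile="googlebot",
    lineterm="",
):
    print(line)

If the two responses differ substantially, inspect the Googlebot version for injected links or keywords.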


Auto-generated Content is Against Webmaster Guidelines

September 6, 2019 Source

Using auto-generated content, such as “spun” content, to create text-based pages is against Google’s Webmaster Guidelines. This is particularly true if the content has no value for users or is similar to content already available elsewhere on the web.


The Request Review Option in GSC Is The Best Way to Inform Google That Content is Legitimate

July 12, 2019 Source

If Google Search Console is flagging that content appears to be hacked, the request review approach is the best way to inform Google that the content is legitimate.


Rich Snippets Spam Report Tells Google About Manipulated Structured Data

May 31, 2019 Source

Use the rich snippets spam report form to inform Google about instances of manipulated structured data.
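To make that concrete, manipulated structured data is markup that claims things the page does not actually show. A hypothetical illustration using schema.org review markup, which would count as spam if the page displays no reviews and the figures are fabricated:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "5.0",
    "reviewCount": "12847"
  }
}
</script>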


Related Topics

Copyright/DMCA Issues, Thin Content, Duplicate Content, Embedded Content, Images, User Generated Content, Hidden Content, Interstitial Pop-ups, Expired Content, Keyword Optimization, Header and Subhead Tags, Page Structure, Videos, Social Media