“Disavow files” for poor quality backlinks aren’t always necessary
Unless there’s a clear reason to use a disavow file (e.g. a manual action that can be resolved by disavowing specific domains), it’s now generally fine to remove your disavow file altogether. John Mueller says in the clip below that Google understands that poor-quality links are a natural part of the web ecosystem and that most sites will accumulate some spammy backlinks over time. He confirms that Google is now better than ever at recognizing and ignoring poor-quality backlinks. If you do delete an existing disavow file, though, it’s recommended to keep a copy of the document in case you need to refer back to it later.
Spammy backlinks to 404 pages are ignored by default
When asked how to deal with thousands of spammy backlinks, John was keen to reassure users that low-quality sites linking to 404 pages won’t impact your site negatively. Because the target is a 404, Google essentially reads that link as not connecting to anything and ignores it (for the same reason, it’s important to review links from valuable sources that point to a 404 page and redirect them so their value isn’t lost). If it’s just a handful of sites pointing spammy backlinks at non-404 pages, the recommendation is to set a domain-level disavow.
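For reference, a domain-level disavow is expressed in the plain-text file format Google's disavow tool accepts: one entry per line, with a `domain:` prefix to cover an entire site, and `#` for comments. The domain names below are placeholders, not examples from the discussion:

```text
# Disavow every link from these sites (domain-level):
domain:spammy-site.example
domain:another-spam-site.example

# Individual URLs can also be listed one per line:
https://spam.example/some-page.html
```

The file is uploaded per-property via the disavow links tool in Search Console; a domain-level entry covers all current and future links from that site, which is why it's the usual recommendation over listing thousands of individual URLs.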
Updating backlinks to a migrated domain helps with canonicalization
An attendee was talking about a website migration from domain A to domain B. They were setting up redirects, but asked whether the page authority and rankings would be negatively affected if there were many existing backlinks that point to domain A.
John replied that setting up redirects and using the Change of Address tool in Search Console will help Google understand the changes that have occurred during a site migration. However, he said that on a per-page basis they also try to look at canonicalization. When dealing with canonicalization on migrated domains, John said that redirects, internal links, and canonical tags play a role, but external links do too. If Google sees a lot of external links going to the old URL, it might index the old URL instead of the new one, because those linking signals suggest the change could be temporary. During site migrations, they recommend finding the larger websites linking to your previous domain and requesting that those backlinks be updated, to make sure everything aligns with the new domain.
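The per-page redirects John describes are typically set up as permanent (301) redirects that preserve the full path, so each old URL points at its exact counterpart on the new domain. As one illustration (not from the clip), a minimal nginx sketch with placeholder domain names might look like this:

```nginx
# Sketch: redirect every URL on old domain A to the same path on new
# domain B with a permanent (301) redirect. $request_uri keeps the full
# path and query string, so signals consolidate per-page rather than
# all funneling to the homepage. Domain names are placeholders.
server {
    listen 80;
    server_name old-domain-a.example www.old-domain-a.example;
    return 301 https://new-domain-b.example$request_uri;
}
```

A per-page redirect like this, combined with the Change of Address tool, gives Google a consistent canonicalization signal even while many external links still point at the old domain.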
Many sites don’t require a disavow file
One user asked about managing the size of their site’s disavow file. Only links that could make a user or Google think they’ve been paid for belong in a disavow file. That means that not every instance of a spammy or low-quality link to your site needs to be included. John suggested that a disavow file isn’t necessary for most websites (and having one could cause more problems than it solves).
Disavowing a redirected URL may be enough to prevent the passing of poor link signals
One participant asked for the best approach when dealing with a redirecting URL that has poor-quality backlinks (in this case, a page with 18k spammy links was redirecting to the site homepage). If the rest of the destination URL’s backlink profile is relatively healthy, it may be enough to disavow that redirecting URL alone. Disavowing all of the backlink spam would still be ideal, but in cases like this the outcome probably doesn’t warrant the additional time and effort.
Backlink spam issues are unrelated to core algorithm updates
A site owner mentioned that they saw visibility drops after a core algorithm update. Around that time, they were working to solve technical issues like 404 pages in their sitemap, but they also suspected the drop might be related to spammy backlinks pointing to their site. John replied that if you’re seeing changes after core updates, backlink spam issues are likely unrelated to the update. Core updates are more about understanding your site’s overall quality and relevance, and less about backlink spam or specific technical issues. He emphasized that overall quality and relevance are likely the more important aspects to focus on after core updates occur.
Google algorithms can still lose trust in sites displaying a strong pattern of manipulative links
Google tries to isolate and ignore spammy or toxic backlinks. However, there are still some cases where a site with a very strong pattern of unnatural links can lose trust and be penalized with a drop in visibility.
Unnatural links can hurt a site regardless of algorithm updates
If you believe your site is suffering from poor link-building schemes, John recommends focusing on removing these unnatural links, regardless of any algorithmic updates. This can be done in a number of ways, including using the disavow file or removing links from the source site.
Disallowed pages with backlinks can be indexed by Google
Pages blocked by robots.txt cannot be crawled by Googlebot. However, if a disallowed page has links pointing to it, Google can determine it is worth indexing despite not being able to crawl the page.
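To make the distinction concrete, here is a minimal robots.txt sketch (the path is a placeholder). A rule like this stops Googlebot from fetching the page’s content, but it does not stop the URL itself from being indexed if enough external links point at it:

```text
# robots.txt sketch: Disallow blocks crawling, not indexing.
# A blocked URL can still appear in search results (usually without a
# description) if external links point to it. Path is a placeholder.
User-agent: *
Disallow: /private/
```

This is also why a noindex directive on the page doesn’t help here: Google can only see a noindex tag by crawling the page, which the Disallow rule prevents. To reliably keep a page out of the index, it must be crawlable and carry noindex, rather than being blocked in robots.txt.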