Technical issues don’t tend to trigger core update drops
One user was concerned about running into new technical issues just before a Google core algorithm update was launched. If you’re seeing a negative impact after a core update, it’s usually based on data collected over a longer period of time, not a reflection of technical issues present at the moment the update rolls out. It’s also worth remembering that technical issues generally don’t fall into the same category as the quality issues core updates focus on. Running into new technical issues just as a core update is released doesn’t mean that your site will definitely be negatively impacted.
Core Web Vitals metrics are weighted toward the pages with the most traffic on your site
Google’s Core Web Vitals (CWV) metrics are typically assessed from a sample of traffic across the whole website. As a result, the pages that get the most visits contribute the most to the overall CWV score. Pages that perform poorly for Core Web Vitals but don’t bring in much traffic are less of a concern. In the same vein, pages with little traffic and great CWV metrics aren’t likely to pull up the site-wide score. The exception is if Google has enough data to segment a certain part of the site and treat it separately. For example, a super fast blog with lots of visits may end up being looked at on its own, away from the rest of the content on the site.
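The exact aggregation Google uses isn’t public, but the effect described above can be sketched as a simple traffic-weighted average. The function name, metric values, and URLs below are all hypothetical, purely to illustrate why a slow, low-traffic page barely moves a site-wide figure:

```python
# Hypothetical sketch: traffic-weighted aggregation of a CWV metric (LCP).
# This is NOT Google's actual formula -- just an illustration of weighting.

def weighted_lcp(pages):
    """Aggregate per-page LCP (seconds), weighted by page views."""
    total_views = sum(p["views"] for p in pages)
    return sum(p["lcp"] * p["views"] for p in pages) / total_views

pages = [
    {"url": "/home",        "views": 90_000, "lcp": 1.8},  # most traffic, fast
    {"url": "/old-archive", "views": 1_000,  "lcp": 6.0},  # slow, little traffic
]

# The slow archive page barely shifts the aggregate away from 1.8s:
print(round(weighted_lcp(pages), 2))
```

Because the fast homepage dominates the view count, the site-wide figure stays close to its LCP despite the slow archive page.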
The desktop Page Experience roll-out is unlikely to cause immediate or significant ranking changes
As page experience rolls out as a ranking factor on desktop, John is keen to clarify that the weight it holds will largely mirror what’s seen on mobile. If it’s clear that a page is the best result for a given query, page experience signals could be downplayed. If there are multiple pages in SERPs that could answer the user’s query and intent equally well, page experience is one of the factors that could be used to distinguish between them and rank one site above another.
There’s generally no SEO benefit to repurposing an old or expired domain
When asked about using old, parked domains for new sites, John clarifies that users will still need to put the work in to get the site re-established. If the domain has been out of action for some time and comes back into focus with different content, there generally won’t be any SEO benefit to gain. In the same vein, it typically doesn’t make sense to buy expired domains if you’re only doing so in the hopes of a visibility boost. The amount of work needed to establish the site would be similar to using an entirely new domain.
It can take months for Google to reassess site quality
Google essentially has no memory when it comes to technical issues and there should be no lasting impact once a cause has been resolved. However, it can take Google weeks or even months to determine the quality of a site and establish how it fits into the wider context of the web. Therefore, improvements to site quality can take a lot longer to make a significant impact.
Backlink spam issues are unrelated to core algorithm updates
A site owner mentioned that they saw visibility drops after a core algorithm update. Around that time, they were working to solve technical issues like 404 pages in their sitemap, but they also suspected the drop might be related to spammy backlinks pointing to their site. John replied that if you’re seeing changes after core updates, backlink spam issues are likely unrelated to the update. Core updates are more about understanding your site’s overall quality and relevance, and less about backlink spam or specific technical issues. He emphasized that overall quality and relevance are likely the more important aspects to focus on after core updates occur.
SEO improvements based on CWV metrics take about a month to show results
Google’s Core Web Vitals are based on field data aggregated over a trailing window of around 28 days. This means that any significant page speed improvements you make on your website will typically take about a month to be fully reflected in the search results.
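A minimal sketch of the timing implication, assuming a trailing 28-day window as described above (the function name and dates are illustrative, not part of any Google API):

```python
from datetime import date, timedelta

# Sketch: with a trailing 28-day data window, a fix only fully dominates
# the reported metrics once the entire window post-dates the fix.

def fully_reflected_on(fix_date: date, window_days: int = 28) -> date:
    """Earliest date when the whole trailing window falls after the fix."""
    return fix_date + timedelta(days=window_days)

print(fully_reflected_on(date(2022, 3, 1)))  # -> 2022-03-29
```

Until that date, the reported metrics are a blend of pre-fix and post-fix measurements, which is why improvements appear gradually rather than overnight.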
Impressions from bots and scrapers can sometimes still make their way into GSC
Google filters and blocks bots at different stages of its search systems, and bots filtered out at a later stage can still be recorded in Google Search Console (GSC). As a result, it’s sometimes possible to see bot and scraper activity appearing as impressions in your GSC reports. If you spot this, you can flag it via the ‘feedback’ function.
Core Web Vitals are weighted equally across all industries and website types
Neither the type of website nor industry vertical will alter how much weight is given to Google’s Core Web Vitals metrics.