A concerning new type of negative SEO attack has emerged, with website owners reporting what appears to be deliberate Core Web Vitals performance manipulation. The attack, which involves injecting intentional render delays, has caught the attention of Google’s John Mueller and Chrome’s Barry Pollard, who helped investigate this unusual cyber threat.
The Discovery: A Novel Performance Attack
A website owner recently brought attention to this issue on Bluesky, tagging Google’s John Mueller and Rick Viscomi, a Developer Relations Engineer at Google. The report detailed suspicious activity affecting multiple websites across different countries.
The victim described the attack:
“We’re seeing a weird type of negative SEO attack that looks like core web vitals performance poisoning, seeing it on multiple sites where it seems like an intentional render delay is being injected… Seeing across multiple sites & source countries”
What makes this attack particularly sophisticated is that the traffic patterns originate from multiple countries, targeting the same pages while often forging referrer data. This suggests a coordinated effort rather than random bot activity.
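One way a site owner might surface this kind of coordination in their own logs is to count how many distinct source countries share the exact same claimed referrer for a target page; organic referrals rarely arrive from many countries with an identical referrer string. The sketch below assumes hypothetical log fields and an illustrative threshold:

```javascript
// Hedged sketch: flag referrers that appear from an implausible number of
// distinct source countries. Log fields and threshold are hypothetical.

function suspiciousReferrers(logLines, countryThreshold = 3) {
  const countriesByReferrer = new Map();
  for (const { referrer, country } of logLines) {
    if (!countriesByReferrer.has(referrer)) {
      countriesByReferrer.set(referrer, new Set());
    }
    countriesByReferrer.get(referrer).add(country);
  }
  // Keep only referrers seen from at least `countryThreshold` countries.
  return [...countriesByReferrer.entries()]
    .filter(([, countries]) => countries.size >= countryThreshold)
    .map(([referrer]) => referrer);
}

const log = [
  { referrer: 'https://t.co/abc', country: 'US' },
  { referrer: 'https://t.co/abc', country: 'VN' },
  { referrer: 'https://t.co/abc', country: 'BR' },
  { referrer: 'https://news.example/story', country: 'US' },
];

console.log(suspiciousReferrers(log)); // → [ 'https://t.co/abc' ]
```

In practice this would run over real access logs with GeoIP-resolved countries; the point is the grouping logic, not the sample data.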
Understanding the Technical Details
The performance degradation was detected using web-vitals.js, a JavaScript library created by Google's Chrome team. The library lets website owners and SEO professionals measure Core Web Vitals metrics directly on their own pages, producing data consistent with PageSpeed Insights and Search Console.
The critical aspect here is that these degraded scores come from actual server requests rather than cached content. When web-vitals.js records poor performance metrics, it’s measuring real-time page loading experiences on the origin server.
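This distinction matters because field tools like CrUX and PageSpeed Insights summarize Core Web Vitals at the 75th percentile of all page loads, so a handful of artificially slow bot loads does not automatically move the reported score. A minimal sketch of that aggregation, using hypothetical LCP samples:

```javascript
// Hedged sketch: summarizing field LCP samples at the 75th percentile,
// the way CrUX-style tooling reports Core Web Vitals. Values are made up.

// Nearest-rank percentile of an array of metric samples (in ms).
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}

// Mostly fast cached loads, plus two artificially slow origin hits
// injected by an attacker.
const lcpSamples = [1200, 1300, 1250, 1400, 1350, 1280, 9800, 10200];

const p75 = percentile(lcpSamples, 75);
console.log(`p75 LCP: ${p75} ms`); // → "p75 LCP: 1400 ms"
```

Note how the two poisoned samples (9800 ms and 10200 ms) leave the 75th percentile at 1400 ms: outliers must dominate the distribution before the headline figure shifts.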
The Cache-Bypass Connection
The situation becomes more complex when considering the concurrent denial-of-service (DoS) attack mentioned by the website owner. They explained:
“Hard to get a clear picture because on top of the LCP issue the site is being hit with some kind of cache-bypass DOS attack that jacked up TTFB & has had the hosting maxxed out…”
Cache-bypass DoS attacks deliberately circumvent content delivery networks (CDNs) and local caches. Instead of serving fast, cached versions of pages, these attacks force servers to generate fresh content for every request, overwhelming server resources and degrading performance.
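A common cache-bypass trick is appending random query parameters (for example `?cb=81723`) so that every request produces a unique URL and misses the CDN cache. One standard mitigation is normalizing the cache key to a whitelist of parameters that genuinely vary the content, collapsing the attacker's variants onto a single cached entry. A minimal sketch, with hypothetical parameter names:

```javascript
// Hedged sketch: normalize a request URL into a cache key, keeping only
// whitelisted query parameters. Parameter names are illustrative.

const ALLOWED_PARAMS = new Set(['page', 'lang']); // params that vary content

function cacheKeyFor(url) {
  const u = new URL(url);
  const kept = [...u.searchParams.entries()]
    .filter(([name]) => ALLOWED_PARAMS.has(name))
    .sort(([a], [b]) => a.localeCompare(b)); // stable ordering for the key
  const query = kept.map(([k, v]) => `${k}=${v}`).join('&');
  return u.origin + u.pathname + (query ? `?${query}` : '');
}

// Two attack requests with random cache-busters map to the same key:
console.log(cacheKeyFor('https://example.com/post?page=2&cb=81723'));
console.log(cacheKeyFor('https://example.com/post?cb=99911&page=2'));
// → both print "https://example.com/post?page=2"
```

Most CDNs expose this idea natively as cache-key or query-string configuration; the code above only illustrates the normalization step.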
Google’s Official Response
John Mueller from Google provided reassurance about the potential impact on search rankings. His response was measured but optimistic:
“I can’t imagine that this would cause issues, but maybe @tunetheweb.com has seen things like this or would be keen on taking a look.”
Mueller tagged Barry Pollard, Chrome’s Web Performance Developer Advocate, to investigate further. This collaborative approach demonstrates Google’s commitment to understanding and addressing novel attack vectors.
Chrome Team Investigation
Barry Pollard questioned whether this might be a bug in the web-vitals library itself. More importantly, he asked if the performance degradation appeared in Chrome User Experience Report (CrUX) data.
The answer was revealing: the degraded Core Web Vitals scores were not reflected in CrUX data. This discrepancy provides crucial insight into the attack’s actual impact.
Why CrUX Data Remains Unaffected
The absence of performance issues in CrUX data suggests that real users aren’t experiencing the degraded performance. Here’s why:
CrUX data comes from actual Chrome users who have opted into sharing their browsing experience data. If these users are accessing cached versions of pages through CDNs, they won’t experience the server-side performance issues caused by the attack.
The attack targets origin servers while legitimate users benefit from cache layers that protect them from the performance degradation.
Impact on Search Engine Rankings
The concern about Core Web Vitals poisoning affecting search rankings appears to be largely unfounded. Several factors support this conclusion:
Limited Ranking Influence
Website performance, while important for user experience, functions as a relatively weak ranking factor. Google prioritizes content relevance and quality over technical performance metrics when determining search positions.
User Experience Protection
Since real users accessing websites through CDNs and caches don’t experience the performance issues, the actual user experience remains intact. Google’s ranking algorithms focus on genuine user experiences rather than artificial server-side measurements.
Data Source Reliability
Google’s ranking algorithms likely rely more heavily on CrUX data and similar real-world metrics rather than individual site measurements that could be manipulated or affected by attacks.
Protecting Against Performance Attacks
Website owners concerned about similar attacks should consider these defensive strategies:
Implement robust CDN protection to ensure legitimate users access cached content rather than hitting origin servers directly.
Monitor multiple data sources, including both server-side metrics and external tools such as Google Search Console and PageSpeed Insights.
Set up attack detection systems that can identify unusual traffic patterns and automatically implement protective measures.
Maintain baseline performance measurements to quickly identify when metrics deviate from normal patterns.
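The baseline idea in the last point can be sketched as a rolling-average monitor that flags measurements deviating beyond a multiplier. The window size, threshold, and TTFB values below are illustrative choices, not recommendations:

```javascript
// Hedged sketch: keep a rolling baseline of a metric (e.g. TTFB in ms)
// and flag measurements beyond a multiplier of that baseline.

function makeBaselineMonitor(windowSize = 5, threshold = 2.0) {
  const window = [];
  return function check(value) {
    // Baseline = mean of recent samples; first sample is its own baseline.
    const baseline = window.length
      ? window.reduce((a, b) => a + b, 0) / window.length
      : value;
    const anomalous = value > baseline * threshold;
    window.push(value);
    if (window.length > windowSize) window.shift();
    return { baseline, anomalous };
  };
}

const check = makeBaselineMonitor();
[180, 190, 175, 185, 920].forEach((ttfb) => {
  const { baseline, anomalous } = check(ttfb);
  console.log(`TTFB ${ttfb} ms (baseline ${baseline.toFixed(0)} ms) anomalous: ${anomalous}`);
});
// → only the final 920 ms sample is flagged as anomalous
```

A real deployment would feed this from continuous monitoring and page the on-call team or tighten rate limits when the flag trips; the sketch only shows the detection logic.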
The Broader Security Implications
This incident highlights the evolving landscape of cyber attacks targeting SEO and website performance. While this particular attack appears to have minimal impact on actual search rankings, it represents a new category of threats that website owners should understand.
The collaborative response from Google's team also shows a willingness to dig into novel attack vectors, even when they are unlikely to affect search results directly.
Key Takeaways
The Core Web Vitals “poisoning” attack, while concerning, appears to have limited real-world impact on search rankings. The discrepancy between server-side measurements and actual user experience data (CrUX) suggests that proper caching and CDN implementation can protect both users and search performance.
Website owners should focus on maintaining robust defensive measures while understanding that Google’s ranking algorithms are designed to reflect genuine user experiences rather than manipulated server metrics. The incident serves as a reminder of the importance of comprehensive monitoring and the value of Google’s multi-layered approach to performance measurement.