Since January 15, Google has been enforcing a new requirement designed to limit scraping of its SERPs: JavaScript must now be enabled to access its search engine. While this decision was presented as an improvement to services for users, its consequences go far beyond that, directly affecting SEO tools and LLMs such as those from OpenAI.
Key takeaways:
- Google now requires JavaScript to be enabled to access its search engine, which affects SEO tools and LLMs.
- This measure aims to strengthen protection against scraping, spam, and abuse.
- SEO tools will need to adapt, notably by using headless browsers, but this could increase costs and reduce speed.
- The French SEO community quickly found solutions thanks to mutual support.

Google's declared goals
According to a Google spokesperson interviewed by TechCrunch, the new requirement to enable JavaScript to access its search engine is motivated by:
- Protect its services against abuse, such as heavy scraping and spam.
- Provide more personalized and up-to-date search results.
JavaScript indeed makes it possible to limit abusive requests through techniques like rate-limiting, and it strengthens control mechanisms against bots. However, the measure still affects a notable volume of users: around 8.5 million searches per day are currently performed without JavaScript, according to that spokesperson!
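For readers unfamiliar with the term, here is a deliberately simplified token-bucket sketch in Python showing what rate-limiting means in practice. It is purely illustrative and obviously not Google's actual mechanism: the idea is simply that each client gets a budget of requests that refills over time, and anything beyond that budget is rejected.

```python
# Illustrative only: a simple token-bucket rate limiter of the kind a server
# can apply per client. This is NOT Google's implementation.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # request exceeds the allowed rate

# One bucket per client: at most 5 requests per second, bursts of up to 10.
bucket = TokenBucket(rate=5, capacity=10)
print(all(bucket.allow() for _ in range(10)))  # burst of 10 is allowed
print(bucket.allow())                          # the 11th immediate request is rejected
```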
SEO tools in turmoil
SEO tools, many of which rely on scraping results pages to collect essential data, were directly impacted by this change from Google, which blocked them partially or entirely. We had already reported last week on this "SEO tools outage," when the impact began to be felt, following Vincent Terrasi's post.
Adrian Delmarre, founder of the tool Haloscan, told us: "The Thursday/Friday changes ultimately affected us fairly moderately. We lost about 50% of our scraping capacity on Thursday (I should note that this in no way impaired Haloscan's operation, which remained 100% available and usable at all times). We were back at full capacity by the end of the day on Friday. No extra cost, notably thanks to a tip shared by Paul Sanches."
Many thanks to @Seoblackout who, with a little trick, allowed us to regain optimal scraping speed without any slowdown! Many thanks!
— HaloScan (@haloscancom) January 17, 2025
Hats off to the French SEO community for its responsiveness and sharing, and to Paul for finding the ideal solution that helped many tools.
"Google changed something. It happens often, but usually we handle it without anyone noticing," explains Adrian, while emphasizing that there is no need to be alarmist. "This is not the end of SEO. There will be other changes. We'll adapt. That's the principle of SEO, right?"
Although Google announces that executing JavaScript is mandatory to access its SERPs, according to Adrian it is not "yet" necessary to enable JavaScript, and therefore emulate a browser, for every request. That may well change soon.
A measure primarily directed at LLMs?
For several experts, this Google decision appears to primarily target language models like those developed by OpenAI or Perplexity. In a webinar organized on January 21 by Olivier Duffez, Fabien Faceries (MyRankingMetrics) and Vincent Terrasi (Draft'n'Goal), the experts explained that LLMs retrieve the best content from Google and reuse it to answer users' queries. This creates a twofold problem: on one hand, Google loses control over this data and doesn't directly benefit from its use; on the other, these tools make some users less dependent on Google.

This interpretation is shared by Adrian Delmarre: "Perhaps Google is not targeting SEO tools but rather its competitors, the LLMs. If that's the case, it's good news for us, because these models adapt quickly and show Google that blocking them is pointless."
Consequences and solutions for professionals
These Google changes affect not only scrapers but also semantic optimization tools, rank-tracking tools, AI writing tools and cannibalization analysis tools.
The solutions being considered to adapt to this new reality nonetheless have limits:
- Use of JavaScript. This approach requires full page rendering, which significantly increases technical costs for the tools (a minimal sketch of what this involves follows this list).
- Tightening of constraints by Google. The potential addition of traps or new anti-scraping rules could make these solutions even more complex and increasingly difficult to bypass.
- Reduced execution speed. Vincent Terrasi notes that, as things stand, tools should not increase their costs in the short term. However, their real-time speed could be compromised, although this is not necessarily critical for SEO, where immediacy is not essential.
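To give an idea of what "full page rendering" involves, here is a minimal sketch using a headless browser driven by Playwright in Python. This is our own illustrative assumption, not the approach used by Haloscan or the other tools mentioned: the point is simply that the browser must execute JavaScript before the HTML can be read, which makes each request far heavier than a plain HTTP call.

```python
# Minimal, illustrative sketch only: render a results page with a headless
# browser so that JavaScript runs before the HTML is collected. Real tools
# add proxies, consent handling, CAPTCHA management, etc., which this omits.
# Requires: pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

def fetch_rendered_serp(query: str) -> str:
    """Return the HTML of a results page after JavaScript execution."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(f"https://www.google.com/search?q={query}",
                  wait_until="networkidle")
        html = page.content()
        browser.close()
        return html

if __name__ == "__main__":
    print(len(fetch_rendered_serp("javascript seo")))
```

That rendering step (launching a browser, executing scripts, waiting for the page to settle) is exactly what drives up CPU, memory and proxy costs compared with a simple HTTP request.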
A united SEO community
The SEO community's response to this upheaval has highlighted a remarkable capacity for adaptation and collaboration (despite the SEO dramas!). The trick shared by Paul Sanches, which allowed several tools to quickly bypass the JavaScript constraint, is a striking example of that solidarity, and that warms the heart.
Thanks also to Olivier Duffez and Fabien Faceries for having organized a webinar so quickly to review the situation, the context of the outage and the solutions found by SEO tools!
Vincent Terrasi urges SEOs to develop plans B, C, and so on, so they can cross-reference data without relying on a single tool and avoid being left without metrics if a service goes down. Lose a single one and everything feels deserted!
The article "JavaScript required: how Google is complicating the work of SEO tools and LLMs" was published on the site Abondance.

