Since 2024, the "num=100" parameter, which allowed displaying 100 results per page on Google, has no longer worked reliably. Between random tests, increased technical requirements and new challenges for SEO tools, this removal hits search professionals hard. Analysis of the causes, concrete consequences and field feedback about this major change.
Key takeaways:
- Google now tests and limits displaying 100 results per page via “num=100”, which is no longer guaranteed or stable.
- This change strongly impacts SEO tools, forcing a review of technical infrastructure and ranking-tracking strategies.
- Pagination and the requirement for JavaScript increase the costs and complexity of retrieving data, especially for scraping.
- Reliably accessing and quickly reaching the TOP 100 of the SERP is becoming uncertain for industry professionals.
The end of the “num=100” parameter: Google changes the game for SERP display
Since late 2024, a change has been disrupting the habits of SEOs and other web professionals: the "num=100" parameter, which forced Google to display 100 results on a single page, has lost its reliability. The change, first noticed on forums and relayed by experts such as Barry Schwartz (Search Engine Roundtable), is causing numerous malfunctions for advanced users and SEO tools.
Behavior that has become erratic and unstable
Since autumn 2024, several signals have shown that the "&num=100" option is no longer systematically honored by Google. When the parameter is added to a search URL, it now works only about half the time, and its behavior varies depending on whether the user is signed in to a Google account. For some, it works only when logged out; for others, it simply has no effect in any context. This giant "A/B test" was observed on X (formerly Twitter) and confirmed by many SEOs, whose rank-tracking tools now encounter inconsistencies and random captchas.
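As an illustration, here is a minimal Python sketch of how the parameter has typically been appended to a search URL. Only the URL structure is shown; as described above, whether Google still honors `num` is no longer guaranteed.

```python
from urllib.parse import urlencode

def build_search_url(query: str, num: int = 100) -> str:
    """Build a Google search URL asking for `num` results per page.

    Note: since the change described above, Google may silently ignore
    the `num` parameter and fall back to the default 10 results.
    """
    params = {"q": query, "num": num}
    return "https://www.google.com/search?" + urlencode(params)

print(build_search_url("seo tools"))
# https://www.google.com/search?q=seo+tools&num=100
```

The parameter still produces a syntactically valid URL; the uncertainty lies entirely in how Google's servers respond to it.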
Major technical and economic consequences for SEO
As Fabien Barry of Monitorank recalled in a LinkedIn post, the removal of the "num=100" parameter forced rank-tracking tools to adapt very quickly. Already, when continuous scrolling was introduced in the SERPs in November 2023, some had to switch to headless browsers to simulate user behavior and execute the JavaScript that generates the page. That solution caused infrastructure costs to explode (up to 5x) for industry players, notably because of heavy RAM and CPU usage.
At the end of June 2024, the return of classic pagination did not restore native support for "num=100", while making the loading of subsequent pages even heavier (a full DOM for each page, no more "light" AJAX reloads) (Fabien Barry, LinkedIn).
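To make the shift concrete, here is a hedged sketch of what covering the TOP 100 now implies: walking classic pagination with Google's long-standing `start` offset parameter, one full page load per request, instead of a single `num=100` call.

```python
from urllib.parse import urlencode

def paginated_urls(query: str, depth: int = 100, per_page: int = 10):
    """Yield the paginated search URLs needed to cover `depth` results,
    stepping through Google's `start` offset parameter."""
    for start in range(0, depth, per_page):
        yield "https://www.google.com/search?" + urlencode({"q": query, "start": start})

urls = list(paginated_urls("seo tools"))
print(len(urls))  # 10 full-page requests where one num=100 call used to suffice
```

Each of those ten requests triggers a full DOM download, and a proxy blocked after only a couple of pages leaves the rest of the TOP 100 out of reach, which is exactly the failure mode described above.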
The consequence? Each newly requested page requires re-downloading most of the content, multiplying request time and server resources and raising costs to 7x compared with the days when "num=100" was supported. Optimization attempts (disabling JavaScript, selectively retrieving result blocks) improve efficiency but do not offset the unnecessary volume downloaded, especially when a proxy gets blocked as early as the third page, partially cutting off access to the full TOP 100.
Data reliability, a new challenge for SEO tools
The loss of "num=100" forces a new philosophy on SEO tools: prioritizing the reliability and representativeness of the extracted information. Fabien Barry explains that he chose to limit tracking to the first five pages in order to provide higher-quality data, despite a 20% price increase. For some, this situation marks the end of easy TOP 100 collection and leads professionals to ask themselves: is a TOP 100 still relevant or useful in current practice?
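The trade-off behind that choice can be quantified in a tiny sketch (the per-page figure assumes Google's classic 10-result pages):

```python
RESULTS_PER_PAGE = 10  # classic paginated SERP
PAGES_TRACKED = 5      # the limit chosen in the example above

# Deepest ranking position a five-page crawl can observe.
deepest_rank = PAGES_TRACKED * RESULTS_PER_PAGE
print(deepest_rank)  # 50: half the former TOP 100, collected more reliably
```

In other words, capping the crawl at five pages trades depth (rank 51-100 becomes invisible) for consistency of the data that is still collected.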
Since January 2025, Google has gone further by requiring JavaScript to be enabled on its SERPs, further proof that SEO bots are in the crosshairs of the Mountain View company. This change is therefore not just a UX test: it reflects a clear intent to complicate the mass collection of information by third-party tools, while protecting Google's infrastructure and data.
The article "Google drops the num=100 parameter: a change that disrupts the SEO ecosystem" was published on the site Abondance.