
SEO Goossips: AI, External Signals, Core Web Vitals, Crawl, GEO

Some unofficial bits of information about Google (and sometimes Bing) and its search engine picked up here and there in recent days, with a few answers to these questions on the agenda this week: What is the impact of AI-generated content and images on SEO? How should professionals approach AI-driven searches? Why does Google exclude certain external signals? What are the consequences of negative SEO on Core Web Vitals? What could explain a sudden drop in crawl? What is John Mueller's opinion on the new acronyms GEO, AIO and AEO?

Goossip #1

AI-generated content should be proofread by a human

Google, speaking through Gary Illyes, clarified that AI-generated content is accepted by the search engine as long as it meets strict criteria of quality, originality and factual accuracy. Illyes insists that the term "created by a human" is not the most appropriate; he prefers to talk about "human curation", emphasizing that all content produced by AI should be proofread and verified by a human before publication. The goal is to avoid bias, factual errors and the overproduction of very similar content. Human intervention is not meant to be explicitly flagged on the page, but must ensure beforehand that the content is reliable and relevant for users.

Source: Search Engine Journal

Reliability rate: ⭐⭐⭐ We agree!

It's obvious! If LLMs can work wonders, they can also produce content riddled with errors, approximations and hallucinations. In this context, a human review is the least we can demand.

Goossip #2

AI-generated images do not affect ranking

Gary Illyes also specified that the use of images generated by artificial intelligence does not entail any penalty or direct negative impact on organic rankings. According to him, integrating this type of image on a page, as long as the rest of the content is legitimate, does not affect the site's position in search results. He nonetheless reminds us that adding images can use more server resources, and that, conversely, these images could generate traffic via Google Images or video search. He did not delve into the question of authenticity, but stresses that the priority should remain user experience and relevance for internet users.

Source: Search Engine Journal

Reliability rate: ⭐⭐⭐ We agree!

An image, whether AI-generated or not, has the power to attract or repel an internet user. As with content, the rule remains quality and relevance.

Goossip #3

AI searches: SEOs urged to study the transformation of clicks into conversions

Fabrice Canel, from the Bing team at Microsoft, points out that search engines have limited visibility on what happens after a click from an AI search: they don't know whether the visitor converts. According to him, although improving the quality of results leads to more relevant clicks, it is up to the SEO community to study the full journey, from click generation to conversion. He calls for more research on this topic, because the search engine's classic indicators (return to the results page, stability of clicks…) are not enough to measure the real effectiveness of AI clicks; only on-site conversion analysis can judge that. Thus, Bing and Google are now encouraging marketers and SEO experts to better track and analyze conversions generated by AI in order to adapt their strategies.

Source: Search Engine Roundtable

Reliability rate: ⭐⭐⭐ We agree!

AI search is a new discipline. For now, it remains difficult to measure its impact on conversions, particularly using traditional indicators.

Goossip #4

Google excludes certain external signals it cannot control

Gary Illyes explains that the search engine excludes external signals such as shares or views on social networks from its ranking criteria because it has no control over their reliability or validity. According to him, Google must be able to control its own signals; relying on those coming from third-party platforms exposes it to manipulation (for example, artificially inflating the number of shares) and to unpredictable fluctuations. This philosophy also extends to other signals that can be easily manipulated by SEOs or publishers: Google does not consider them ranking factors, preferring to rely on criteria it can measure and control directly to ensure the relevance and integrity of results.

Source: Search Engine Journal

Reliability rate: ⭐⭐⭐ We agree!

One can understand Google's approach: since they are easily manipulated, external signals are not necessarily the most reliable elements.

Goossip #5

"Core Web Vitals poisoning" has no impact on ranking

The attack called "Core Web Vitals poisoning" aims to artificially sabotage a site's web performance metrics (LCP, FID, CLS) in order to distort scores measured by tools like webvitals-js. John Mueller believes that such sabotage has no direct impact on ranking, because Google bases its analysis on real user data (CrUX) and not on internal metrics or automated tests. Even if a DoS-type attack slows the server and degrades local scores, as long as the real experience of users is not affected, there are no SEO consequences. Experts still recommend strengthening security and monitoring for reputation and reliability reasons.

Source: Search Engine Journal

Reliability rate: ⭐⭐ We have some doubts...

Let's recall that Google has repeatedly recommended not to over-focus on Core Web Vitals. While they remain important, they should not become an obsession at the expense of other SEO aspects.

Goossip #6

A sudden drop in crawl is often an indication of a server-side problem

John Mueller explained that a sudden drop in Googlebot crawling is rarely caused by 404 errors; he believes it is much more often a sign of server-side problems (429, 500, 503 errors) or timeouts. He recommends checking server logs and Search Console to identify any spikes in errors or potential blockages, notably at the CDN level or in security rules. Once issues are resolved, crawling returns to normal, but recovery can take several days or even weeks, with no guaranteed timeframe. 404 errors are handled normally by Googlebot, which retries later, while 429/500/503 responses prompt the bot to reduce its crawl frequency to protect the server.

Source: Search Engine Journal

Reliability rate: ⭐⭐⭐ We agree!

It's always useful to check logs and Search Console to identify the cause, taking into account the role of CDNs and other security measures. Once the incident is resolved, Googlebot gradually resumes its crawling activity with no precise timing, while 404 errors are simply retried in the standard way, and 429/500/503 errors trigger a temporary reduction in crawling to preserve site stability.
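As a practical illustration of the log check described above, here is a minimal Python sketch that counts the HTTP status codes served to Googlebot in an access log (combined log format). The log lines, regex, and "Googlebot" substring match are illustrative assumptions, not a production-grade bot verification (that would require reverse-DNS checks); the goal is simply to surface a spike in the 429/500/503 responses that cause Googlebot to slow down.

```python
import re
from collections import Counter

# Matches the request part and the status code in a combined-log-format
# line, e.g.: "GET /page HTTP/1.1" 503
LINE_RE = re.compile(r'"\S+ \S+ \S+" (\d{3}) ')

def googlebot_status_counts(lines):
    """Count HTTP status codes on log lines whose user agent mentions Googlebot.

    Note: a substring match is a rough filter for illustration only;
    real bot verification should use reverse DNS.
    """
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Hypothetical sample log lines (paths and IPs are made up).
sample = [
    '66.249.66.1 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Jan/2025:00:00:02 +0000] "GET /a HTTP/1.1" 503 0 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/Jan/2025:00:00:03 +0000] "GET /b HTTP/1.1" 500 0 "-" "Mozilla/5.0"',
]

counts = googlebot_status_counts(sample)
# Share of Googlebot hits that hit the "slow down" codes (429/500/503).
error_share = sum(v for k, v in counts.items() if k in ("429", "500", "503")) / sum(counts.values())
print(counts, f"error share: {error_share:.0%}")
```

A sustained rise in that error share, compared against the crawl-stats report in Search Console, is the kind of signal Mueller suggests investigating before blaming 404s.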

Goossip #7

GEO, AIO, AEO… Beware of scams!

John Mueller of Google warns against the proliferation of new AI-related SEO acronyms such as GEO, AIO or AEO, noting that the more insistent and urgent their promotion, the more likely it is to hide spammy or scammy practices. He considers this trend essentially marketing and irrelevant to search ranking. Mueller encourages SEO professionals not to give in to the pressure of these trends and to focus on solid, proven strategies centered on content quality and user experience, rather than following opportunistic terminology that often only serves to sow confusion or exploit the industry's gullibility.

Source: Search Engine Roundtable

Reliability rate: ⭐⭐⭐ We agree!

As the industry evolves, it's perfectly natural to see new acronyms appear to describe this new way of approaching SEO in the AI era. However, this should not leave us mystified: while it is essential and even vital to pay attention to ongoing changes, the fundamentals of organic SEO remain more relevant than ever.

The article “SEO Goossips: AI, External Signals, Core Web Vitals, Crawl, GEO” was published on the site Abondance.