
EU Investigates Google: Is AI to Blame for Publishers' Decline?

Copyright · Ethics & Society · Business


The EU opens a formal investigation into Google over the collapse of publisher traffic. But behind the dispute over spam lies a bigger game: who really controls the flow of online information and who profits from it.

The Spark That Ignites the Fire

On November 13, 2025, the European Commission opened formal proceedings against Google under the Digital Markets Act. The accusation is precise: changes to search ranking policies, introduced under the banner of fighting spam, allegedly caused a steep collapse in traffic to European publishers' sites, penalizing in particular those that host third-party commercial content.

Teresa Ribera, Executive Vice-President of the Commission, does not mince words: "We are concerned that Google's policies do not allow news publishers to be treated in a fair, reasonable and non-discriminatory manner in search results." The investigation focuses on two specific articles of the DMA, 6(5) and 6(12), which oblige so-called gatekeepers to ensure transparency and equal treatment for commercial users of their services.

Google's response arrived a few hours later, in a post on the company blog that flips the narrative entirely. Dan Taylor, Vice President of Google Search, frames the issue as a battle to protect users from manipulative practices: "parasite SEO," in which authoritative sites sell space under their URLs to third-party content that exploits their reputation to climb the search rankings.

The timing of the European announcement is not accidental. In September 2025, Google had already received a €2.95 billion fine for abuses in the advertising technology sector, in that case under EU antitrust rules. This new investigation, which could lead to penalties of up to 10% of the company's annual global turnover, fits into a context of growing tension between Big Tech and European regulators.

[Image from the European Union website]

Anatomy of a Downgrade

To understand what is really happening, we need to go back a few months. In March 2024, Google introduced its "site reputation abuse" policy, aimed at combating what the company calls a systematic pollution of search results. The practice it targets is as simple as it is lucrative: news outlets and other authoritative sites sold subdirectories or subdomains to commercial operators, who filled them with SEO-optimized content for financial products, online casinos, and sponsored reviews.

The symbolic case is Forbes, which hosted affiliate content for credit cards and current accounts in subdomains that had no editorial connection with the publication. The same goes for the Wall Street Journal, CNN, and dozens of other outlets that had turned portions of their URLs into veritable commercial enclaves. Google calls the practice "parasitic" because it exploits the authority of a domain built up over time for purely commercial ends, distorting competition with specialized sites that operate independently.

However, the initial policy had a limitation: it could only be applied manually, on a case-by-case basis. In November 2024 came an update that closed the gap, making the detection and demotion of this content automatic. The effects were immediate and drastic. According to figures reported by several sources, some European publishers recorded traffic drops of up to 34% within a few weeks.

The problem is that the algorithm does not always distinguish between parasitic exploitation and legitimate partnerships. A publication that publishes clearly labeled sponsored content, or that hosts affiliate sections consistent with its editorial line, can end up in the same bucket as pure spam. It's like using a sledgehammer where a scalpel is needed.

[Image from the Google blog]
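To get a feel for why automated enforcement is so blunt, consider a deliberately naive sketch in Python. It assumes nothing about Google's actual systems, which are vastly more sophisticated: it simply flags a site section whose vocabulary diverges sharply from the host's core content, one plausible signal among the many a real classifier would combine. All the example data is invented.

```python
# Toy illustration only -- NOT Google's algorithm. Flags a site section
# whose vocabulary diverges from the host site's core content.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bags of words."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# The host's core editorial vocabulary (hypothetical example data).
main_site = Counter(
    "financial journalism markets analysis earnings reporting "
    "economy policy investigation".split()
)

# Three sections hosted under the same domain (also hypothetical).
sections = {
    "/markets/": "markets earnings analysis economy policy reporting",
    "/coupons/": "casino bonus promo code free spins discount deals",
    "/reviews/": "product reviews buying guide tested affiliate links",
}

THRESHOLD = 0.2  # arbitrary cutoff, chosen for the demo

for path, text in sections.items():
    score = cosine(main_site, Counter(text.split()))
    verdict = "FLAGGED as reputation abuse" if score < THRESHOLD else "ok"
    print(f"{path:10} similarity={score:.2f} -> {verdict}")
```

Note what happens when this runs: the casino coupons section is flagged, but so is the honest, Wirecutter-style buying guide, because a fixed threshold on a similarity score cannot tell editorial intent from parasitic intent. That is the sledgehammer problem in miniature.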

The Real Battlefield

Behind this technical dispute lies a much larger game over the future of the digital information ecosystem. The Digital Markets Act, applicable since 2023, designates Google as a gatekeeper precisely because of its dominant role in the distribution of online information. The Mountain View company controls over 90% of the European search market, a position that gives it unprecedented power to decide which content reaches users and which does not.

But there is another dimension to the issue that rarely emerges in public debate: Article 15 of the European Copyright Directive, commonly known as neighbouring rights. This rule, approved after years of legislative battles, recognizes the right of publishers to be compensated when their content is used by digital platforms. Google has always seen this directive as an existential threat to its business model.

Recent history documents this conflict well. In 2018, when the directive was still under discussion, Google ran tests in several European countries to demonstrate the impact of removing news snippets. The results showed traffic drops of 45% for publishers: an intimidating message that amounted to, without us, you are dead. In Spain, where a similar law was passed in 2014, Google simply shut down Google News, causing significant economic damage, especially to small publishers.

France took a different path. After lengthy negotiations and threats of sanctions, Google agreed to pay compensation to French publishers, although the precise amounts and terms of the agreements remain confidential. Angela Mills Wade, executive director of the European Publishers Council, had at the time accused Google of "abusing its dominant position and putting itself above the law."

Today the script is repeated with new variations. Google claims to be fighting spam, publishers denounce an arbitrary downgrade, and Brussels investigates. But the real underlying issue is always the same: who controls the tap of online traffic and who profits from it economically.

The Silent Revolution

To fully understand Google's strategy, one must look beyond this specific controversy and analyze a broader phenomenon: the transformation of search from a navigation tool to a final destination. This is where the most disruptive element of the whole affair comes into play.

A study published by SparkToro in 2024 revealed staggering data: 59.7% of European searches on Google end without a single click to an external site. Of the clicks that do occur, a further share stays on Google's own properties, so that out of every thousand searches only 374 produce traffic to the open web. The rest dissolves within the Google ecosystem: searches that end without any action, queries refined without ever leaving the platform, users who find the answer directly on the results page.
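Put as back-of-the-envelope arithmetic, the two figures above imply a third. The sketch below assumes, for simplicity, that each search ends in at most one click, so whatever is neither a zero-click search nor an open-web click must be a click kept on a Google-owned property:

```python
# Back-of-the-envelope split of 1,000 EU Google searches, derived from
# the two SparkToro figures cited above. Simplifying assumption: each
# search ends in at most one click.
total = 1_000
zero_click = round(total * 0.597)      # searches that end with no click
open_web = 374                         # clicks landing on external sites
google_properties = total - zero_click - open_web

print(f"zero-click searches:       {zero_click}")         # 597
print(f"clicks to the open web:    {open_web}")           # 374
print(f"clicks kept inside Google: {google_properties}")  # 29
```

Under that simplification, only about 3% of searches send users to a Google-owned destination; the striking number is that barely more than a third ever leave the results page at all.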

The main mechanism behind this transformation is AI Overviews, the automatically generated summaries that appear at the top of the results. When a user sees a complete answer already packaged by an artificial intelligence system, the probability that they will click on a link drops drastically. Research from the Pew Research Center has shown that the presence of these summaries roughly halves users' propensity to click, and that only 1% of users click on the links cited within the AI Overviews themselves.

As I documented in the analysis of the Google AI revolution, this transformation is not accidental but planned. On September 6, 2025, when google.com/ai was redirected to standard search, artificial intelligence became the default engine for billions of daily queries. No longer an experiment, but the new reality of the Internet.

This evolution raises fundamental questions. If Google trains its artificial intelligence models on content produced by publishers, and then uses those models to keep users within its ecosystem, who benefits economically from this transformation? The original creator of the content or the platform that reworks and distributes it?

The debate on how to remunerate content producers in the AI era has only just begun. As I explored in the article on Really Simple Licensing, the protocol proposed by RSS co-creator Eckart Walther seeks to create technical standards that let publishers specify terms and conditions for the use of their content in training artificial intelligence systems. Platforms like Reddit, Yahoo, and Medium have already signed on, but the road to universal adoption still looks long and uncertain.
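The mechanics matter less than the principle: the publisher declares machine-readable terms, and the crawler checks them before ingesting anything. Here is a minimal Python sketch of that handshake from the crawler's side. The `License:` directive in robots.txt and the JSON field names are illustrative assumptions invented for this sketch, not the published RSL schema; see rslstandard.org for the real specification.

```python
# Crawler-side sketch of a "check the license before ingesting" handshake.
# The License: directive and the JSON field names are hypothetical,
# invented for illustration -- consult the actual RSL spec for the
# real format.
import json
import urllib.request

def fetch(url: str) -> str:
    """Download a small text resource."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def find_license_url(robots_txt: str) -> str | None:
    """Look for a pointer to a machine-readable license file."""
    for line in robots_txt.splitlines():
        if line.lower().startswith("license:"):
            return line.split(":", 1)[1].strip()
    return None

def may_train_on(site: str) -> bool:
    """True only if the publisher explicitly allows AI training."""
    try:
        robots = fetch(f"{site}/robots.txt")
    except OSError:
        return False  # unreachable or missing robots.txt: stay conservative
    license_url = find_license_url(robots)
    if license_url is None:
        return False  # no declaration at all: do not ingest
    try:
        terms = json.loads(fetch(license_url))
    except (OSError, ValueError):
        return False  # malformed or unreachable terms: do not ingest
    return bool(terms.get("ai_training", {}).get("allowed", False))

if __name__ == "__main__":
    print(may_train_on("https://example.com"))  # False: no declaration
```

The conservative default, where no declaration means no ingestion, is the crux of the whole debate: today's crawlers largely assume the opposite.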

The Hidden Interests

The dispute between Google and European publishers has multiple layers of complexity, where legitimate reasons intertwine with economic interests and positioning strategies. Analyzing the motivations of each party requires going beyond public statements and looking at the underlying business models.

Google claims to be protecting users from spam, and the claim has objective merit. Parasite SEO is a real problem: authoritative sites that sell portions of their URLs to commercial operators do create distortions in search results. A user searching for financial information who ends up on a Forbes subdomain full of non-transparent affiliate content has reason to feel deceived. Google's algorithm, in this case, is trying to restore consistency between what the user expects and the content actually served.

However, this protective narrative clashes with an inescapable economic fact: Google directly benefits from the reduction of outbound traffic. Every user who stays within the Google ecosystem for longer is potentially exposed to more Google ads, uses more Google services, and generates more data for Google. Zero-click searches are not an undesirable side effect, but a feature of the system. When the company declares that it is "fighting spam," it is also building an increasingly self-sufficient walled garden.

On the other hand, publishers denounce discriminatory treatment, and here too the argument has concrete foundations. The distinction between legitimate commercial content and parasitic spam is often blurred. Is a publication that publishes well-curated buying guides with transparent affiliate links doing something different from what the New York Times' Wirecutter does? The difference lies in the quality and editorial honesty, not in the presence or absence of commercial purposes.

But the position of publishers also hides non-negligible ambiguities. Many publications have for years built opaque business models in which the line between journalism and advertising progressively blurred. Native advertising, when done well, can be informative and useful. When done poorly, it becomes indistinguishable from the spam Google claims to be fighting. The publishers protesting the downgrade today are in many cases the same ones who for years agreed to host non-transparent commercial content, maximizing short-term revenue at the expense of long-term credibility.

Finally, the European Commission is moving in a delicate balance between protecting competition and safeguarding the information ecosystem. The DMA was created with the aim of preventing digital gatekeepers from using their dominant position to distort markets. Google, with its near-monopolistic control of search, fits perfectly into this category. But the complex question is: when does an algorithmic change become an abuse of a dominant position? If Google really improves the user experience by fighting spam, can the EU force it not to do so in order to protect publishers' revenues?

Checkmate or Stalemate?

The paths ahead for the protagonists of this story are many, each with profound consequences for the future of the web. The most immediate outcome could be a quiet settlement, with Google agreeing to marginal changes to its policy in exchange for the investigation being dropped. This solution, already seen in other antitrust disputes, would however leave the underlying issues unresolved.

A more drastic scenario involves heavy sanctions and structural remedies. The Commission could force Google to make its ranking criteria more transparent, to create appeal mechanisms for penalized publishers, or even to separate the search business from the advertising business. Similar measures have been applied in other cases of abuse of a dominant position, but their practical effectiveness remains debated.

The most extreme hypothesis, but not entirely implausible, is that Google decides to withdraw some features in Europe, as it has threatened to do several times in the past. The closure of Google News in Spain in 2014 showed that the company is willing to play hardball when it believes that local regulations threaten its business model. A similar move today would have even more dramatic consequences, given the almost total dependence of the European publishing ecosystem on Google's traffic.

The geopolitical context adds further complexities. Public statements from the Trump administration, which has repeatedly criticized European sanctions against American tech companies, could turn a trade dispute into a diplomatic incident. Google, like other US Big Tech companies, could invoke the political protection of its own government, turning the EU investigation into a case of transatlantic tension.

But perhaps the most profound consequence of this affair does not concern Google or the publishers, but the future of online information. If users get used to getting synthetic answers from artificial intelligence without ever visiting the original sources, what incentive remains to produce quality content? If publishers cannot monetize organic traffic because Google keeps it within its ecosystem, how will they finance investigative journalism?

These questions have no simple answers. The balance between technological innovation and the sustainability of the information ecosystem is fragile, and any regulatory intervention risks producing unexpected effects. What is certain is that the model built over the last twenty years, where Google functioned as a large universal distributor of traffic to the open web, is rapidly dissolving. In its place, a system is emerging where access to information is increasingly mediated by artificial intelligences that synthesize, rework, and present content without necessarily leading users to the source.

The challenge for European regulators will be to find a point of balance that protects competition without stifling innovation, that protects publishers without crystallizing obsolete business models, and that guarantees users access to quality information without artificially imposing ways of navigating the web that no longer correspond to real behaviors.

In the meantime, as Brussels and Mountain View face off in what could turn out to be a long legal tug-of-war, the digital ecosystem continues to transform. The most astute publishers are already diversifying their traffic sources, investing in direct newsletters, proprietary communities, and subscription models. Others, less adaptable or simply smaller, risk being overwhelmed by a perfect storm: less traffic from Google, more competition from AI-generated synthetic content, and the growing difficulty of monetizing increasingly fragmented attention.

The real question, in the end, is not who will win this specific legal battle. It is whether we will be able to build a digital ecosystem where those who produce quality information can be adequately compensated, where technology platforms are held accountable for how they exercise their power of intermediation, and where users maintain access to a plurality of voices and perspectives. The "game" between Brussels and Mountain View is just the latest chapter in a transformation that will profoundly redefine how we produce, distribute, and consume information in the 21st century.