Continent of Contradictions
How the EU’s Digital Regulations Fail on Their Own Terms
In two weeks, the European Union (EU) will publish its first mandatory triennial review of the Digital Markets Act (DMA). The DMA, along with its twin, the Digital Services Act (DSA), ranks among the most comprehensive tech regulations in the world. The two cover topics as mundane as app store hosting fees, as technical as interoperability requirements with competitors, and as all-encompassing as election integrity. The DMA review takes place against the backdrop of a larger omnibus simplification effort, launched last year, to streamline EU law with an eye toward reducing compliance burdens and minimizing inconsistencies.
The DMA and DSA have captured a great deal of attention in the US, and not just because the Trump Administration views them as an unfair money sink for American companies. The concerns that animate the laws are gathering sympathy by the day from American politicians, judges, and juries. That’s why this week, I published a white paper analyzing the efficacy of the DMA, DSA, and related EU laws. Undoubtedly, the laws have had some success, including in pushing companies to open up their tech to greater interoperability with competitors. But those successes are dwarfed by the broader incoherence of a legal scheme that ignores basic constraints inherent to regulating the digital realm. If Europe genuinely wants to improve both the efficacy of these laws and the business environment, it needs to reconsider how these constraints affect the DMA and DSA’s viability.
Compliance Must Be Achievable
Technically and legally infeasible obligations not only discourage major firms from operating, but also limit the growth potential of domestic competitors. When companies complain that they cannot comply with the DMA and DSA, they often say it’s because they are trying to protect users from some other violation of rights to privacy or freedom enshrined elsewhere in European law. Technical expertise helps lawmakers differentiate between when industry claims infeasibility just to delay compliance and when compliance requirements ask the impossible. Yet technical expertise must be coupled with legal feasibility for the regulatory regime to work. When regulations aim to balance competing interests like interoperability and data privacy, or freedom of speech and content moderation, regulators must clarify priorities. Otherwise, compliance with one component of the law will endanger compliance with another. The EU’s regulatory regime fails on this front, preferring to pretend that firm animosity is the only thing impeding the balance of contradictory obligations.
The DSA’s age assurance provisions, for instance, are among the least feasible, since robust compliance using current technology often infringes on both the EU Charter’s right to privacy and its right to expression and information. The DSA mandates that online intermediary platforms appropriately alter minors’ access, content, and advertisements without collecting additional data to determine age, profiling based on pre-existing data, or collecting identification. It further requires that platforms not overly age-gate, excessively alter content, or deploy methods with so many false positives that they impinge on the EU Charter’s right to expression and information.
The best way to comply with these obligations is using digital wallets that confirm identity through privacy-protecting zero-knowledge proofs, wherein the wallet can attest to a user’s age without the requesting application ever accessing the identification stored locally on the device. Apple piloted this in 2021 in a handful of states; Google Wallet rolled out its zero-knowledge proof feature less than a year ago. Nonetheless, developers keen on using them in the EU are in a holding pattern while member states focus their efforts on an EU-wide December deadline to establish their own digital identity wallets. In the meantime, the lack of clarity around what qualifies as a satisfactory method of age verification has not stopped a slew of European Commission investigations. Should the EU Digital Identity program roll out smoothly (unlikely given the lack of progress among many member states), it would mark a major milestone in kids’ online safety. Until then, the DSA will have been in force for two years and counting with a path to compliance that is murky at best.
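To make the privacy property concrete, here is a minimal sketch of the selective-disclosure idea behind these wallets: an issuer attests to a derived “over 18” claim, and the wallet presents only that signed claim, never the birthdate. This is a toy model under stated assumptions; real deployments such as the EU Digital Identity wallets use asymmetric signatures and true zero-knowledge proofs rather than the HMAC stand-in used here, and the issuer key, field names, and age math below are all hypothetical simplifications.

```python
# Toy model of selective disclosure for age assurance. Not a real ZKP:
# HMAC with a shared issuer key stands in for the issuer's signature,
# and the age check is simplified to a year subtraction.
import hmac
import hashlib
import json
from datetime import date

ISSUER_KEY = b"demo-issuer-key"  # hypothetical; real issuers use PKI


def issue_credential(birthdate: date) -> dict:
    """Issuer records the full birthdate AND signs a derived boolean claim."""
    over_18 = (date.today().year - birthdate.year) >= 18  # simplified age math
    claim = json.dumps({"over_18": over_18}).encode()
    return {
        "birthdate": birthdate.isoformat(),  # stays on the device, never shared
        "claim": claim,
        "claim_sig": hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest(),
    }


def present_age_claim(credential: dict) -> dict:
    """Wallet discloses only the signed boolean, not the birthdate."""
    return {"claim": credential["claim"], "claim_sig": credential["claim_sig"]}


def verify(presentation: dict) -> bool:
    """Platform checks the issuer signature; it learns only 'over_18'."""
    expected = hmac.new(ISSUER_KEY, presentation["claim"], hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, presentation["claim_sig"])
            and json.loads(presentation["claim"])["over_18"])


cred = issue_credential(date(2000, 1, 1))
print(verify(present_age_claim(cred)))  # True: age confirmed, birthdate undisclosed
```

The point of the sketch is the data flow, not the cryptography: the platform never receives the birthdate, which is what lets a wallet-based scheme satisfy both the age-assurance mandate and the Charter’s privacy right at once.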
Regulations Must Actually Support the Interests They Aim to Favor
Regulations that lack sufficient market evidence to prove their efficacy are likely to undermine the groups they aim to serve. European regulators, and Europeans themselves, may prefer a society where fairness among economic competitors is prized above product cost or quality. Such sentiment is not only legitimate, but has returned to vogue around the globe, including in factions of both American political parties. However, regardless of whether a law aims to support the interest of consumers or competing firms, it needs evidence it will actually serve its intended purpose. Otherwise, laws will, at best, complicate the market and, at worst, hurt the very interests they have intervened to support.
Perhaps the most obvious case of such a provision is the DMA’s interoperable messaging requirement, which aims to make WhatsApp able to communicate with other messaging services. Putting aside the technical difficulties of architecting an inter-app messaging system for end-to-end encrypted systems, DMA Article 7 minimizes the differences in design and function that make competitors attractive in the first place.
Take Telegram, one of the few notable contenders against Meta’s WhatsApp and Messenger in the EU. Unique Telegram features include the ability to deliver messages quietly and to sync chats seamlessly across devices. Each of those is undermined by interoperability with Meta. If WhatsApp adopts quiet message delivery, the feature loses its uniqueness; if it does not, the feature is lost when interoperating. Telegram syncs seamlessly across devices because it uses cloud encryption for most chats; chats using end-to-end encryption are only available on their device of origin. To interoperate with WhatsApp would mean either dropping that key feature or, as Telegram claims WhatsApp does, neutralizing the value of end-to-end encryption. Signal offers another example. WhatsApp sends messages even if the security key changes while the message is in transit, while Signal will not send the message at all in that instance. Signal prioritizes security because it markets itself to especially security-conscious users; WhatsApp prioritizes convenience because it markets itself to everyday users. A common protocol would eliminate product differentiation that benefits different consumer niches.
Today, there is a small contingent of users who opt for other messaging apps simply to protest Meta’s power. For their protest to remain meaningful, the apps they use will have to refuse to participate in the very interoperability scheme that is meant to empower them. Rather than diminish the power of gatekeepers, Article 7 gives Meta legitimate reason to audit competitors’ security practices and gives Meta the metadata of conversations with users of other services. All this is done without offering users much additional reason to choose an alternative service. Indeed, nothing is stopping Meta from simply integrating the novel features of the few startups that have taken the firm up on its offer to sync chats with WhatsApp.
Mandating Content Moderation Inevitably Limits Free Speech and Press
Promoting freedom of speech necessarily hampers regulators’ ability to control illegal, objectionable, and misleading content. In attempting to both crack down on such content and promote free speech and media, the DSA runs afoul of European legal customs while forcing platforms to faithfully predict what EU regulators deem acceptable speech. Even though these provisions originate from a European distrust of platforms’ ability to police content well, the DSA places a much greater expectation on platforms to do precisely that.
In theory, the DSA maintains the EU’s “safe harbor” principle, which establishes that platforms have no obligation to comb through every post to ensure its legality. In reality, the DSA chips away at this limited liability by establishing a network of “trusted flaggers” whose flags for illegal content, including hate speech, must be prioritized and ideally reviewed within 24 hours. Given the volume of flagged posts and the documentation involved in removal, the turnaround window incentivizes platforms to rubber-stamp trusted flaggers’ suggestions, regardless of whether the post would qualify as illegal if scrutinized by a court. This is why France’s constitutional court struck down a similar hate speech law in 2020, ruling that the scheme encouraged over-removal and allowed flaggers (in this case, France’s law enforcement agencies) to circumvent the courts in deciding what is legally acceptable. It’s also why the UN Special Rapporteur on Opinion and Expression warned that governments holding platforms liable for what is posted on them chills free speech and cannot be the path toward more civil online spaces.
The EU’s content moderation requirements go well beyond trouncing the due process of hateful posters, and their goals are chock-full of contradictions. In promoting legitimate journalists by making platforms warn them before removing or demoting their content, they impede platforms’ ability to minimize false and misleading information. In giving media companies (among others) the ability to fact-check, they allow more established media outlets a privileged position in the information market, to the detriment of independent journalists and media pluralism. The harms are not hypothetical: case law and examples of independent journalists hurt by the type of removal the EU now mandates are plentiful.
Lessons Learned
In some ways, Europe is a distorted reflection of America. It believes a free society need not become one where vitriol and brain rot are the primary means of political engagement. It wants small market actors to have a fighting chance against billion- and now trillion-dollar companies. Somewhere deep down it believes that industry is not the enemy, that a refined version of it can and must fuel its workforce and advance its society. Its laws governing the digital world, unrestrained as they are, are trying to express the same sentiments that seem to be some of the little common ground left in American political life. For all this, I wish the Europeans luck in their review and reform efforts. But even if they would prefer to maintain their utopian yet unworkable morass of digital regulations, American lawmakers can’t afford to ignore the lessons the DMA and DSA have taught us.
Let Europe tilt at windmills; America needs to touch grass.