Every time technology collides with copyright, a familiar idea resurfaces: compulsory licensing as a shortcut. Today, the target is artificial intelligence. The claim, particularly from India, is that the only realistic way to deal with AI training on copyrighted works is through a blanket, compulsory license: simple, efficient, inevitable. In a mysterious accident, the EU Parliament issued yet another report calling for a compulsory license for AI training. What a coincidence!
A compulsory license for AI is none of those things.
Compulsory licensing for AI is not a pragmatic policy solution. It is an academic exercise that collapses under its own weight once it contacts real-world transaction costs, attribution failures, enforcement gaps, governance disputes, and platform capture. And we don't need to speculate about how this plays out. We have already run this experiment, repeatedly, and it has failed every time.
That is probably because it is a safe harbor masquerading as fairness.
THE CORE FALLACY: DEFINING A MARKET INTO EXISTENCE
The compulsory AI licensing proposal rests on the same flawed premise that animated earlier "global licensing" schemes for piracy: that the absence of a functioning market proves market failure, and that the solution is to replace markets with administrative allocation.
But markets do not fail in a vacuum. Markets require enforceable property rights. When those rights are suspended or diluted, the result is not efficiency but arbitrage, opacity, and the erosion of legitimate distribution channels.
This was the central flaw of the ISP licensing proposals of the 2000s. Those schemes sought to legalize mass infringement and then compensate creators through a collective, statistical clearinghouse. You cannot define a market into existence by legalizing the very conduct that destroys it.
THE UNWORKABLE FISHER PROPOSAL: TOO ELEGANT FOR ITS OWN GOOD
Much of today's enthusiasm for compulsory AI licensing traces back to the early-2000s work of Professor William W. (Terry) Fisher III, most notably his book Promises to Keep. Fisher argued that digital copying had rendered traditional copyright enforcement impractical and proposed replacing exclusive rights with an "alternative compensation system." Under his model, creators would register their works, the government would collect a compulsory fee or tax from users or intermediaries, usage would be statistically estimated, and payments would be distributed proportionally. In exchange, most unauthorized reproduction and distribution would be legalized.
I actually watched Fisher present this model at the OECD's Future Digital Economy conference in Rome in 2006. Dude was smooth, I'll give him that, but he was preaching to a room largely full of boffins who didn't know better. The current reality hasn't changed much.
As an academic exercise, the proposal was elegant. As a real-world system, it was fatally flawed and never got very far.
Fisher's model assumed reliable identification of works, accurate measurement of usage, low transaction costs, and a neutral administrator capable of resisting capture. None of those assumptions held true even in the comparatively simple world of peer-to-peer music files. Matching failures, governance disputes, and misaligned incentives overwhelmed every serious attempt to operationalize the idea. The result was not efficiency but delay, opacity, and persistent underpayment.
AI training exposes these flaws in extreme form. If Fisher's system could not be made to work for discrete songs with stable identifiers, there is no credible argument that it can work for fragmented, transformed inputs across every category of copyright. What was once theoretical optimism now reads as a warning label: elegance on paper is no substitute for enforceable rights in practice.
FRANCE, CHORUSS, AND THE GRAVEYARD OF “GLOBAL LICENSES”
Proponents of compulsory AI licensing often point vaguely to Europe or past experiments as proof of concept. The historical record tells the opposite story.
France's mid-2000s flirtation with the "global license," designed to "solve" peer-to-peer piracy through levies and blanket permissions, collapsed under political resistance, measurement impossibility, and creator mistrust. It never produced a stable, trusted market.
Choruss, backed by major labels and technology companies, promised frictionless licensing and mass participation. It failed for the same reasons: who pays, who collects, how usage is measured, how audits work, how independent creators avoid being drowned out by incumbents, and how rights holders retain any meaningful control. Choruss quietly disappeared because its internal logic never matched operational reality.
These were not implementation glitches. They were structural failures.
AI MAKES THE MEASUREMENT PROBLEM WORSE, NOT BETTER
AI training does not consume works in neat, trackable units. It ingests fragments, embeddings, and latent representations. Attribution is probabilistic. Outputs are non-isomorphic. Any licensing scheme that depends on reliable identification of "usage" is fighting the architecture, not the law.
SPOTIFY IS THE CASE STUDY — AND THE WARNING
Spotify is about as close to a sound recording compulsory license as a private company can get without statutory designation. It enforces licenses through brute market power, sets prices administratively through monopoly pricing power, and renders consent largely theoretical.
On the song side, Spotify already enjoyed an extraordinarily favorable song compulsory license under Section 115. Even so, it failed catastrophically at basic administration: matching works, identifying rightsholders, and paying songwriters. That failure led, predictably, to class action litigation and millions in settlements. Rather than fix those failures, Spotify went to Congress to protect its IPO.
The result was the Music Modernization Act's rewrite of Section 115 in Title I, a rewrite sold as modernization but functionally designed to offload Spotify's compliance failures onto a new collective, the Mechanical Licensing Collective (MLC). The administrative burden did not disappear; it was merely transferred. The consequences are now well known: a billion-dollar "black box" of unmatched royalties at the MLC, prolonged nonpayment to creators, and a system where the costs of Spotify's scale are borne by songwriters and publishers rather than the platform that caused them.
This history matters because it exposes the core illusion behind compulsory AI licensing. Even when a license is narrow, well defined, and limited to a single category of works, as Section 115 is, administration fails at scale. And when it fails, platforms do not absorb the cost. They restructure the law to externalize it.
To propose compulsory licensing for AI, across all copyrighted works, across borders, across modalities, across probabilistic training systems, while ignoring the Spotify precedent is not optimism. It is willful blindness.
If the most powerful streaming company in the world could not responsibly manage a compulsory license for songs it was already legally entitled to use, there is no credible argument that AI companies will do better when granted compulsory access to the entire corpus of human culture.
“BUT THE MLC FIXED IT.” NO, IT DIDN’T.
The predictable response to the Spotify example is that the Music Modernization Act "fixed" the problem by creating the Mechanical Licensing Collective. That claim does not withstand even casual scrutiny.
The MLC did not solve Spotify's administrative failures; it inherited them. The unmatched-royalty problem did not disappear; it metastasized into a billion-dollar black box. Years later, the MLC continues to struggle with matching accuracy, data quality, timeliness, and governance legitimacy. Songwriters still wait. Publishers still audit. Disputes still pile up.
More tellingly, the MLC is now under active criticism from the very licensees it was designed to benefit. Digital services that championed the MMA as a solution have complained publicly and privately about MLC costs, processes, redesignation risk, and governance. The collective is already facing questions about whether it should even continue in its current form.
That is the real lesson of Section 115. Even when Congress bends over backward to accommodate platforms, granting blanket access, capped liability, safe harbors, and a centralized administrator, the system still strains under scale. And when it does, the pressure flows downhill, not up. Creators bear the delay. The administrative body absorbs the blame. The platform keeps operating.
Calling this a success requires an extraordinarily selective definition of "working."
THE PUNCHLINE
If a compulsory license for one category of works, in one country, with decades of precedent, collapsed under administrative failure, why would anyone outside the faculty dining room think there is a chance in hell that a compulsory license for all copyright categories would fare better? These regimes are extractive and exist entirely for the benefit of the licensees with the political clout to jam them through the legislative system.
Compulsory AI licensing is not realistic. It is retreat dressed up as rigor. In the real world, it will collapse under its own weight, just like every version before it. Which is the idea, from the licensee's point of view.



