“There’s an enormous realization that it’s actually now AI, mass and autonomy that’s driving the new battlefield.” Spotify CEO and Helsing Chair Daniel Ek.
There’s another “enormous realization,” which is that plucky AI startups are not satisfied with getting their snouts into the public markets; they want the real money: the military-industrial complex, mainlining cash straight from the taxpayer and power straight from the President of the United States. Autonomous weapons are the new, new thing, but would they like a side of ethics with that? I doubt it.
The Big Sleep Playlist
There’s a direct but perhaps nonobvious connection between the horrors of napalm used by the U.S. and a handful of other militaries against both military and civilian “targets” and lethal autonomous or near-autonomous weapons like those supplied to the UK, German, and Ukrainian militaries by Daniel Ek’s Helsing SE (Wikipedia lists Daniel Ek as Chairman of Helsing). Ek is not alone, of course; Eric Schmidt is also doing everything he can to get his snout into the abattoir.
The use of napalm, both in WWII during the area bombing campaign against civilians in Japan as part of the run-up to the atomic bomb and in Vietnam, was deeply controversial. It led to the 1977 international negotiation of a new rule for the Geneva Conventions regarding the implementation of new weapons systems that changed our concept of weapons of war.
So when it comes to AI and the future of warfare, the most important rule you’ve maybe never heard of is Article 36 of the Geneva Conventions’ Additional Protocol I. It says that before a state builds or uses a new weapon, it is obligated to determine whether that weapon complies with the laws of war and international humanitarian law, a kind of “combat safety inspection” for legality and morality. And guess who are signatories to the Geneva Conventions? The US, UK, Germany, and Ukraine, all very popular customers for lethal near-autonomous weapons, all of which are fairly easily convertible into fully autonomous weapons (aka “LAWS”). Article 36 is nothing new; they all know all about it and likely viewed it as a “gating item” to full LAWS implementation (and let’s not forget profit).
So consider that the debate about autonomous weapons has been very active and very high level for a very long time. In fact, Elon Musk, Stephen Hawking, and many other luminaries signed an open letter in 2015 foreshadowing Schmidt’s and Ek’s investments in killer robots and seeking a ban on autonomous weapons and LAWS, which said in part:
“Autonomous weapons select and engage targets without human intervention. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is, practically if not legally, feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”
Autonomous weapons are a hot topic in JAG circles, too, as one would expect. In fact, a prominent Judge Advocate argued in the Military Law Review that military lawyers need to be “at the design table” to monitor law of armed conflict issues with autonomous or near-autonomous weapons.
The point being that the national mobilization being called for by the AI cabal is all of a piece, something people like Daniel Ek and Eric Schmidt are very aware of. Are there other bad guys, too? Oh sure, it’s just that these two owe their vast wealth more or less directly to corrupting the music business, and artists are noticing.
SIPRI’s Gold Standard
In Sweden…ahem…there’s no higher authority on this than the Stockholm International Peace Research Institute (SIPRI). SIPRI has spent decades studying weapons governance and arms control. In its 2015 Insight 1501, it warned that most Article 36 reviews are stuck in the past, focused on hardware, and don’t properly examine software, AI algorithms, datasets, or post-deployment performance. Probably just where Schmidt & Co. would like them to be: permissionless innovation, don’t you know.
SIPRI’s solution? Test not just the metal and mechanics, but the code and training data; require independent oversight, not just a military agency marking its own homework. Then keep reviewing the system even after it’s deployed. Just maybe Peter Pan and the Lost Boys are up to no good on Treasure Island?
The NSCAI Loophole
Speaking of Peter Pan, now jump to Washington, D.C., 2021. Eric Schmidt, the ex-Google CEO, chaired the U.S. National Security Commission on Artificial Intelligence (NSCAI). Instead of pushing for SIPRI-style upgrades, the NSCAI said the U.S. is already in compliance and that existing Pentagon reviews are “sufficient.” Translation: keep the process closed, don’t invite outside scrutiny, and let national militaries decide for themselves. Nothing to see here.
But that U.S. position matters, because it shapes global “norm-setting” and can set a low bar that other nations follow, kind of like the DMCA safe harbor and what David Sacks is chasing with the Trump AI Action Plan.
And don’t forget: Eric Schmidt chairs Swift Beat, a tech firm supporting Ukraine with AI interceptor drones, as readers will recall. These systems have reportedly intercepted around 90% of Russian Shahed drones. In July 2025, Schmidt signed a long-term strategic partnership with Ukraine’s leadership to mass-produce hundreds of thousands of drones by 2025–26. Ukrainian reports note Schmidt’s drones are the most effective AI interceptors against Shaheds under battlefield conditions. Schmidt has described himself as a “licensed arms dealer” after launching a drone venture called White Stork, aimed at producing low-cost autonomous strike drones for Ukraine. He publicly argues that modern war will be dominated by AI drone swarms, devaluing traditional platforms like tanks.
So are you surprised that Schmidt was on NSCAI and is trying to convince the Pentagon that they don’t need to worry about Article 36 compliance? Long con much? (Whose idea was creating NSCAI in the first place, I wonder?)
Enter Daniel Ek and Helsing
Daniel Ek, CEO of Spotify, is now a major player in European defense tech. Through his venture fund Prima Materia, founded with Spotify investor Shakil Khan, he chairs and bankrolls Helsing SE, a fast-growing AI defense company based in Germany with affiliates in other countries including the UK. Helsing builds autonomous offensive strike drones used in Ukraine, AI-driven underwater surveillance gliders, and fighter-jet autonomy systems. You know, LAWS.
Examples: the HX-2 drone, using the Altra AI system, is deployed in large numbers in Ukraine and can operate autonomously when communication is unreliable; the SG-1 Fathom underwater drone, powered by the Lura AI system, autonomously monitors underwater threats for months at a time; and Centaur AI has piloted a Gripen E fighter jet in test flights, indicating autonomous air combat capability.
These are exactly the kinds of autonomous systems SIPRI says require the most rigorous, modernized Article 36 reviews, because they can identify, track, and engage targets with little or no human input. And “no” is on track to win out over “little” any day now.
Artist Backlash
Artists like Deerhoof, King Gizzard & the Lizard Wizard, Xiu Xiu, and others have removed their music from Spotify in protest of Ek’s investment in Helsing. They argue Spotify’s profits ultimately subsidize AI drones and defense systems used on real battlefields, technology that can harm civilians. As Deerhoof put it: “We don’t want our music killing people.” I appreciate the sentiment, but trust me, that ship has sailed.
The Open Question
Helsing’s mission statement talks about defending democracies. Don’t they all. But mission statements aren’t legal reviews. The SIPRI model would require Helsing’s AI, run by one of Sweden’s richest men, to be audited for algorithmic bias, targeting accuracy, accountability, and transparency in training data and code. The NSCAI model, by contrast, would let those checks happen entirely inside national defense agencies, with no public accountability. Or it would probably be argued that whatever Schmidt is doing in Ukraine doesn’t implicate DOD review, which I think wouldn’t hold up under DOJ scrutiny if anyone bothered to look.
Potential War Crimes Exposure
So when I say that these frontier labs may have some war crimes exposure, this is what I’m talking about. Given that the military lawyers want to be at the design table (which they clearly are not), I’m not the only one who thinks so.
If autonomous weapons deployed by Helsing or companies backed by Eric Schmidt cause unlawful harm, such as indiscriminate attacks on civilians, both investors could, under rare but plausible circumstances, face war crimes scrutiny.
Article 36 obliges states to vet new weapons for compliance with international humanitarian law (IHL), but if that review is weak or bypassed, and the weapon proves inherently unlawful, individual responsibility may follow. Under doctrines of command responsibility and aiding and abetting, designers or financiers could be liable if they knowingly facilitated unlawful use.
Historical precedent exists: in the post-WWII Krupp trials, German industrialists were convicted for supplying weapons used in war crimes. More recently, manufacturers of cluster munitions and landmines have faced civil and political consequences. If AI-enabled systems like those deployed by Helsing fail to distinguish civilians or operate unpredictably, and those failures were foreseeable, Ek or Schmidt could face investigation, especially if the weapons are deployed in high-profile conflicts like Ukraine.
Dead Souls Redux
Is Helsing’s AI technology being reviewed under the stricter, software-inclusive Article 36 process SIPRI calls for, or under the looser, closed-door model NSCAI prefers? For a company bankrolled from Stockholm and chaired by one of Sweden’s most high-profile entrepreneurs, the answer matters, not just for compliance, but for credibility.
Both Schmidt and Ek may find legal cover in the same defense contractor safe harbor that shielded Dow Chemical from war crimes charges over napalm. But legal immunity won’t protect them from public outrage, reputational fallout, or global protests if their AI weapons cause civilian harm. The backlash will come. In fact, in Ek’s case, it’s already started.
Whatever safe harbors these two and others like them can extract from the AI Action Plan is a Faustian bargain: a pact trading long-term human dignity for short-term technological power. Like Faust, who declares, “If ever I to the moment shall say: Beautiful moment, do not pass away!”, policymakers risk surrendering moral judgment to fleeting progress. The Plan’s silence on copyright, labor, national security, and democratic oversight echoes Mephistopheles’ seduction: “I am the spirit that negates.” In courting AI dominance, the government assumes control without accountability, promising innovation while stripping creators and citizens of rights.
It seems clear to me that Schmidt and Ek have already signed their deal with the Devil. They know all this and they did it anyway. As Faust ultimately learns, unchecked ambition invites damnation. So too may we, unless we reclaim the soul and spirit of our laws.