
The Product Demonstration

American companies helped Israel use AI in Gaza. They destroyed humanity as proof of concept. Now they are selling it. Tonight: an 8pm deadline. "A whole civilization will die." The machine is still running, and no one is governing it.

Photo: @smyrt4732 — A family of three in Gaza. They have a newborn baby girl. They are asking for your support →
This is the second article in the Captured Tech series. Read Part 1: Captured Tech →

Tonight, April 7, 2026 — 8pm Eastern — President Trump has set a deadline for Iran to reopen the Strait of Hormuz or face the destruction of its power plants and bridges. This morning he wrote: "A whole civilization will die tonight, never to be brought back again." The AI system that has already struck 11,000 targets in Iran is operational. The oversight body that should govern it does not exist. This article documents how we arrived at this moment — and what the business model behind it is.

What "Battle-Tested" Actually Means

When defense companies market weapons internationally, the phrase "battle-tested" carries a specific commercial meaning. It means the system has been used on real targets, in real conditions, against a population that could not refuse. Gaza was that population. The results — documented, quantified, exported — became the sales pitch.

Israel sells its technology and weapons to an estimated 130 nations, including military dictatorships in Asia and Latin America. Annual Israeli arms sales reached a record in 2024 for the fourth consecutive year: nearly $14.8 billion, up from $13 billion in 2023 and roughly double the value of exports five years earlier. Between 2018 and 2020, that figure hovered between $7.5 and $8.5 billion. The Gaza war did not disrupt this growth. It drove it.
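The "doubling" claim above can be sanity-checked against the article's own figures. A minimal sketch, taking the low end of the reported 2018–2020 range as the five-years-prior baseline (an assumption; the article gives a range, not a single year):

```python
# Reported Israeli arms export figures, in USD billions.
# 2018-2020 is given as a range of $7.5-8.5B; we use the low end here.
exports = {"2018-2020 (low end)": 7.5, "2023": 13.0, "2024": 14.8}

growth = exports["2024"] / exports["2018-2020 (low end)"]
print(f"2024 exports vs. five years prior: {growth:.2f}x")
```

At the low end of the baseline range the ratio is about 1.97x, so "roughly double" holds; against the $8.5 billion high end it is closer to 1.74x.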

Weapons manufacturers' promotional materials explicitly boast that their products have been "tested" and "proven" in "battle." Military intelligence Unit 8200 serves as an incubator for Israeli surveillance-tech start-ups. The testing ground those promotions reference is the occupied Palestinian territories. That is not a characterization; it is the documented sales mechanism, established across decades of occupation and accelerated in Gaza.

"I think Israel sells two things: what weapons you can use to murder and target Palestinians — but also how to get away with it." — Antony Loewenstein, journalist and author, The Palestine Laboratory

Investigative journalist Antony Loewenstein, whose book The Palestine Laboratory won the 2023 Walkley Book Award, documented this model before the current Iran war began. His interview below names the companies, the contracts, and the mechanism directly:

His core finding: the companies aren't primarily motivated by ideology. They are using Gaza to demonstrate to other governments what mass surveillance and targeting infrastructure can do at scale. The civilian deaths aren't a failure of the system. They are the demonstration of its capability.

The AI Systems Deployed in Gaza

While Project Maven's role in Iran is now receiving significant public attention, the AI targeting systems deployed in Gaza preceded it, and their documented performance is what created the product catalog now being sold to 130 countries.

The Gospel is an AI system that automatically reviews surveillance data looking for buildings, equipment and people thought to belong to the enemy, and upon finding them, recommends bombing targets to a human analyst. Lavender is a separate system — an AI-powered database that at one stage identified 37,000 Palestinians as potential targets based on their apparent links to militant groups. A third system, called "Where's Daddy?", tracked individuals on the kill list and was deliberately designed to locate them when they were at home with their families.

  • 37,000: Palestinians placed on the Lavender AI kill list, with a documented 10% margin of error acknowledged internally by the Israeli military.
  • 20 seconds: the time intelligence officers reported spending per target before authorizing the bombing of alleged militants marked by Lavender.
  • 130: the number of nations Israel sells its battle-tested weapons and surveillance technology to, with arms exports doubling in five years.

Intelligence officers who spoke to +972 Magazine stated the military knew — because they checked — that approximately 10% of the people the machine marked for killing were not Hamas militants. The accepted collateral damage ratio for junior operatives was fixed at up to 20 civilian deaths per target. For a senior commander, the military on several occasions authorized the killing of more than 100 civilians in a single assassination. These are not allegations. They are documented accounts from Israeli military sources who described being "shocked by committing atrocities."
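Taken at face value, the figures reported by +972 Magazine's sources imply a concrete number of misidentifications. A minimal sketch using only the numbers already cited above:

```python
# Figures documented by +972 Magazine's Israeli military sources.
kill_list_size = 37_000   # Palestinians marked by Lavender at one stage
error_rate = 0.10         # margin of error the military itself acknowledged

misidentified = int(kill_list_size * error_rate)
print(f"People marked for killing who were not militants: ~{misidentified:,}")
```

That is roughly 3,700 people the military's own internal check indicated were not Hamas militants, each reviewed for about 20 seconds before authorization.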

The Habsora system (The Gospel) generated up to 100 targets per day, compared to approximately 50 per year that human analysts previously identified. That is roughly a 700-fold increase in targeting rate. Accuracy did not scale with it. What scaled was the pace of destruction, and the market value of demonstrating that pace to governments around the world.
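The scale comparison above follows directly from the two reported rates. Annualizing the machine's output makes the arithmetic explicit:

```python
# Reported rates: up to 100 targets/day from Habsora,
# versus ~50 targets/year from human analysts before it.
habsora_per_day = 100
human_per_year = 50

habsora_per_year = habsora_per_day * 365  # annualized machine output
speedup = habsora_per_year / human_per_year

print(f"Annualized machine output: {habsora_per_year:,} targets")
print(f"Increase over human analysts: {speedup:.0f}-fold")
```

The exact figure is 730-fold at the maximum reported daily rate, which the text rounds to roughly 700.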

The Connection to Iran — Tonight

Project Maven, the U.S. system now operational in Iran, is a direct institutional descendant of what was refined in Gaza. Israel's Lavender and Gospel systems are explicitly referenced in U.S. military documentation as comparable decision-support systems. The same companies — Palantir, Elbit, affiliated contractors — operate across both theaters. The same absence of enforceable international regulation governs both.

In the first 24 hours of Operation Epic Fury on February 28, U.S. forces struck over 1,000 targets in Iran. By early April, CENTCOM had struck more than 11,000. On that same first morning, a U.S. Tomahawk missile struck Shajareh Tayyebeh Elementary School in Minab, killing between 175 and 180 people — most of them girls between seven and twelve years old. The school had been classified in a Defense Intelligence Agency database as a military facility. That database had not been updated since at least 2016, when satellite imagery shows the school was separated from the adjacent IRGC compound by a wall.

The Pentagon investigation into the Minab strike remains ongoing. No findings have been made public. No one has been charged.

"A whole civilization will die tonight, never to be brought back again. I don't want that to happen, but it probably will." — President Donald Trump, Truth Social, April 7, 2026

Tonight, Trump has threatened to destroy all of Iran's power plants and bridges if Iran does not reopen the Strait of Hormuz by 8pm Eastern. Amnesty International's Secretary General Agnès Callamard responded: "International humanitarian law strictly prohibits direct attacks on civilians and civilian objects. The U.S. President's threat of extermination and irreparable destruction brazenly shreds core rules of international humanitarian law, with potentially catastrophic consequences for over 90 million people."

Senator Elissa Slotkin, a former CIA analyst, stated Trump's threatened targeting of civilian infrastructure "would be a clear violation of the law of armed conflict as laid out in the Geneva Conventions, as well as the Pentagon's Law of War Manual," and called for service members to refuse illegal orders. Former Trump ally Marjorie Taylor Greene called for his removal via the 25th Amendment, posting: "We cannot kill an entire civilization. This is evil and madness."

The AI system that would execute those strikes has no external oversight body. The White House blocked state-level attempts to build one. Congress is deadlocked. The June 2026 date when Maven begins transmitting 100% machine-generated intelligence to combatant commanders is ten weeks away.

The Accountability Gap Loewenstein Named

There is no enforceable global regulation for spyware or military drones. That is not an oversight — it is a market condition. Loewenstein's research makes the mechanism plain: governments worldwide are, in his words, "obsessed with these tools. They can't give them up. They're desperate to listen to their opponents, to journalists, to activists. It's very hard for these regimes to give them up because there's no regulation. There's just none. It just doesn't exist."

The engineers who oppose what their tools are being used for have signed NDAs. Those who object publicly face professional destruction — a pattern documented across Google in 2018, Google in 2024, and CBP in 2026. The resistance is real. It is also small relative to the institutional momentum of a $14.8 billion annual industry with a documented proof of concept and 130 existing customers.

What Loewenstein's framework adds to what "Captured Tech" established: the accountability gap isn't just enforced. It is profitable. The classification that makes oversight impossible is also the condition that makes the sales pitch possible. "Battle-tested" requires that what happened in the battle stay classified. The deaths are the product. The secrecy is the warranty.

What Is Documentably True — Right Now

Public Record — April 7, 2026

  • Trump has publicly threatened to destroy Iran's civilian infrastructure tonight and stated "a whole civilization will die." These are his words, on the record.
  • The AI system that has struck 11,000 Iranian targets is operational and ungoverned by any standing federal law.
  • The same targeting model — AI-assisted, fast, with documented accuracy failures — killed 170+ children in a Gaza school before the Iran war began, and 170+ in Minab on the first day of it.
  • Israel's arms exports doubled in five years of Gaza operations. The technology was marketed as battle-tested. The battle was Gaza.
  • The White House blocked a Republican state legislator's AI transparency bill with a one-line memo offering no explanation.
  • Congress has passed no comprehensive federal AI law. A DOJ task force is prepared to sue states that try to fill the gap.
  • No one has been held accountable for the Minab school strike. The Pentagon investigation has not been made public.

The Conscience Is Not Gone

The opposition to tonight's threatened strikes is not coming only from the left. At 3:12pm today, a verified Republican National Delegate from Texas — someone who worked to put Trump in office — posted this, now at nearly 250,000 views and climbing:

This is the cognitive dissonance window. The conscience senses something is wrong. The conditioned response — party loyalty, institutional momentum, the confidence of the algorithmic signal — has been acting anyway. Tonight that gap is visible, in public, from inside the coalition that built this. That gap is where accountability lives. It is also where it can be built.

What You Can Do — Tonight and Beyond

Tonight specifically: Members of Congress are on record opposing tonight's threatened strikes on civilian infrastructure as illegal under international and U.S. law. Senator Slotkin, Congressman McGovern, and others have issued public statements. Call your representative's office now and state your opposition to strikes on civilian power and water infrastructure. Congressional switchboard: 202-224-3121. Ask specifically: does your representative support or oppose the targeting of civilian infrastructure in Iran tonight?

Demand the Minab investigation: A U.S. military preliminary investigation into the school strike that killed 170+ children — mostly girls — exists and has not been made public. Your representative can demand its release. Eight UN experts have already called for an independent investigation. That call needs public pressure behind it.

Support state AI legislation: More than 35 states have active AI bills right now. The White House is working to kill them. Republican state legislators from 22 states have already written to Trump asking him to stop. Find your state's active AI legislation and contact your state representative to support it. This is the only governance layer being built.

Follow the money: Palantir is publicly traded. Its Maven contract is worth $1.3 billion. The pension funds, university endowments, and institutional investors that hold Palantir stock have standing to demand answers about system performance, liability exposure from the Minab strike, and the ethical framework — or lack of one — governing contracts that execute lethal strikes. If you have access to any institutional investor relationship, use it.

Name the business model: Share this article. The specific claim that must enter public discourse: the AI targeting systems used in Gaza were not tested despite the civilian deaths. The civilian deaths are what "battle-tested" means. That product is now operational in Iran, ungoverned, with an 8pm deadline tonight and a stated intent to destroy an entire civilization. That sentence needs to be in the public record before the bombs fall.

📡 How This Story Was Surfaced

This article is the second in the Captured Tech series. It was built from the research base of Antony Loewenstein's The Palestine Laboratory (Walkley Award, 2023), +972 Magazine's documented investigations into Lavender and Gospel, Human Rights Watch and Amnesty International's published findings on AI targeting in Gaza, and breaking CBS News, CNN, NBC News, Al Jazeera, and Amnesty International reporting from April 7, 2026. All sources are public record. The June 2026 Maven machine-intelligence date is a documented program schedule, not a prediction.


Kaleido Investigates — Hidden in plain sight.