
The Weird Summer When Tech Got Violent, Paranoid, and Real

From firebombs at OpenAI's doorstep to dancers performing as holograms, Silicon Valley is having a reckoning moment—and it's messy.


A Molotov cocktail hit Sam Altman’s gate. A federal court just told Anthropic it can’t fix its defense department problem. Meta is desperately scrubbing ads amid lawsuits over addiction. And somewhere in London, the mayor’s office is battening down against what it’s calling a “disinformation blizzard.”

This isn’t the smooth, inevitable march of innovation we’ve been promised.

What we’re watching instead is a tech industry under pressure from every direction at once—not just from regulators and competitors, but from its own contradictions finally reaching critical mass. The last few weeks have laid bare something that’s been building for years: the gap between what tech companies say they’re doing and what’s actually happening is wide enough to drive a truck through. Or, apparently, throw an incendiary device through.

When the Pressure Gets Personal

Let’s start with the blunt fact: someone threw an incendiary device at Sam Altman’s house. Police arrested a suspect. It’s easy to dismiss this as an isolated incident from a lone actor, the kind of thing that happens when you’re the public face of a $100 billion company. But it’s worth asking what it means when the CEO of the most closely watched AI company in the world becomes a literal target.

I’m not equating the attack with legitimate criticism. That would be dishonest. But I am saying that when technologists operate with the speed and scale of OpenAI, pushing frontier AI systems to millions of users while federal agencies still don’t have coherent policies, some friction was inevitable. The fact that it’s turned kinetic is a signal that the conversation has degraded past the point of argument.

The timing here matters. This happened as Meta was scrubbing recruitment ads in the wake of losing a landmark social media addiction case in California. Not settling. Losing. In court. The company spent years claiming its platforms weren’t addictive, that user engagement metrics were about value, not vice. A jury disagreed.


The Authenticity Problem Goes Holographic

But here’s what’s strange: while the industry implodes over lawsuits and firebombs, some of the most genuinely moving innovation is happening almost by accident.

Breanna Olson, a dancer with MND—motor neurone disease—performed on stage again using brainwave technology that translated her neural signals into a digital avatar. She talked about regaining “the expression and connection she felt had gone.” That’s not surveillance. That’s not engagement metrics. That’s assistive technology doing exactly what it should do.

So we have this weird moment where the defensive, proprietary, litigation-soaked parts of tech are getting exposed as hollow, while the stranger, wilder, less profitable edges are quietly solving real problems. A person with a degenerative disease gets to dance again. That works. That matters.

Meta’s new AI model, Muse Spark, underperforms rivals on coding ability. Amazon is killing support for Kindles made before 2013, so users can no longer download new books to devices they thought they owned. And Volkswagen—Volkswagen!—is backing away from EV production, pivoting back to gasoline because the math doesn’t work yet.

The through-line here is that the tech industry oversold timelines. We were told EVs would replace gasoline by now. We were told AI would be obviously beneficial. We were told social media would connect us. None of that has landed the way the pitch decks promised.

The Disinformation Thing Is Different

What Sir Sadiq Khan is describing in London isn’t a new problem. Propaganda and coordinated messaging have existed forever. But the scale and the speed are different now.

When you can generate narratives at digital velocity, targeting specific demographics with personalized crisis stories, the traditional friction that used to slow disinformation down—cost, expertise, distribution—vanishes. Khan’s talking about a “blizzard.” That’s the right metaphor. Not a single lie, but so much noise that truth becomes a visibility problem.

And the White House staff ban on prediction market betting? That’s a tell. Those platforms have grown to the point where they’re creating new incentive structures around real-world outcomes—wars, elections, crises. If your job is national security and you’re suddenly worried about staff gambling on geopolitical events, you’ve recognized that the markets might be distorting the information environment itself. You’ve created a shadow oracle that’s starting to influence policy.


My Take on What’s Happening

I think we’re watching a civilizational lag problem in real time. We built these systems—AI, social platforms, predictive markets, digital identity infrastructure—faster than we built the governance or even the basic honesty frameworks around them.

The tech industry didn’t wake up yesterday to realize social media is addictive. They knew. They measured engagement because engagement drives ad revenue. The lawsuit loss in California is a court finally saying what users have known for a decade: the business model is incompatible with the product’s stated purpose.

Anthropic gets a “supply chain risk” label from the Defense Department because it’s building AI systems while the military is still figuring out how to use AI, and nobody wants AI-enabled weapons systems built by companies that just got tagged as risky. It’s bureaucratic caution, but it’s caution we should probably have.

Volkswagen backing out of EV production isn’t a doomsday signal for electric cars. It’s a signal that the transition is real, it’s happening, and it’s messier and slower than venture capital timelines prefer. That’s actually fine. Messy transitions are how things actually change without breaking.

Here’s my honest uncertainty: I don’t know if the violence at Altman’s house is going to accelerate security theatre around AI leadership, making it harder for actual dissent to be heard, or if it’ll force a real reckoning about what responsibility looks like when you’re moving at this pace. Both seem possible.

My prediction: by Q4, we’ll see at least two more major AI companies face either lawsuits or regulatory action around training data, safety claims, or labor practices. The industry’s luck with dodging accountability is running out.

What I’m Watching

  • Anthropic’s defense contract status by October. If they can’t clear that “supply chain risk” label, it signals the DoD has fundamentally lost confidence in AI safety oversight. That’s not just a company problem—that’s a sector problem.

  • Meta’s response to the addiction lawsuit. Watch whether they make structural changes to recommendation algorithms or just write a check and move on. The answer tells you if accountability is actually possible in this industry.

  • Prediction market volume post-White House ban. If activity spikes anyway, you’ve proven these platforms are now resilient to policy pushback. If it drops, the White House just demonstrated it can still influence behavior through mere discouragement.

  • Whether Breanna Olson’s brainwave tech gets VC funding. This is the clearest proxy for whether capital is actually looking at human problems or just chasing scale. If this gets funded heavily, we know there’s appetite for things that actually help. If it stays niche, we know the incentive structure hasn’t changed.

The industry’s having a moment. Not a crisis exactly. More like a reckoning that everyone knew was coming but nobody wanted to schedule. A firebombed gate. A lost lawsuit. A dancer who can move again. A mayor warning about coordinated lies. A company that quit before it started.

That’s not a trend yet. That’s the sound of something breaking and rebuilding at the same time.