2025 in review: an interesting year

On fascism, technology, and finding the helpers.

Photo by Paul Goyette, released under a Creative Commons license

“It’s an interesting year,” is a line I’ve used many times over the last twelve months, at conferences and in conversations. It’s a useful euphemism, sometimes delivered with a wry smile: we all know what I mean.

But it’s also a cop-out.

It lets me hide. If I say it’s been “interesting,” I don’t have to say that it’s been frightening, or exhausting, or quietly disillusioning. I don’t have to admit how often I’ve felt off-balance, or how much energy it takes just to keep moving forward as if this were all normal.

We all know this hasn’t just been interesting. It’s been a year of normalized chaos, of permanent emergency masquerading as background noise. Calling it interesting is a way of smoothing the edges. It lets us keep functioning inside systems that have been descending into nightmare territory faster than we might have imagined.

It’s also a cop-out because it’s non-confrontational. It’s open to interpretation. If you don’t share the same nightmares, if your limbic system isn’t in the same state of permanent activation that mine is, it gives us both an out. We don’t have to talk about it. But that gap, where it emerges, is important: the people who haven’t lain awake at 3am with their hearts racing have lived a different year. When I say “interesting” and we share a knowing nod, we’re agreeing to skim over the discomfort and ignore the detail.

The detail is important. What's happening is important.

Many bloggers publish personal end-of-year reviews. This is mine. But it can’t be a normal one.

It’s been an interesting year.

January 20

Some of us have spent our entire lives hearing stories of concentration camps, of pogroms, of war, and of political persecution. For some, those stories were close enough that they never felt historical. I will always remember the sound my Oma made, echoing through the walls, as the nightmares took her back to those events each night. My Dad spent the first years of his life in a camp. Many other people carry those memories at least as close.

So when an administration came to power that intentionally used the ideas, the rhetoric, and, increasingly, the actions of twentieth-century European fascism, it set off emotional emergency alarms we’d spent a lifetime being prepared for. When it became clear that some people thought the threats were exaggerated, the alarms intensified. This is how fascism creeps into everyday life: through tolerance, through normalization, through people dismissing those who see it coming.

Although this post is published in my personal space, I work in a newsroom that investigates abuses of power in the public interest. My job there is to lead technology: the security, infrastructure, and publishing systems that allow the journalists to do their best work, safely. I read every story. I pay attention to the discussions on Slack and in story status meetings. I can’t look away.

So much is happening all the time. This resurgence of American fascism brings new threats — to individuals, to communities, to the newsroom where I work, to countless other companies, agencies, and organizations — and the temptation is to react to all of them at once. Everyone comes to work carrying fears inherited from their own histories. If we react to every version of those fears, we’ll be paralyzed.

This is the backdrop to everything else that has happened this year. Every other event, big and small, has happened while USAID was being dismantled, resulting in hundreds of thousands of deaths, or while 500,000 Venezuelans were stripped of their immigration status, their children zip-tied to each other in midnight raids.

My role includes responsibility for newsroom security, which means building threat models: systematic assessments of which dangers are most likely and most severe. By identifying concrete threats rather than reacting to every fear, we can build coherent strategies to counter them, and then execute on them.
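To make that concrete, here is a minimal sketch of the prioritization at the heart of a threat model: score each threat by how likely it is and how bad it would be, then work the list from the top. The threat names, numbers, and five-point scales below are illustrative placeholders, not a real assessment of any newsroom.

```python
# A minimal sketch of threat-model prioritization: likelihood times severity,
# highest risk first. Every name and number here is an illustrative placeholder.
from dataclasses import dataclass


@dataclass
class Threat:
    name: str
    likelihood: int  # 1 (unlikely) to 5 (expected)
    severity: int    # 1 (nuisance) to 5 (catastrophic)

    @property
    def risk(self) -> int:
        return self.likelihood * self.severity


threats = [
    Threat("Phishing of a reporter's account", likelihood=4, severity=4),
    Threat("Legal demand for source communications", likelihood=3, severity=5),
    Threat("Coffee-shop Wi-Fi eavesdropping", likelihood=2, severity=2),
]

# Mitigation effort goes to the top of this list first.
for threat in sorted(threats, key=lambda t: t.risk, reverse=True):
    print(f"{threat.risk:>2}  {threat.name}")
```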

But this isn’t just a newsroom problem. This new fascist movement touches every aspect of American life, and an effective counter-movement will need to address each of them. The same principles apply: what are the real threats? How might we reduce the risk of them happening? How can we make plans for mitigating the impact if they do?

My work has touched several fronts of the tension between movement and counter-movement. Each of them — AI companies, journalism, and the open social web — reveals a different failure mode, and a different opportunity to push back.

Feel the AGI

Fascism creeps in through normalization. AI does something similar: it arrives framed as inevitability, progress, and efficiency, discouraging scrutiny until resistance feels naïve.

This is the year that generative AI truly became part of the mainstream discourse. ChatGPT, Claude, and Gemini are everywhere, and a long tail of other models and services have spread into every field, from software engineering to contract law and municipal architecture.

Ubiquity doesn’t just normalize a technology. It also normalizes the way it was built, and in particular, the values and assumptions of the team that built it: the features, design, ethical considerations, and guardrails they thought were important and unimportant. These models were trained by ingesting the work of millions of people without permission; in many cases, the work of independent artists and writers has been strip-mined to build a product worth billions of dollars. Creative work was pirated. But the vendors have largely been given a pass, because their enormously valuable companies matter more than the rights of the people whose work became training data.

Generalized models like ChatGPT and Claude work better the more data you pour into them. They also depend on vast data centers filled with expensive GPUs and custom hardware. That means, in their current state, that only a handful of companies can effectively run them. As we share more intimate detail with them — our personal lives, the inner workings of our companies and organizations, anything else you can think of — we’re delivering our most private and sensitive information to those companies. And because their use is framed as inevitable, opting out increasingly feels impractical, unprofessional, or even irresponsible.

Once systems are treated as unavoidable, they become available to power without consent. When Elon Musk’s DOGE, empowered by the Trump administration, set out to remake government institutions, it gathered information from each of them and connected that data to generative AI models. In doing so, it collapsed technical and administrative separations between datasets that have protected the privacy of Americans for generations, for example by copying the Social Security numbers and other personally identifiable information of 300 million people. Flock has created a nationwide AI-powered surveillance network used by ICE and law enforcement. Palantir has built ImmigrationOS, an AI-based system to speed up extrajudicial deportations.

None of this means the technology lacks value or that individuals using these tools are complicit. The problem is structural: when useful tools require dependence on centralized corporate infrastructure, that infrastructure becomes available for authoritarian exploitation regardless of users’ intentions. If something is made possible, we should assume that authoritarians will make use of it.

The alternative exists, but it requires rejecting the framing of inevitability. Small, local language models can be trained on specific, consented datasets for specific purposes: a newsroom analyzing its own archives, a research institution working with its own corpus. They run on local infrastructure, which means the data never leaves the organization that owns it. No centralized company intermediary; no pathway for government surveillance; no pirated training data.

This approach requires treating AI as a tool rather than infrastructure: something you deploy when you need it, with data you control, rather than a utility you depend on that controls your data. It’s harder than just using ChatGPT or Claude. It might cost more upfront (although likely not in the longer term). It won’t get you the same breadth of capability, even if it excels at the specific tasks it’s designed for. But, most importantly, it keeps consent and accountability in the system.
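As a sketch of what treating AI as a tool can look like in practice, the snippet below loads a small open-weights summarization model and runs it entirely on local hardware against a document from an archive the organization owns. It assumes the Hugging Face transformers library is installed; the model name and file path are examples only, stand-ins for whatever model and corpus an organization actually controls.

```python
# A minimal sketch, not a production pipeline: a small open-weights model is
# downloaded once, then runs locally. No document text leaves this machine.
from transformers import pipeline

# Example model only; substitute any small model you can host yourself.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

# Hypothetical path into an archive the organization owns and has consent to use.
with open("archive/2019-courts-investigation.txt") as f:
    story = f.read()

# Rough truncation keeps the input within the model's context window.
result = summarizer(story[:3000], max_length=120, min_length=40)
print(result[0]["summary_text"])
```

Fine-tuning a model of this size on a consented corpus follows the same pattern: the weights, the data, and the outputs all stay on infrastructure the organization controls.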

This year, centralized AI became part of the backdrop. The companies behind it have enormous momentum: capital, talent, network effects, and now government partnerships. They’ve normalized dependence. But dependence on a technology that centralizes the capture of our most private data is not inevitable; a choice has been made to make it look that way. Recognizing that is the first step toward choosing differently.

Truth to power, truth from power

This year, it became clearer than ever before that while some journalists and newsrooms seek to speak truth to power, others seek to speak truth on behalf of power.

Most glaringly, Bari Weiss, founder of The Free Press, has become editor-in-chief of CBS News after the network’s parent company, Paramount, was acquired by David Ellison’s Skydance Media. Ellison is the son of Oracle’s Larry Ellison, who has been described as a “shadow President” in the Trump administration. Recently, she chose to spike a 60 Minutes segment about CECOT, the brutal prison in El Salvador that has received people extrajudicially deported by the Trump administration. The correspondent who reported the piece accused her of being politically motivated.

At the same time, Warner Bros. Discovery is up for sale, and the President has been vocal that CNN, which it wholly owns, should be either sold with it or broken off and sold separately. Skydance has made a bid for it too, one that includes an irrevocable personal guarantee from Larry Ellison. If successful, the Ellisons would control both CBS and CNN.

Meanwhile, Twitter, which is both where many people receive their news and the place online where journalists historically hung out, was acquired by Elon Musk and rapidly reshaped into a Trump-aligned network. Another place where people learn about the events of the day is late night talk shows; Colbert’s show was canceled by CBS (its final episode will air next May), while ABC temporarily pulled Jimmy Kimmel’s show until a public outcry forced its hand.

Trump killed the Corporation for Public Broadcasting, which immediately undermined funding for NPR and PBS stations across the country. Hundreds of local stations are at risk of closure. In some cases, these stations are vital public infrastructure: the only way residents learn about safety announcements. Beyond that, in locations with no local news coverage, we know that local government and police corruption skyrockets.

The news industry was already vulnerable: financially weakened by the web, editorially compromised by long-standing failures to challenge power. There have always been newspapers that reliably took the government line on foreign wars, for example: speaking truth from power isn’t a new idea. Over the last year alone, Reporters Without Borders described a media blackout on the war in Gaza; The New York Times failed to include trans voices in a majority of stories about trans issues. These were self-inflicted failures.

But this is different: Trump-aligned oligarchs are in the process of systematically acquiring editorial control over the ways we learn about the world, with the administration openly signaling which outlets should be sold and to whom. Stories, entire shows, and public media networks are already being pulled. If they’re successful, a significant chunk of how America learns about the world will be under the control — directly and indirectly — of the administration.

The capture isn’t total yet, and it’s not uncontested. It’s bleak out there, but there are points of light, if you know where to look. News startups — small newsrooms that are often run by women, people of color, and LGBTQ+ people, and that are more likely to be worker-run co-operatives — continue to speak truth to power. In the spirit of disruption theory, they are likely to be overlooked by the larger networks until they’re too big to ignore. ProPublica, too, is making good on its mission to spur real-world change using investigative journalism as an instigating force.

Look for the helpers

As I write this, my three-year-old is sleeping in the next room. I worry about what kind of world he’ll grow up into: was the relative peace and freedom of the post-Cold War decades an aberration, or will we bounce back into a democratic openness where everyone has the opportunity to lead a good life? The idea that we might be descending into a permanent authoritarianism terrifies me. That we were led here in part by the kinds of connective technology I used to love is deeply unsettling. When I wake up at 3am with my heart pounding, it’s not for me; it’s for him.

We need to find our way back. But how?

Fred Rogers, who memorably stood in front of the US Senate in 1969 to defend public media funding, had some famous advice about what to do when things seem bad:

“When I was a boy and I would see scary things in the news, my mother would say to me, 'Look for the helpers. You will always find people who are helping.' To this day, especially in times of 'disaster,' I remember my mother's words, and I am always comforted by realizing that there are still so many helpers — so many caring people in this world.”

Sure, there are people who haven’t felt the fear, who even now don’t see the depth of the troubles we find ourselves in. But there are also helpers; people who care.

I mentioned news startups. They help eradicate news deserts, represent underheard voices, and offer a meaningful alternative to larger media outlets that might be more susceptible to oligarchic capture. In part because they have less to lose, they do a far better job of getting to know their communities and being real with them. Their small size makes them more nimble and more accountable to the communities they serve. Organizations like Tiny News Collective do a good job of supporting them.

News is undergoing a kind of forced transformation, which I think is largely positive. News Product Alliance is instigating product thinking in newsrooms that might never otherwise have considered who their readers actually are. It’s also bringing together builders in news in meaningful ways, including helping newsrooms to think about how technology choices are importing someone else’s values into their ecosystems.

And there’s the open social web movement. At its worst, it’s a collection of nerds scratching their own itches. The movement doesn’t always understand its place in the current context, and why this work really matters. But at its best, the people involved deeply understand that the change they have the potential to bring about goes far beyond the internet. Projects like Mastodon and Bluesky — and a long tail that includes Bonfire, Nostr, Pixelfed, and more — provide viable alternatives to corporate-owned networks. In the same way that nobody can own the web, nobody can own the Fediverse: that means there’s no single point of failure, and no single corporate owner that can capitulate to an authoritarian.

These platforms have friction, for sure, and there’s work to be done to make them more usable, but in my opinion friction isn’t all bad. It’s what makes these networks so hard to own, and what prevents them from being the subject of the kinds of influence campaigns that led to the current authoritarianism in the first place. At FediForum, the open social web conference, I delivered a keynote that tried to galvanize the community into solving real problems and stepping up to confront the dark place our society finds itself in. I hope that these platforms can be used to amplify the kinds of organizing and mutual aid that offline activists are already engaging in every day.

Each of these efforts has an implicit threat model, but could use an explicit one: a shared articulation of the threats that allows organizations, projects, and movements to work together to provide real solutions.

These ideas — alternatives for journalism, AI, the social web — might seem disconnected, but they are all part of a pushback on the kinds of centralized wealth and power that led us here. With a little bit more organizing effort, I believe they can be coordinated and effective. With a little luck, we might even win.

I’m scared for my kid and I’m scared for all of us.

It’s been an interesting year.

Now what?