Notable links: February 20, 2026

Why aren't newsrooms sharing and innovating? And more.

Most Fridays, I share a handful of pieces that caught my eye at the intersection of technology, media, and society.

This week: the trouble with innovation and revenue in news, and some of the societal forces (political manipulation, exploitation) that make figuring those things out vital.

Did I miss something important? Send me an email to let me know.


Journalism lost its culture of sharing

I agree, strongly, with this piece about (re)building an open source culture in news by Scott Klein and Ben Welsh. But then, I would: I spent over a decade working to build open source communities, and then another decade and change working alongside and then inside newsrooms.

So it’s to my chagrin that the newsroom where I currently serve as Senior Director of Technology is one of the places listed here where open source contributions have significantly dropped off:

“At ProPublica, teams published detailed white papers alongside major investigations, explaining their quantitative methodologies with scientific rigor, allowing other researchers to verify and learn from their work. Major news organizations ran active blogs where they shared techniques and lessons learned. Conference presentations at NICAR and elsewhere became venues for passing along hard-won knowledge.”

This work didn’t just lift the journalism itself; it attracted new people to the field:

“This culture made newsrooms more attractive places to work for civic-minded technologists. If you had programming skills and wanted to use them to make a difference, journalism offered you the chance to build things that mattered and share them with the world.”

I think there’s a lot to be gained by collaborating on an open source basis. We typically run small, resource-constrained teams, which makes building new software hard. And we have problems that, if they’re not identical, are at least significantly overlapping; by not collaborating on them, we perpetuate an ecosystem where low-resource organizations all solve the same sorts of problems in parallel, each with very few people and very little money.

I was present at the News Product Alliance Summit session described in this piece, and I think the analysis of both the causes of this decline and some of the solutions is spot on. I was particularly enamored of the idea of an Open Source Editor (or director — does everything in news need to be an editor?) and of public recognition for great open technical work in the field of journalism.

I think it’s also worth saying that open source, done well, is about much more than just releasing your code. A good open source project is a community, not a package. So there’s a lot of ecosystem development and community management involved in fostering the kind of real collaboration that’s required for this to succeed — even after newsrooms have overcome the institutional hurdles to releasing their work in the first place.

I’m really grateful that Scott and Ben have been championing this cause. I’m right there with them, and I’ll do what I can to help. It’s a concrete way we can build a more successful, efficient news ecosystem with stronger technology capabilities, and that’s something we should all want.


Stop calling optimization "innovation."

I appreciate this distillation of the twin needs of the Engine, which is all about optimization (getting as much value as you can out of your existing business model), and the Explorer, which is all about actual innovation (seeking out new products, markets, and models).

“If your staff meetings are all about how to hit next month’s KPIs, you don’t have an Explorer. You have a very well-oiled engine. True resilience means insulating your Explorer team from the Engine. It means giving a team room to spend 6 months on a project that could totally flop without punishing them if it does.”

I think this is clearly true. At the same time, I think it’s very optimistic about where many organizations actually are: they very often don’t have those goals or KPIs to hit. The result is a kind of vibes-based strategy. Because nothing is measured, or the right things aren’t measured, it’s impossible to run an informed experiment.

In those organizations, what feels like innovation is just getting to baseline competence. Before they can optimize, they need to define a concrete strategy, with attendant metrics they can measure as the basis for running experiments. Buying a neat new product can be a way to absolve the team from doing the hard work of strategy-building: “look,” they can tell their boards, “we’re innovative!”

Creating a concrete strategy and deploying technology that can help serve it are vital. But they, in themselves, aren’t innovation: creating a real culture of innovative experimentation where you can try new things and fail fast is how you de-risk your business for the future. That means understanding your readers incredibly well, so you can anchor your experiments around their needs; it means giving your team the permission to fail; it means creating cross-functional teams who can be radically collaborative and draw conclusions from their experiments quickly; and it means being clear-eyed about where your business actually stands.


The political effects of X’s feed algorithm

Users who moved from a reverse-chronological social media feed to X’s algorithm:

“[…] were 4.7 percentage points more likely to prioritize policy issues considered important by Republicans, such as inflation, immigration and crime. They were also 5.5 percentage points more likely to believe that the investigations into Trump are unacceptable, describing them as contrary to the rule of law, undermining democracy, an attempt to stop the campaign and an attack on people like themselves.”

And even more surprisingly, once the algorithm was switched off, their views did not change again. The effect of the algorithm lingered, in part because it led users to follow more conservative influencers.

We intuitively knew that the algorithm mattered, but this is a key finding that puts numbers to it. If that number seems small to you, consider that 4.7 percentage points is more than enough to swing an election. It’s also interesting that the findings for other algorithms were different; if this result holds up, it suggests that X’s algorithm may be particularly predisposed to political manipulation, even more so than Facebook’s and Instagram’s.

This should be a wake-up call for politically engaged funders and anyone who cares about civil society. It’s not that we need less conservative algorithms; it’s that whoever controls the algorithms has a disproportionate say over the electorate’s view of the world.

We need more funding for open protocols that decentralize algorithmic ownership; for open platforms that give users a choice of algorithm and platform provider; and for algorithmic transparency across our information ecosystem.


Palantir vs. the "Republik": US analytics firm takes magazine to court

A series of articles by Switzerland’s Republik magazine highlighted Palantir’s rejection by Swiss authorities as a potential security risk: the authorities appear to have determined that there weren’t sufficient protections against Swiss data falling into American hands. This reporting, in turn, led other governments to question their use of the firm for the same reason. Now Palantir is taking the magazine to court to force it to publish a “counterstatement” that would correct the record.

Of course, this has brought more international attention to Republik’s stories than they would otherwise have received:

“By going to court, Palantir has generated more attention for the "Republik" reporting than the contested articles themselves could have caused – 23 years after Barbra Streisand triggered the effect named after her. And yet, there are reasons why Palantir is acting this way.”

A Swiss counterstatement doesn’t actually hinge on the correctness of the original statement: it’s apparently sufficient for another version of events to be possible. So this is more a way for Palantir to get its own PR line out than a serious attempt to prove that Republik’s reporting was inaccurate.

That’s important because Palantir is trying to make headway into European markets and finding it tougher than it would like. Understandably, there’s a lot of resistance to a firm that provides surveillance powers to the likes of ICE, and whose CEO has spent the last few years justifying “anti-woke” strategies that bolster an increasingly authoritarian regime.


In Graphic Detail: Subscriptions are rising at big news publishers – even as traffic shrinks

This is exactly why micropayments — a model akin to Spotify’s streaming payments where each pageview receives a share from a reader’s monthly budget for all articles — are not the right solution for news.

“For a bunch, including The New York Times and The Wall Street Journal, growth isn’t just continuing, it’s speeding up, and likewise so is The Guardian’s paid reader contribution model. Meanwhile, Bloomberg’s subscription business shows signs of normalization after a 2024 spike, and Daily Mail is still ramping up its relatively new subscription business, which launched in 2024 in the U.K. and expanded to the U.S. and Canada in February 2025.”

In news, value is not necessarily tethered to raw traffic. There’s a specific demographic (typically older, wealthier, and more highly educated – see the next link) that is more likely to pay for it, and there’s a lot for news organizations to gain by optimizing for that audience. The news organizations that have doubled down on paywalls, and things like them, are often doing better than the ones that haven’t.

That can be a tough pill to swallow for the folks — like me — who believe that news should be available to all for the good of democracy. Of course, other models are available: specifically, non-profit newsrooms that operate on a philanthropic model. As with other public goods like Wikipedia and the Internet Archive, it turns out that a specific set of wealthier individuals and foundations is willing to pay to ensure that a resource can be made available to everyone.

Unlike paywalls, though, that tends to put newsrooms at the mercy of large foundations and high-net-worth individuals. Non-profit newsrooms have done a good job of trying to prevent funding from coming with strings that might affect their decision-making (The 19th’s endowment campaign is particularly inspiring), but some of that influence inevitably still creeps in. Paywalls force the issue by ensuring every reader pays, distributing the load: they democratize funding even while restricting access. On the other hand, that makes the newsroom more subject to market forces.

But none of this is about traffic. If you tether your payment model to the number of public pageviews you receive, you incentivize your newsroom to create clickbait. You’re ensuring that you have to compete for views for every single article, instead of building a direct relationship with a recurring member who is buying your product because they think it’s worth it overall.
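
To make that incentive concrete, here's a minimal sketch of how a Spotify-style pro-rata split could work, assuming each reader's flat monthly budget is divided among outlets in proportion to their share of that reader's pageviews. The function name and the numbers are mine, purely for illustration; they aren't drawn from any real micropayments product.

```python
# Minimal sketch of a Spotify-style pro-rata micropayment split.
# Hypothetical function and numbers, for illustration only.

def pro_rata_split(monthly_budget: float, views_by_outlet: dict[str, int]) -> dict[str, float]:
    """Divide one reader's monthly budget across outlets by their share of pageviews."""
    total_views = sum(views_by_outlet.values())
    if total_views == 0:
        return {outlet: 0.0 for outlet in views_by_outlet}
    return {
        outlet: monthly_budget * views / total_views
        for outlet, views in views_by_outlet.items()
    }

# One reader, a $10/month budget, 50 pageviews in total:
payouts = pro_rata_split(10.00, {
    "investigative-outlet": 1,   # one long investigation, read once
    "clickbait-outlet": 49,      # dozens of quick hits
})
print(payouts)  # {'investigative-outlet': 0.2, 'clickbait-outlet': 9.8}
```

Under that kind of split, the revenue from an expensive investigation depends entirely on how many clicks it draws relative to everything else the reader saw that month, which is exactly the dynamic that pushes newsrooms toward volume over depth.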


Most Americans don’t pay for news and don’t think they need to

Only 8% of participants in a new Pew survey say that individual Americans have a responsibility to pay for news.

Some of the quotes here made me pause:

“I don’t pay to go to church, to get a spiritual message, you know? And if you’re true, and your mission is to relay facts that are fundamentally important for people’s well-being, do I need to pay you for that?”

It’s hard to know how to even begin to answer that: the comparison chafes for me, but it amounts to putting both church and news into a “public good” bucket. That people see news in that way is probably good. Providing it for free is hard, but you can see how they got there: a newspaper is a physical object you can imagine handing over dollars for, while digital news feels like it’s in the ether. It perhaps points to a philanthropic model as the best fit: depending on wealthy donors and foundations to give everyone free access makes some sense.

This also puts paid (so to speak) to micropayment solutions, which I’m generally skeptical of anyway. If nobody sees the need to pay for news, convincing them to fund a wallet feels like an uphill battle.

Meanwhile, the people most likely to pay directly for news are older, wealthier, liberal Democrats. Again, not a surprise, but useful to have it laid out like this; many newsrooms I’ve spoken to are trying to figure out how to move away from a base of older, wealthier, left-leaning people, and, well, that pattern isn’t unique to them. Maybe it’s worth leaning into that base for funding and concentrating on finding a broader audience for the news itself.


Everyone is stealing TV

It makes sense that people don't want to be limited by regional geoblocks to get their content – but I don’t think these devices should be trusted.

“It’s called the SuperBox, and it’s being demoed by Jason, who also has homemade banana bread, okra, and canned goods for sale. “People are sick and tired of giving Dish Network $200 a month for trash service,” Jason says. His pitch to rural would-be cord-cutters: Buy a SuperBox for $300 to $400 instead, and you’ll never have to shell out money for cable or streaming subscriptions again.”

From a user perspective, I see the appeal: I certainly have subscription fatigue. Beyond that, geoblocks are intensely irritating to me; I’d give anything to be able to watch the UK’s Channel 4 News, or the Doctor Who spinoff The War Between the Land and the Sea, both of which are unavailable to me unless I want to dive into VPNs and break terms of service. A box that gives me what I want to watch, no questions asked, seems too good to be true.

It’s not fully clear who is manufacturing these devices, what’s on them, or who runs the services that allow people to access all this television without paying for it. We already know that some streaming boxes have been fronts for residential botnets that have been used for illicit activities that run the gamut from avoiding scraper detection to real organized crime. If I wanted to run malware inside the networks of thousands of homes and businesses, this wouldn’t be a bad way to go about it.

Which is a shame, because the allure is real. I’d pay for all that unavailable television. Just, please, let me.


Hiring in an era of fake candidates, real scams and AI slop

Andrew Losowsky discusses the impact of AI on his hiring process:

“Within 12 hours of posting the role, we received more than 400 applications. At first, most of these candidates seemed to be genuine. However, as the person who had to read them all, I quickly saw some red flags, which were all clear indicators of inauthenticity.”

These jibe with what I’ve seen lately too. I’ve had the privilege of hiring for a few technical roles over the last year, and every single time, almost everything Andrew mentions has come up.

The good news, as he points out, is that right now there are some really strong tells. One of the most important parts of any hiring process I run is the “why are you excited about this job?” question, which is really a question about mission fit. The AI-generated answers are extremely generic, heavily reference the job description itself, and start to look very samey across a sample of hundreds.

Here’s the thing I don’t believe I’ve encountered before:

“Someone made a fake email address similar to ours, then sent generic technical “tests” containing our logo to jobseekers, while linking to our job ad. Completing these tests led to a fake contract signed by someone claiming to be our CEO – it was at this point that the scammers requested financial information, saying they needed it to issue payments.”

The thing is, without someone telling me about it, how would I know? This is where we need stronger tools – the anti-spam protections of yore don’t work very well against AI-powered scams. Centralized repositories of known scammers and stronger anti-spam filters may help, but I suspect we’re going to need to find other approaches. Impersonating an employer to make some quick money is one thing (and bad enough), but when you consider that in both Andrew’s case and mine we’re talking about impersonating newsrooms, this could get very bad very quickly.