 

Over the Edge: The Use of Design Tactics to Undermine Browser Choice

"In order to be able to choose their own browser, people must be free to download it, easily set it to default and to continue using it – all without interference from the operating system. Windows users do not currently enjoy this freedom of choice."

What's interesting to me is that this is very similar to the tactics that got Microsoft into hot antitrust water a few decades ago. And here it is again: research showing that Microsoft is prioritizing its Edge browser in Windows. New browser, same dark pattern.

[Link]

· Links · Share this post

 

A reminder that the whole point of open source, federated technologies is that there doesn't have to be one winner. It's not a market where every vendor is trying to be a monopoly. It's about building a bigger, collaborative pie.

· Statuses · Share this post

 

New study says the world blew past 1.5 degrees of warming four years ago

"Limiting average global warming to 1.5 degrees Celsius, or 2.7 degrees Fahrenheit, above preindustrial levels has been the gold standard for climate action since at least the 2015 Paris Agreement. A new scientific study published in the peer-reviewed journal Nature Climate Change, however, suggests that the world unknowingly passed this benchmark back in 2020."

Not so great, but what's cool here is how they determined this: by analyzing strontium-to-calcium ratios in a species of sea sponge that lives for hundreds of years. Previously, we'd only been able to determine ocean temperatures starting in 1850, when the Industrial Revolution was already underway.

This new analysis suggests that the pre-industrial oceans were cooler than had been previously understood, meaning we may be 20 years further along the global warming curve than we'd known. Even more reason to take dramatic action now.

[Link]

· Links · Share this post

 

Public Funding of Journalism Is the Only Way

"If your position is that public money will irrevocably taint journalism but the biggest companies in America buying ads will not, I submit that you have not thought about this issue very deeply."

I don't know how I feel about publicly funded media, although I couldn't be a bigger fan of independent public media entities like the BBC and Channel 4. What I do think is that we're a long way from a US government administration that will actually do that while guaranteeing freedom from interference.

"Today, I am just trying to make a singular, clarifying point: We need to build a large, continual public funding stream for journalism not because it is an easy task, but because it is the only way. Stop looking for magical alternative solutions."

This, on the other hand, may turn out to be true.

[Link]

· Links · Share this post

 

Semafor reporters are going to curate the news with AI

"As social traffic collapses and Google makes ominous AI-powered sounds about search, publishers across the board have started to reemphasize their websites as destinations, and that means there are a lot of new ideas about what makes websites valuable again." A lot of which look like blogging.

Semafor Signals, described in this piece, may be AI-augmented, but it really comes down to a collection of links that form an umbrella story, with some context from an editor to link it all together.

What's groundbreaking here is the newsroom tool used to produce it, not the product itself. And that's where AI - and a lot of other technology - becomes more interesting. Not as a way to replace journalists or churn out content at speed, but as a way to give them more information to work with in order to produce work (written and created by humans) that might not have been possible otherwise.

[Link]

· Links · Share this post

 

Book: The Future, by Naomi Alderman

"The only way to predict the future is to control it." An interesting idea that powers a book that has a lot to say about 21st century oligarchy and our relationship to technology. There's one conclusion that hits home particularly hard; I can't describe it without spoiling the story, but I'm glad it's there.

If I have a criticism, it's that the author has so many ideas to share that they sometimes burst the seams of the thriller that forms this novel's page-turning center. But I enjoyed every minute, nodding along and wondering what was going to happen next.

[Link]

· Links · Share this post

 

Three variations on Omelas

The Ones Who Walk Away From Omelas, by Ursula K. Le Guin:

They all know it is there, all the people of Omelas. Some of them have come to see it, others are content merely to know it is there. They all know that it has to be there. Some of them understand why, and some do not, but they all understand that their happiness, the beauty of their city, the tenderness of their friendships, the health of their children, the wisdom of their scholars, the skill of their makers, even the abundance of their harvest and the kindly weathers of their skies, depend wholly on this child’s abominable misery.

The Ones Who Stay and Fight, by N.K. Jemisin:

But this is no awkward dystopia, where all are forced to conform. Adults who refuse to give up their childhood joys wear wings, too, though theirs tend to be more abstractly constructed. (Some are invisible.) And those who follow faiths which forbid the emulation of beasts, or those who simply do not want wings, need not wear them. They are all honored for this choice, as much as the soarers and flutterers themselves—for without contrasts, how does one appreciate the different forms that joy can take?

Why Don’t We Just Kill the Kid in the Omelas Hole, by Isabel J. Kim:

So they broke into the hole in the ground, and they killed the kid, and all the lights went out in Omelas: click, click, click. And the pipes burst and there was a sewage leak and the newscasters said there was a typhoon on the way, so they (a different “they,” these were the “they” in charge, the “they” who lived in the nice houses in Omelas [okay, every house in Omelas was a nice house, but these were Nice Houses]) got another kid and put it in the hole.

· Posts · Share this post

 

Zuckerberg's Going to Use Your Instagram Photos to Train His AI Machines

During Meta's earnings call, Mark Zuckerberg said that Facebook and Instagram data is used to train the company's AI models.

“On Facebook and Instagram, there are hundreds of billions of publicly shared images and tens of billions of public videos, which we estimate is greater than the Common Crawl dataset and people share large numbers of public text posts in comments across our services as well.”

He's playing to win: one unstated competitive advantage is that Meta actually has the legal right to use training data generated on its own services. It's probably not something most users are aware of, but by posting content there, they grant the company rights to use it. If OpenAI falls afoul of copyright law, Meta's tech has a path forward.

It's a jarring thought, though. I'm certainly not keen on a generative model being trained on my son's face, for example. I'm curious how many users will feel the same way.

[Link]

· Links · Share this post

 

The four phases

A fictional mainframe

This post is part of February’s IndieWeb Carnival, in which Manuel Moreale prompts us to think about the various facets of digital relationships.

Our relationship to digital technology has been through a few different phases.

One: the census

In the first, computers were the realm of government and big business: vast databases that might be about us, but that we could never own or interrogate ourselves. Companies like IBM manufactured room-sized (and then cabinet-sized) machines that took a team of specialized technicians to operate. They were rare and a symbol of top-down power.

Punch cards were invented in the 1880s, and were machine-sortable even then, although not by anything we would recognize as a computer today. In the 1930s, a company called Dehomag, which was a 90%-owned subsidiary of IBM, used its punch card census technology to help the German Nazi party ethnically identify and sort the population. (Thomas Watson, IBM’s CEO at the time, even came to Germany to oversee the operation.)

ENIAC, the first general-purpose digital computer, was initially put to use to determine the feasibility of the H-bomb. Other mainframe computers were used by the US Navy for codebreaking, and by the US Census Bureau. By the sixties and seventies, though, they were commonplace in larger corporate offices and in universities for non-military, non-governmental applications.

Two: the desk

Personal computers decentralized computing power and put it in everybody’s hands. There was no overarching, always-on communications network for them to connect to, so every computer had its own copy of software that ran locally on it. There was no phoning home; no surveillance of our data; there were no ad-supported models. If you were lucky enough to have the not-insignificant sum of money needed to buy a computer, you could have one in your home. If you were lucky enough to have money left over for software, you could even do things with it.

The government and large institutions didn’t have a monopoly on computing power; theoretically, anyone could have it. Anyone could write a program, too, and (if you had yet more money to buy a modem) distribute it on bulletin board systems and online services. Your hardware was yours; your software was yours; once you’d paid your money, your relationship with the vendor was over.

For a while, you had a few options to connect with other people:

  • Prodigy, an online service operated as a joint venture between CBS, IBM, and Sears
  • CompuServe, which was owned and run by H&R Block
  • America Online, which was originally a way for Atari 2600 owners to download new games and store high scores
  • Independent bulletin boards, which were usually a single computer connected to a handful of direct phone lines for modems to connect to, run by an enthusiast

(My first after-school job was as a BBS system operator for Daily Information, a local information and classifieds sheet in my hometown.)

In 1992, in addition to bulletin board systems and online services, the internet was made commercially available. Whereas BBSes, AOL, and the rest were distinct walled gardens, any service connected to the internet could reach any other service. It changed everything. (In 1995, my BBS job expanded to running what became one of the first classifieds websites.)

But for a while, the decentralized, private nature of personal computing remained. For most private individuals, connecting to the internet was like visiting a PO box: you’d dial in, upload and download any pending email, browse any websites you needed to, and then log off again. There was no way to constantly monitor people, because internet users spent 23 hours of the day disconnected from the network.

Three: the cloud

Broadband, the iPhone, and wifi changed everything. Before the advent of broadband, most people needed to dial in over their phone line to go online. Before the iPhone, cellular data was expensive and there was very little bandwidth to go around. Before wifi, a computer needed to be physically connected with a cable to go online.

With broadband and wifi, computers could be connected to the internet 24/7. With the iPhone, everyone had a computer in their pocket that was permanently connected and could constantly send data back to online services — including your location and who was in your address book.

It was incredibly convenient and changed the world in hundreds of ways. The web in particular is a modern marvel; the iPhone is a feat of design and engineering. But what we lost was the decentralized self-ownership of our digital worlds. More than that, we lost an ability to be private that we’d had since the beginning of human civilization. It used to be that nobody needed to know where you were or what you were thinking about; that fundamental truth has gone the way of the dinosaur.

Almost immediately, our relationship to software changed in a few key ways:

  • We could access all of our data from anywhere, on any device.
  • Instead of buying a software package once, we were asked to subscribe to it.
  • Instead of downloading or installing software, the bulk of it could be run in a server farm somewhere.
  • Every facet of our data was stored in one of these server farms.
  • More data was produced about us as we used our devices — or even as we walked through our cities, shopped at stores, and met with other people — than we created intentionally ourselves.

While computing became infinitely easier to use and the internet became a force that changed global society in ways that I still believe are a net positive, surveilling us also became infinitely easier. Companies wanted to know exactly what we were likely to buy; politicians wanted to know how we might vote; law enforcement wanted to know if we were dangerous. All paid online services to build profiles about us that could be used to sell advertising, could be mined by the right buyer, and could even be used to influence elections.

Four: the farm

Our relationship is now changing again.

Whereas in the cloud era we were surveilled in order to profile us, our data is now being gathered for another set of reasons. We’re used to online services ingesting our words and actions in order to predict our behaviors and influence us in certain directions. We’re used to Target, for example, wanting to know if we’re pregnant so they can be the first to sell us baby gear. We’re not used to those services ingesting our words and actions in order to learn how to be us.

In our new relationship, software isn’t just set up to surveil us to report on us; it’s also set up to be able to do our work. GitHub Copilot learns from software we write so that it can write software automatically. Midjourney builds stunning illustrations and near-photorealistic images. Facebook is learning from the text and photos we upload so it can create its own text and realistic imagery (unlike many models, from data it actually has the license to use). Far more than us being profiled, our modes of human expression are now being farmed for the benefit of people who hope to no longer have to hire us for our unique skills.

In the first era, technology was here to catalogue us.

In the second, it was here to empower us.

In the third, it was here to observe us.

In the fourth, it is here to replace us.

We had a very brief window, somewhere between the inception of the Homebrew Computer Club and the introduction of the iPhone, where digital technology heralded distributed empowerment. Even then, empowerment was hardly evenly distributed, and any return to decentralization must be far more equitable than it ever was. But we find ourselves in a world where our true relationship is with power.

Of course, it’s a matter of degrees, and everything is a spectrum: there are plenty of services that don’t use your data to train generative AI models, and there are plenty that don’t surveil you at all. There are also lots of applications and organizations that are actively designed to protect us from being watched and subjugated. New regulations are being proposed all the time that would guarantee our right to privacy and our right to not be included in training data.

Those might seem like technical decisions, but they’re really about preserving our ownership and autonomy, and returning those things to us when they’ve already been lost. They’re human, democratic decisions that seek to enforce a relationship where we’re in charge. They’re becoming more and more important every day.

· Posts · Share this post

 

‘The Messenger’ Implosion Once Again Shows The Real Problem With U.S. Journalism Is Shitty Management By Visionless, Fail-Upward Brunchlords

"If you’ve spent any time in journalism, it’s completely wild to think about what a small team of smart, hungry journalists and editors could do with $50 million. It’s enough to staff a team of hard-nosed ProPublica-esque journalists for the better part of the next decade."

While we're here, might I suggest donating to ProPublica so those hard-nosed journalists can stick around to do exactly that?

[Link]

· Links · Share this post

 

P&B: Winnie Lim

A lovely interview with Winnie Lim, whose deeply human, beautifully written blog is one of my absolute must-reads.

This spoke to me, except substitute Oxford for Singapore: "I felt very alienated and lonely as a young person in the 1990s. It was incredible to discover the internet and know there is an entire world out there, that there are actually many people living diverse lives that were not visible or encouraged in Singapore."

Winnie and I both worked at Medium at different times, and yet we both have a very strong own-your-own-domain philosophy. Her blogging story is really similar to mine, even if the content of her blog is very much her own.

Just a complete pleasure to read.

[Link]

· Links · Share this post

 

I’m genuinely thinking about starting a new blog about my experiences of fatherhood. It would live on a new domain rather than being part of my usual tech journaling. Too much?

· Statuses · Share this post

 

I think Tim Burton is exactly the wrong person to remake Attack of the 50 Foot Woman. I'd watch Greta Gerwig's take on it in a heartbeat, though.

· Statuses · Share this post

 

OpenAI says there’s only a small chance ChatGPT will help create bioweapons

"OpenAI’s GPT-4 only gave people a slight advantage over the regular internet when it came to researching bioweapons, according to a study the company conducted itself." Uh, great?

"On top of that, the students who used GPT-4 were nearly as proficient as the expert group on some of the tasks. The researchers also noticed that GPT-4 brought the student cohort’s answers up to the “expert’s baseline” for two of the tasks in particular: magnification and formulation." Um, splendid?

"However, the study’s authors later state in a footnote that, overall, GPT-4 gave all participants a “statistically significant” advantage in total accuracy." Ah, superb?

[Link]

· Links · Share this post

 

Anti-scale: a response to AI in journalism

"It should be obvious that any technology prone to making up facts is a bad fit for journalism, but the Associated Press, the American Journalism Project, and Axel Springer have all inked partnerships with OpenAI."

The conversation about AI at the Online News Association conference last year was so jarring to me that I was angry about it for a month. As Tyler Fisher says here, it presents an existential risk to the news industry - and beyond that, following a FOMO-driven hype cycle rather than building things based on what your community actually needs is a recipe for failure.

As he puts it: "Instead of trying to compete, journalism must reject the scale-driven paradigm in favor of deeper connection and community." This is the only real path forward for journalism. Honestly, it's the only real path forward for the web, and for a great many industries that live on it.

[Link]

· Links · Share this post

 

The Messenger Shuts Down—And Some Thoughts About Why It Ever Happened

Josh Marshall on The Messenger: "It really is like if you were on a parachute jump and some cocky idiot just jumped out of the plane with no chute saying he had it covered and, obviously, plummeted to the ground and died."

Beyond the well-deserved snark, this is actually a great breakdown of what went wrong here, and why businesses like The Messenger don't work anymore. The scale-advertising-social equation is obsolete.

Forgive me if it sounds like I'm banging some sort of drum, but you really do need to build deeper relationships through community, get to know the people you're serving, and build something that meets their unmet needs incredibly well. A content farm ain't it.

[Link]

· Links · Share this post

 

The circular Tube map

Transport for London have redesigned the Tube map in concentric circles as part of a promotional partnership with a phone company. Just one of the many, many ways public transit is desperately grasping for funds all over the world.

Here in Philly, SEPTA is working to rename stations based on corporate sponsorships. The Tube actually did this once before, renaming Bond Street to Burberry Street for London Fashion Week. That (as well as these new maps, presumably) was temporary; SEPTA's renamings are permanent.

I don't blame transit authorities for trying to make up for budget shortfalls however they can. But it's also sad. Public transit is an important public good; it's a real shame that we can't seem to fully fund it from the public purse. The point is not for transit to be profitable; it's to provide real infrastructure that lifts everybody up.

[Link]

· Links · Share this post

 

Our Redesigned Byline Pages

"Research has shown that the more readers know about our reporters, the more likely they are to understand the rigors of our journalistic process and trust the results." So the NYT enhanced its journalist profiles to make them more human.

People trust people, not brands. The design makes sense: it deepens the relationship between a reader and the journalist whose work they're interacting with.

I think these are just the first steps of that humanization, though. Newsrooms need to transition from thinking about "audience" to thinking about "community": from a one-way broadcast relationship to the kind of two-way conversation the internet was built for.

[Link]

· Links · Share this post

 

The rebranding of DEI as an "elite" concern is incredible to me. It's literally about including people from oppressed communities.

· Statuses · Share this post

 

Stripping the web of its humanity

Tiny robot buddies

I tried Arc Search, the new mobile app from the Browser Company. Its central insight is that almost every mobile browsing session starts with a web search; rather than giving you the usual list of results, it prioritizes building a web page for you that contains all the information you asked for. That way, the theory goes, you can see the information you need and be on your way faster. (You can still fall back to a Google search, although given that Google is going down the same path, that might not be as differentiated an experience — or as correct — as you might think.)

Obviously, I searched for myself. (Admit it, you would too.) Here’s what it gave me:

Arc Search displaying a page for my name

I have a few notes:

  • This is a photo of my friend Tantek Çelik. I have a lot of respect for Tantek, and I think he’s done a lot to consistently work for the open web in a way that most of the rest of us haven’t always been able to. I also miss hanging out with him. But I am not him.
  • I am not the Chief Technology Officer at The 19th. I was, but I haven’t been there for almost a year. Tyler Fisher is the CTO at The 19th, where he’s doing an excellent job.
  • It’s not in the screenshot, but it also claims that I’m a contributor to The 19th. I wish I was a journalist of that calibre, but let’s be clear, I’m a web developer who blogs.

If this were an ordinary web search, you could easily see that the pages describing me as the CTO at The 19th are old, and that the photo is of Tantek. But it’s not an ordinary search at all, and Arc Search has presented the information as factual, without context or attribution, as a very simple website. There are three representative links presented further down the page, but in a way that is disconnected from the facts themselves.

I’m flattered by the comparison to Tantek, and while I’m not that keen on having my job position misrepresented, it’s not catastrophic. In other words, it could be worse. But consider if you weren’t just doing a vanity search, and instead were looking for something that actually mattered.

I’m also troubled by the role of training models themselves. When you remove attribution and display facts in this way, you give the appearance of objectivity without the requirement to actually be objective. It all comes down to which sources it checks and how the model is trained to report back — in other words, the biases of the developers.

If I search for “who should I follow in AI?” I get the usual AI influencers, with no mention of Timnit Gebru or Joy Buolamwini (who would be my first choices). If I ask who to follow in tech, I get Elon Musk. It undoubtedly has a lens through which it sees the world. That’s fine in itself — everyone does — but by removing context, you remove the clues that help you figure out what it is.

Finally, obviously, the sources themselves are automatically browsed by the app but don’t see the benefit of a human visitor. Real people sometimes pay to access content, or donate to support a nonprofit publisher, or watch an ad. Those things can be annoying but pay for the content to be produced in the first place. If we strip them away, there’s no writing, information, or expression for the app to summarize. A world where everyone uses an app like this is a death spiral to an information desert.

I guess what I’m saying is: thanks, I hate it. Give me context; give me nuance; give me the ability to think for myself. We built the world’s most incredible communication and knowledge-sharing medium, rich with diverse perspectives and alternative ideas; let’s not sanitize it through a banal filter that is designed to strip it of its humanity.

· Posts · Share this post

 

Every day I walk back from daycare with an empty stroller, its toddler cocoon open like a burst egg sac, and everyone looks at me like I’m either processing some intense grief or I’ve let something terrible out into the world.

· Statuses · Share this post

 

Why You’ve Never Been In A Plane Crash

A really great piece about blameless postmortems and how the psychological safety to tell the truth leads to fewer mistakes and - in the case of the aviation industry - fewer lives lost.

"It’s often much more productive to ask why than to ask who. [...] A just organizational culture recognizes that a high level of operational safety can be achieved only when the root causes of human error are examined; who made a mistake is far less important than why it was made."

Exactly!

[Link]

· Links · Share this post

 

Following lawsuit, rep admits “AI” George Carlin was human-written

Simon Willison called this, and it makes sense: the George Carlin AI special was human-written, because that's the only way it could possibly have happened.

It's a parlor trick; a bit. It's also a kind of advertising for AI: even as you're horrified at the idea of creating a kind of resurrected George Carlin against his will, you've accepted the idea that it was technically possible. It isn't.

Unfortunately for the folks behind the special, it's still harmful to Carlin's legacy, and putting his name on it in order to gain attention is still a problem. We'll see how the lawsuit shakes out.

[Link]

· Links · Share this post

 

The indieweb is for everyone

Hands joining together

Tantek Çelik has posted a lovely encapsulation of the indieweb:

The indieweb is for everyone, everyone who wants to be part of the world-wide-web of interconnected people. The social internet of people, a network of networks of people, connected peer-to-peer in human-scale groups, communities of locality and affinity.

This complements the more technical description on the indieweb homepage:

The IndieWeb is a community of independent and personal websites connected by open standards, based on the principles of: owning your domain and using it as your primary online identity, publishing on your own site first (optionally elsewhere), and owning your content.

I first came across the indieweb movement when I’d just moved to California. Tantek, Kevin Marks, Aaron Parecki, Amber Case, and a band of independent developers and designers were actively working to help people own their own websites again, at a time when a lot of people were questioning why you wouldn’t just post on Twitter and Facebook. They gathered at IndieWebCamps in Portland, and at Homebrew Website Club in San Francisco.

One could look at the movement as a kind of throwback to the very early web, which was a tapestry of wildly different sites and ideas, at a time when everybody’s online communications were templated through web services owned by a handful of billion dollar corporations. I’d prefer to think of it as a manifesto for diversity of communications, the freedom to share your knowledge and lived experiences on your own terms, and maintaining the independence of freedom of expression from business interests.

A decade and change later, the web landscape looks very different. It’s now clear to just about everyone that it’s harmful for all of our information to be filtered through a handful of services. From the Cambridge Analytica scandal through Facebook’s culpability in the genocide against the Rohingya people in Myanmar, it’s clear that allowing private businesses to own and control most of the ways we learn about the world around us is dangerous. And the examples keep piling up, story after story after story.

While these events have highlighted the dangers, the indieweb community has been highlighting the possibilities. The movement itself has gone from strength to strength: IndieWebCamps and Homebrew Website Clubs are now held all over the world. I’ve never made it to one of the European events – to my shame, it’s been years since I’ve even been able to make it to a US event – but the community is thriving and the outcomes have been productive.

Even before the advent of the fediverse, the indieweb community had built tools to allow websites to connect to each other as a kind of independent, decentralized social web. Webmention, in conjunction with lightweight microformats that extended HTML to provide semantic hints about the purpose of content on a website, allowed anyone to reply to any website article using a post on their own site – not just that, but they could RSVP to events, send a “like”, reshare it, or use verbs that don’t have analogues in the traditional social networks. The community also created Micropub, a simple API that makes it easy to build tools to help people publish to their websites, and a handful of other technologies that are becoming more and more commonplace.
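To give a flavor of how lightweight Webmention is, here’s a minimal sketch of the sending side in Python. It’s illustrative rather than any official implementation: the function names are my own, it assumes the third-party requests library, and it uses a deliberately naive regex where a production sender would use a real HTML parser and handle more discovery edge cases.

```python
# A minimal, illustrative Webmention sender (not a reference
# implementation). Assumes the third-party `requests` library.
import re
from urllib.parse import urljoin

import requests


def discover_endpoint(target: str) -> str | None:
    """Find the Webmention endpoint advertised by the target page."""
    resp = requests.get(target, timeout=10)

    # 1. Prefer the HTTP Link header: Link: <...>; rel="webmention"
    if "webmention" in resp.links:
        return urljoin(resp.url, resp.links["webmention"]["url"])

    # 2. Fall back to a <link> or <a> tag with rel="webmention" in the
    # HTML. (A real sender would use a proper HTML parser; this regex
    # is a simplification that assumes rel appears before href.)
    match = re.search(
        r'<(?:link|a)\s[^>]*rel=["\']?(?:[^"\'>]*\s)?webmention[^>]*href=["\']([^"\']*)',
        resp.text,
        re.IGNORECASE,
    )
    return urljoin(resp.url, match.group(1)) if match else None


def send_webmention(source: str, target: str) -> bool:
    """Tell `target` that `source` links to it. Returns True on acceptance."""
    endpoint = discover_endpoint(target)
    if endpoint is None:
        return False  # the target doesn't accept Webmentions
    resp = requests.post(endpoint, data={"source": source, "target": target})
    return resp.status_code in (200, 201, 202)  # 201/202 = queued


# Usage: after publishing a reply on your own site, notify the original post.
# send_webmention("https://example.com/my-reply", "https://example.org/their-post")
```

On the receiving end, the endpoint is expected to fetch the source and verify that it really does link to the target before accepting the mention — which is what keeps the system spam-resistant without any central authority.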

In the wake of the decline of Twitter, Google’s turn towards an AI-driven erosion of the web, and a splintering of social media, many publishers have realized that they need to build stronger, more direct relationships with their communities, and that they can’t trust social media companies to be the center of gravity of their brands and networks. For them, owning their own website has regained its importance, together with building unique experiences that help differentiate them, and allow them to publish stories on their own terms. These are truly indieweb principles, and serve as validation (if validation were needed) of the indieweb movement’s foundational assumptions.

But ultimately it’s not about business, or technology, or any one technique or facet of website-building. As Tantek says, it’s about building a social internet of people: a human network of gloriously diverse lived experiences, creative modes of expression, community affinities, and personalities. The internet has always been made of people, but it has not always been people-first. The indieweb reminds us that humanity is the most important thing, and that nobody should own our ability to connect, form relationships, express ourselves, be creative, learn from each other, and embrace our differences and similarities.

I’m deeply glad it exists.

 

Also posted on IndieNews

· Posts · Share this post

 

The War on Gaza, by Joe Sacco

Joe Sacco, the graphic journalist who wrote Palestine, Footnotes in Gaza, and Safe Area Gorazde, has started a new series, The War on Gaza.

It's accompanied by this statement from Fantagraphics:

"We want to state clearly and emphatically that we stand with the innocent people of Gaza. At the same time, we emphatically condemn the massacre of innocent Israeli civilians by Hamas on October 7 as a war crime and acknowledge with deep regret the grief and trauma Jewish people are enduring in its aftermath; but this barbarous act does not warrant Israel to commit its own war crime and to inflict exponentially greater grief and trauma in return."

[Link]

· Links · Share this post