 

Web 2.0 as a business model only works if the crowds making and categorizing information agree to be a part of the machine. Lately, they won't.

· Statuses · Share this post

 

Reddit communities to ‘go dark’ in protest over third-party app charges

“More than 3,000 subreddits have joined the protest, and will go ‘private’ on Monday, preventing anyone outside the community from seeing their posts.” Over 87% of subreddits joined the protest.

[Link]

· Links · Share this post

 

My first startup

Ben Werdmuller and Dave Tosh outside the JavaOne conference in San Francisco in 2008

I fell into startups by accident.

There were two paths I could have gone down at university. Next to an over-posed photo taken at a digital photo booth at a branch of Boots the Chemist in downtown Oxford, my high school yearbook declares that I’m likely to become a journalist. On the other hand, I’d taught myself to program, and then to write HTML, and the web seemed like an exciting medium to tell stories with.

I got into the Computer Science program at the University of Edinburgh. In England, at least at the time, you effectively picked your major two years before even going to university: at sixteen years old, you were asked to choose three or four A-level subjects that you’d study exclusively until the end of high school. In turn, those subjects would dictate which degrees you were allowed to apply for. I refused to filter myself in this way, and I was still in the early stages of trying to figure out who I was, let alone what I wanted to do — a tall order for any sixteen year old, let alone a third culture kid who felt physically and socially out of sorts with the world. Edinburgh was one of the few programs that didn’t see my spread across arts and sciences as a bad thing.

I was big and I hated it. I’d grown up to be well over six feet tall, and I wasn’t skinny in the way some tall people get to be. I couldn’t (and still can’t) catch myself in the mirror without cringing. I hated every aspect of my physicality in a way that I didn’t quite have words for, and that self-loathing translated to an overwhelming awkwardness in real life. It crossed the line into self-harm, both directly and indirectly. If I was always going to be this, what was the point in throwing myself into anything?

My mother had become a financial analyst in the telecoms industry — she’d studied the split-up of AT&T and the formation of the Baby Bells as a postgraduate at Oxford — and she saw the internet revolution coming. She tried out the emerging ISPs for work, and I devoured it. I gophered around the world. When we finally got Demon Internet at home, which turned every dialed-in user into a genuine node on its network, I discovered newsgroups, and realized I could communicate with people who were roughly my age without them ever seeing me. I could be myself freely. It radically changed my life. To this day, this is the part of the internet I really care about: not protocols or code, but the ability for people to be themselves and tell their stories. The opportunity for contexts to collide, relationships of all kinds to be built, and for learning to happen between people.

Of course, computer science has almost nothing to do with that. Edinburgh’s is a renowned program, particularly in conjunction with its AI school, and I’ve benefitted from it. But at the time, I was deeply disappointed with the focus on mathematics. The part of computing I cared about more than any other was the internet, and the internet was made of people more than it was any networking technology or algorithm. I’m still not sure why I didn’t change my degree (I’m also not sure if they would have let me). Having an honors degree in CS has helped my career, but I didn’t find the meaning in it that I’d hoped to. In retrospect, I wish I’d used my US citizenship to go to a liberal arts school, but at the time I didn’t have any interest in leaving the UK.

So I got distracted. Back in high school, I’d started a hypertext computer magazine called Spire that I distributed on various bulletin board systems. For my friends, it was a way that I could get them free review copies of games; for me, making something and putting it out there was a worthwhile project in itself. At university, I transitioned it to the web, buying my first domain name in the process and setting it up on Pair Networks hosting. It wasn’t particularly well-read, but the process of building and writing it made it worth it to me. I kept up my personal homepage; I wrote a blog; I continued to write on the newsgroups and hang out on Internet Relay Chat for hours.

Sometime towards the end of my degree, I accidentally wrote a meme that spread like wildfire across the blogs. I put it up on a Friday evening, and by Sunday it had almost a hundred thousand pageviews. I’ve written this story elsewhere, but to make a long story short: I built it into a satirical site that got millions of pageviews a day, I built a community that endures to this day, and through it all, I got a taste of how powerful the web could really be. It wasn’t commercial at all — in fact, it was militantly not — but that wasn’t the point. The point for me, as always, was to connect and feel a little bit more seen.

Edinburgh has a little bit more of a technology scene now, but when I graduated there was nothing. I looked for jobs that I might find interesting. A computer magazine was interested in hiring me as a reviewer, but the pay was abysmal: just £12,000 a year, and they really wanted me to move to London to do it. I didn’t see how you could possibly afford to take a job like that and live in London if you weren’t already rich, which I wasn’t. So I ended up getting a job back at the university, working to create an educational site for professional sports coaches as part of the sports science department.

They weren’t sure where to put me, so I wound up in a converted broom closet with a window that didn’t shut, right over the canteen kitchen. The room was freezing in winter and permanently smelled of chips. Worse, it already had someone in it: a PhD student called Dave who made no secret of the fact that he resented my being there. I’d been pre-announced to the learning technology folks as a “computer scientist”, so they all thought I was some hoity-toity egotist rather than an entry-level developer who had no real idea what he was doing.

Dave was angry a lot of the time and liked to talk about it when he wasn’t playing games on the BBC Sports website. He was studying educational technology, which hadn’t really been in use even when I was doing my degree. But through him I learned all about virtual learning environments like WebCT. Later, I transitioned from the sports science department into general e-learning development, where I got a more hands-on look and came to understand some of his frustration. The university was a pretty rigid environment, and the software was terrible. Students hated it; teachers hated it; administrators hated it; I’m not convinced that the people who wrote the software didn’t hate it. Platforms like WebCT and Blackboard, and even their open source counterpart, Moodle, were the worst: a terrible model for learning.

The web, on the other hand, was amazing. People were learning from each other all the time. It had already been changing my life for almost a decade, and now, through more accessible social media sites like LiveJournal, the benefits were spreading. Social media was informal learning, but learning nonetheless. All of this was already happening, but the actual learning technology products weren’t built with this understanding or intent. The internet had been so freeing for me — that release from my own physicality, the hooks and hangups that came from how I looked and felt in the real world — that I wished I’d had something with the same dynamics at university. I wished I’d been free there. I didn’t express this idea at the time, but that’s what drove me.

I suggested he start blogging his ideas. He was skeptical, but I somehow convinced him to start a blog — he gave it a very official-sounding name, the E-Portfolio Research and Development Community — and to both post and comment on someone else’s blog almost every day. It worked, and he started to be accepted into the worldwide e-portfolio community. There was obviously something here for education.

Dave and I decided to build something that did take the social web into account. First, we simply described it in a very short informal paper, and put it out on Dave’s blog. The response from the community was immediate: one very well-respected analyst called it “visionary”. Another sniffily commented that it was one thing to talk about it and another to build it — which, well. Game on.

We each built a prototype: Dave’s in Macromedia ColdFusion, mine in PHP. (Even then, I don’t know that these were the right technology choices.) I can’t remember what his was called, but I put mine on a domain name I’d bought so that I would have an official-looking email address to apply for jobs with, based on the town in Switzerland my dad’s family comes from. Of the two prototypes, we decided to go with Elgg.

We first tried to give it to the university. Dave’s supervisor ran learning technology at the time; he took it to a meeting, and the response I heard back was that “blogging is for teenage girls crying in their bedrooms.” For all these years, I’ve taken Dave’s word that this is what was said, although I’ve sometimes wondered if he just didn’t want to give it to them. Either way, it appalled me enough that I quit my job.

I moved back to Oxford and into my parents’ house. They’d moved back to California to take care of my grandmother, and their plan was to rent it out; in the end we rented out the other bedroom to a friend of mine and I was lucky to be able to live in it rent-free for six months while I figured everything out. This was a big burden on them: we didn’t have a lot of money, and while the house wasn’t exactly in a great neighborhood, renting half of it didn’t cover its costs. They essentially underwrote me while I wrote the first version.

And then I had to get a job. I became the webmaster at the University of Oxford’s Saïd Business School, where my job was to revamp the website to use a new CMS and a design that had been created by a prestigious firm in London. Instead, I very quickly became a startup resource inside the school. Students came to me to talk about their work, and would invite me to their seminars. Lecturers would ask me questions. I was allowed to attend an event called Silicon Valley Comes to Oxford, where people like Ev Williams (then CEO of Blogger), Reid Hoffman, and Craig Newmark would speak and share their experiences.

After kicking the tires for six months, we released Elgg as an open source project. Eventually, it was able to make enough money to employ me and Dave full-time, and I left to work on it. We were asked to help build the first version of MIT OpenCourseWare (which we eventually parted ways with), and consulted with a school district in upstate New York who wanted our expertise more than our software. But it was enough to get going with. My friendships at the Business School were so strong that I was allowed to come back the next year, with Dave alongside me. We asked Biz Stone to become an advisor, which he agreed to, and it felt like we were off to the races.

We had no idea what we were doing at any point, and we didn’t exactly get along. Our company was formed poorly; I was the CTO and Dave was the CEO because he’d looked me dead in the eye and said, “I’m going to put my foot down on this one.” I was still so unsure of myself and full of self-loathing that I just accepted it. Behind the scenes, we decided things together, and in some ways, the partnership worked; he had a kind of hubris that I lacked, and I understood the internet in a way that he didn’t. It helped that I could also build and write. At the same time, it didn’t make me feel good; Dave liked to tell people that we never would have been friends, which I think he meant as an odd-couple-style joke, but it was hurtful every time. When we were in Cambridge to speak to MIT about OpenCourseWare, he took me aside to tell me that when push came to shove, he would be looking out for himself, and that I should do the same. It wasn’t the way I liked to think or act; we came from different worlds. I’m sure he was similarly perturbed by me: this maladjusted nerd who seemed to care much more about writing than about operating in the real world.

Elgg didn’t make anyone rich, but it was successful in a way I’m still proud of. The original version had over 80 translations and was used all over the world, including by non-profits who used it to organize resource allocation. A revamped version with a stronger architecture was used by the anti-austerity movement in Spain, by Oxfam to train aid workers, and by the Canadian government as a sort of intranet.

After a few years of bootstrapping, working almost 24/7, we accepted a modest investment from some executives at a large international bank, who were getting into startups on the side. They really wanted us to get into the fintech market, specifically around hedge funds, and maybe they were right from a business perspective: there’s a lot of money there. But it wasn’t why I’d started working on it, and it wasn’t what I wanted to do. Dave was more enthusiastic, and between that and the fact that our relationship had broken down to being almost antagonistic every day, I decided to leave. The day I shut down my laptop for the last time, I felt almost weightless: for the first time in the best part of a decade, I had no commitments. It had been weighing on me hard. I was 30 years old now, somehow, and it felt like I was emerging from a dark cave, blinking into the sunlight.

No other working experience has been exactly the same. I know a lot more, for one: I wouldn’t make the same mistakes. But I also wouldn’t take the same risks, exactly because I know more. My naïvety brought a kind of propulsion of its own; like many founders, I was fueled by pure Dunning-Kruger effect. But at the same time, there were days when I was dancing on my chair because of something that had happened. The startup brought incredible highs — the kind that can only come from something you’ve created yourself — as well as deep lows that interacted horribly with my already damaged self-image. It made me feel like I was worth something after all, but also that I wasn’t. It was a rollercoaster. And yes, despite everything, I would do it again.

· Posts · Share this post

 

Pageboy: A Memoir, by Elliot Page

Raw, personal, and honest: a memoir of transition and survival by someone who has been in the public eye for most of his life but never really seen. There's no sanitized veneer to his writing, and my life is better for having read his story. I hope his life is better for having written it.

[Link]

· Links · Share this post

 

Nine out of 10 people are biased against women, says ‘alarming’ UN report

“At the current rate of progress it will take 186 years to close gaps in legal protections. It also explains why, while there has been some progress on enacting laws that advance women’s rights, social norms continue to be deeply entrenched and pervasive.”

[Link]

· Links · Share this post

 

Researchers discover that ChatGPT prefers repeating 25 jokes over and over

“When tested, ‘Over 90% of 1,008 generated jokes were the same 25 jokes.’” We have a lot in common.

[Link]

· Links · Share this post

 

Europe: Is compulsory military service coming back?

“After the collapse of communism and the end of the Cold War in Europe, many countries abolished compulsory military service. But in the wake of the war in Ukraine, several are considering bringing it back.”

[Link]

· Links · Share this post

 

The Risks of Staying Put

“I have to remember that my health is more important than my job. And the pain that you’re used to is still a pain you should run away from.”

[Link]

· Links · Share this post

 

Is GitHub Copilot Any Good?

“The code generated by Copilot is often wrong, but always subtly so, which means that when I let it fill in any non-trivial suggestion for me, I spend a considerable amount of time doing ‘code review’ on the code it emits.”

[Link]

· Links · Share this post

 

All this unmobilized love

“Even most of the emergent gestures in our interfaces are tweaks on tech-first features—@ symbols push Twitter to implement threading, hyperlinks eventually get automated into retweets, quote-tweets go on TikTok and become duets. “Swipe left to discard a person” is one of a handful of new gestures, and it’s ten years old.”

[Link]

· Links · Share this post

 

Democrats work to protect abortion, trans rights with combined laws

“Each of us has the freedom to determine our path in life, each of us has the right to make decisions about our medical care and our bodies without government interference.”

[Link]

· Links · Share this post

 

The Horror

“100% of trans people who seek access to gender affirming care as children and are denied go through the horror. 100% of trans children who never know that gender affirming care exists go through the horror. And for what?”

[Link]

· Links · Share this post

 

America’s Suburbs Are Breeding Grounds for Fascism

“Without a massive reorganization of American life—away from privatization, car-centrism, and hyper-individualism—it’s likely the suburban ideology will remain popular, and even grow.”

[Link]

· Links · Share this post

 

How the U.S. Almost Became a Nation of Hippo Ranchers

“Great Britain has eaten the Australian kangaroo and likes him, horseflesh is a staple in continental Europe, and the people of Central America eat the lizard. Why cannot Americans absorb the hippopotamus?”

[Link]

· Links · Share this post

 

Instagram’s upcoming Twitter competitor shown in leaked screenshots

“Cox said the company already has celebrities committed to using the app, including DJ Slime.” I am old.

[Link]

· Links · Share this post

 

Google Gets Stricter About Employees’ Time in Office

“Google will consider office attendance records in performance reviews and send reminders to employees with frequent absences, becoming the latest company to urge a return to in-person collaboration following an embrace of remote work during the pandemic.” This is wretched.

[Link]

· Links · Share this post

 

Apollo will close down on June 30th.

“Reddit’s recent decisions and actions have unfortunately made it impossible for Apollo to continue.” At this point, developers shouldn’t build their apps against commercial APIs. Open standards or nothing; the risk is too great.

[Link]

· Links · Share this post

 

Of Media & Monsters

“I have been in Silicon Valley long enough to see it transform from a group of outlier revolutionaries to play-safe career chasers. Recently, I have watched arrivistes who, if not in technology, would be running a penny stock brokerage based somewhere in Long Island or producing B-movies.”

[Link]

· Links · Share this post

 

When deepfakes are everywhere

A network spells out the words: deep fake

I’m soliciting prompts for discussion. This piece is a part of that series.

 

Ryan Barrett asks:

It seems like the last 200 years or so - when we could use recorded media, photographs, audio, videos, as evidence or proof of anything - may have been a brief, glorious aberration, a detour in the timeline. Barely a blink of an eye, relative to the full history of civilization. Nice while it lasted, maybe it’s over now.

What does that mean? If true, how will we adapt? What techniques for evidence and proof from the pre-recorded-media era will we return to? What new techniques will we find, or need?

I’ll start by asking: could we? Or to put it another way: have previous assumptions we might have made about the trustworthiness of recorded media been warranted?

One of the most famous users of photo editing to alter the historical record was Stalin, who often edited people he deemed to be enemies of the state out of photographs. Portraits of the leader that hung in people’s homes were retouched so that they were more to his liking.

A few years later, the artist Yves Klein staged photographs in which he appears to hurl himself off a building. Obviously, they weren’t real: his intent was to demonstrate that the theatre of the future could be an empty room; arguably an accurate demonstration of our present.

Later still, a photo of Obama shaking hands with the President of Iran circulated widely on Republican social media — despite the fact that the event never happened.

And there are so many more. As the Guardian wrote a few years ago about Photoshop:

In fact, the lesson of the earliest fake photos is that technology does not fool the human eye; it is the mind that does this. From scissors and glue to the latest software, the fabrication of an image only works because the viewer wants it to work. We see what we wish to see.

Sometimes, we didn’t even need trickery. President Roosevelt tried to hide his disability by having the Secret Service rip the film out of the camera of anyone who caught him in his wheelchair. Endless short men in the public eye — Tom Cruise, for example — have hidden their height on camera by standing on boxes or having their counterparts stand in a hole.

Of course, the latest deepfake technology and generative AI make it cheaper and easier to create this kind of impossible media. Although it’s not new, it will become more prolific and more naturalistic than ever before.

The Brookings Institution points out that in addition to the proliferation of disinformation, there will be two more adverse effects:

  • Exhaustion of critical thinking: “it will take more effort for individuals to ascertain whether information is true, especially when it does not come from trusted actors.”
  • Plausible deniability: accusations of impropriety will be more easily deflected.

Trusted actors, of course, are those we already know and rely on. Most people will not think the New York Times is faking its images. So another adverse effect will be the relative inability of new sources to be taken seriously — which will particularly hurt sources from disadvantaged or underrepresented groups. For the same reason, maintaining a list of “approved” sources that we can trust is not a real solution to this problem. Beyond censoring new and underrepresented voices, who could possibly maintain this kind of list reliably? And what would prevent them from interpreting factual data that they don’t like as disinformation?

Regarding plausible deniability, even without deepfakes, we’re already learning that many forensic evidence techniques were more limited than we were led to believe. Bite marks, hair comparisons, and blood spatter analysis, all commonly used in criminal cases, have been shown to have a limited scientific basis and to have often been misapplied. An artifact in itself is almost never enough to prove something to be true; we simply have to ask more questions.

Context is a useful tool here. If a public figure is shown to have said something, for example, are there other corroborating sources? Were there multiple independent eyewitnesses? Is any surrounding media drawn from this one artifact, or are there other, independent stories drawn from other, separately-recorded evidence?

So the real change will need to be with respect to source analysis. We’ve been trained to be consumers of information: to trust what’s on the page or on the screen. As I tried to explain at the beginning, that was always an approach that left us open to exploitation. There is no text that should not be questioned; no source that cannot be critically examined.

Generally, I think the Guardian’s observation holds true: we see what we wish to see. The truth will have plausible deniability. We will need more information.

To be sure, technology solutions are also useful, although it will be an arms race. Intel claims to have a deepfake detector that works with 96% accuracy — which will be true until the inferred blood flow signals it uses can also be accurately faked (if that hasn’t happened already). Researchers at the University of Florida experimented with detecting audio deepfakes by modeling the human vocal tract. Again, we can expect deepfake technology to improve to a level where it surpasses this detection — and regardless, we still have to worry about the impact of false positives. We also should worry about any incentive to recreate a situation where we unquestioningly accept a source.

As IEEE Spectrum noted:

Even if a quiver of detectors can take down deepfakes, the content will have at least a brief life online before it disappears. It will have an impact. […] Technology alone can’t save us. Instead, people need to be educated about the new, nonreality-filled reality.

We will need to use all the tools at our disposal — contextual, social, and technological — to determine whether something is a true record, representative of the truth, or an outright lie. We always had to do this, but most of us didn’t. Now technology has forced our hand.

· Posts · Share this post

 

How High We Go in the Dark, by Sequoia Nagamatsu

Not what I thought it was going to be. An early chapter was so heartbreaking that I thought I would have to abandon the book; it brought up feelings of loss I hadn’t felt since my mother died. I still don’t know if I appreciate the catharsis, but that’s what this book is: the author conjures how deeply we feel in the face of the worst horrors.

[Link]

· Links · Share this post

 

Fewer than a third of Americans believe local news holds public officials accountable, poll finds

“If the primary source of local news (for many people) is local television, it’s not a shock that less than a third of people would say they think local news is holding public officials accountable.”

[Link]

· Links · Share this post

 

The most used languages on the internet

“Millions of non-native English speakers and non-English speakers are stuck using the web in a language other than the one they were born into.”

[Link]

· Links · Share this post

 

Pride Month: In conversation with The 19th's LGBTQ+ reporters

“It’s hard for me to get excited about Pride Month as a concept this month, because we are in that place where … it feels to a lot of trans people like we are being threatened to the point of genocide.”

[Link]

· Links · Share this post

 

Climate Crisis Has Stranded 600 Million Outside Most Livable Environment

“Climate change is remapping where humans can exist on the planet. As optimum conditions shift away from the equator and toward the poles, more than 600 million people have already been stranded outside of a crucial environmental niche that scientists say best supports life.”

[Link]

· Links · Share this post

 

Leviathan Wakes: the case for Apple's Vision Pro

“Now we’ll get to answer the AR question with far fewer caveats and asterisks. The display is as good as technologically possible. The interface uses your fingers, instead of a goofy joystick. The performance is tuned to prevent motion sickness. An enormous developer community is ready and equipped to build apps for it, and all of their tools are mature, well-documented, and fully supported by that community.”

[Link]

· Links · Share this post