Our devices are working against us.
Recently, we learned that Volkswagen was falsifying its mandatory E.P.A. emissions tests. Because each test has a set of characteristics that don’t accurately match real-world driving conditions, the internal software running in 11 million cars could deduce that an official test was taking place. Under these conditions, the cars would turn on enhanced emissions controls, which allowed them to pass the tests but reduced mileage and other driver-friendly features. In the real world, the cars had better mileage and acceleration, but were spewing illegal levels of pollution.
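To see how simple such a "defeat device" can be in principle, consider the following sketch. It is purely hypothetical: every signal name and threshold here is invented for illustration, and it bears no relation to Volkswagen’s actual (closed) code. The idea is just that laboratory tests leave fingerprints a car’s software can recognize, such as driven wheels spinning while the others sit still, an untouched steering wheel, and a climate-controlled room.

```python
# Hypothetical sketch of defeat-device logic. All signals and thresholds
# are invented; this is an illustration, not any manufacturer's code.

def looks_like_dyno_test(front_wheel_kmh, rear_wheel_kmh,
                         steering_deg, ambient_c):
    """Heuristically guess whether the car is on a two-wheel dynamometer.

    On a dyno, the driven (front) wheels spin while the rear wheels are
    stationary, nobody steers, and the lab temperature is controlled.
    """
    wheels_disagree = front_wheel_kmh > 20 and rear_wheel_kmh < 1
    no_steering = abs(steering_deg) < 2
    lab_climate = 20 <= ambient_c <= 30
    return wheels_disagree and no_steering and lab_climate

def choose_emissions_mode(readings):
    # Enable full emissions controls only when a test is suspected;
    # otherwise favor mileage and performance.
    if looks_like_dyno_test(**readings):
        return "full_emissions_controls"
    return "performance_mode"
```

A few dozen lines like these, buried in millions of lines of engine-control firmware, are effectively undetectable from the outside, which is exactly why black-box testing failed here.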
This was far from an isolated incident. Less than a week later, it emerged that some Mercedes, BMW and Peugeot vehicles were using up to 50% more fuel than laboratory testing suggested. Meanwhile, the average gap between tested carbon emissions and real-world performance is now 40%, up from 8% in 2001. And while Volkswagen’s software broke the law, detecting test conditions to cheat on results is a widespread practice that has become an open secret in the industry.
These practices extend far beyond cars: even our televisions are faking data. Recently, some Samsung TVs in Europe were found to use less electricity in laboratory tests than under real-world viewing conditions.
In each case, the software powering the device was unavailable for inspection. In a world where cars are heavily scrutinized to ensure passenger safety, regulators had almost no way to determine what the software was doing at all. According to the Alliance of Automobile Manufacturers, providing access to this source code would create “serious threats to safety or security.” Even the Environmental Protection Agency agreed, arguing that opening the source code could lead to consumers hacking their cars to achieve better performance.
Anyone who’s worked in computer security will know that this is a spurious argument. Obscuring source code doesn't make software safer: on the contrary, more scrutiny allows manufacturers to find flaws more quickly.
Earlier this year, Wired demonstrated that hackers could remotely kill the brakes on a Jeep from a laptop in a house ten miles away from the vehicle. The same hackers had previously demonstrated that they could achieve complete control over a Ford Escape and a Toyota Prius: these vulnerabilities appear to be widespread and not limited to any particular manufacturer.
In light of these exploits, it’s extremely likely that car owners are already cheating tests by hacking their vehicles, without E.P.A. or manufacturer knowledge. Indeed, a cursory Google search reveals hackers talking about cheating on their emissions tests using Arduinos and other devices. To quote one participant in one of these forums: “I 100% believe that these tests are a complete joke/scam and therefore should be cheated with any and all available means.”
In a world where cars are increasingly driven by complex software, the only reliable way to test them is to inspect their source code. That is true today, in the wake of Volkswagen’s falsified E.P.A. tests, and it will only become more crucial as autonomous vehicles spread.
McKinsey & Co. predicts that autonomous vehicles will be widespread in around 15 years. The consequences of hacking your vehicle today are largely environmental: not something to discount, but not life-threatening except in aggregate. Once autonomous vehicles are commonplace, your car’s software will be the only thing keeping you from crashing into another family on the interstate.
Autonomous vehicles will eliminate an enormous percentage of car accidents, and we should not fear their introduction. However, hackers will certainly attempt to alter the software running their vehicles in order to go faster, impress their friends, perform stunts, and so on. If the software infrastructure inside a vehicle is kept secret from regulators, only the manufacturer will have any way of determining if this has taken place.
More pressingly, it’s now become apparent that manufacturers can’t be trusted to protect our interests. Even if it is impossible to hack an autonomous vehicle – which would be hard to believe – we need to ensure that the algorithms and software that power these products are as rigorously testable as the steel and rubber we sit in.
Opening source code to scrutiny does not limit its owner’s intellectual property rights. It also doesn’t prevent a manufacturer from making a profit or protecting their unique inventions. It does, however, allow us to trust their products. This is important for all connected devices, but cars are uniquely life-threatening when misused.
Legislators should act to protect our safety. In the same way that seatbelts and other safety measures were made mandatory, the source code that powers modern vehicles should be made available both to regulators and the public. Security through obscurity is no security at all.
Help is available: we’ve been doing this in the software industry for decades. The open source community should help manufacturers build more open software while retaining their intellectual property.
Open software is in the public interest, particularly when lives are at stake.