The Dark Side of Autonomous Vehicles?


Readers of this blog will know that I have a generally optimistic view of driverless cars, believing that a computer is likely to avoid many human frailties and generally reduce the number of crashes.

My viewpoint got a jolt this week. The technologist half of my brain has been enjoying the New Relic Future Talk series. While I didn’t catch it in person, this week I watched the video of Andrew Wilson (Intel’s open source compliance officer) discussing the potential role for open source software in autonomous vehicles.

Some of the potential problems he points out with software-driven vehicles:

  • Software updating – cars remain in service for many years. How do you install updates? Imagine if the car you're driving today were still running on Windows 95!
  • Security – Can the NSA (or even more malevolent actors) bug or sabotage your car by inserting code?

Andrew suggests that open source, with more transparent code, can address at least some of these issues. He also posits that manufacturers will have a strong incentive to collaborate, since most of the software stack will not be a source of competitive differentiation.

Will your next car run Windows or Linux?


6 responses to “The Dark Side of Autonomous Vehicles?”

  1. Many automobiles already carry plenty of firmware that might be engaged in unsavory reporting of activities, tracking, etc. (think of OnStar and similar technologies, or built-in navigation systems).

    One other potential issue: gearheads and others replacing the firmware to override technical restraints on the behavior of autonomous vehicles. Right now, there's a booming black market in "mod chips" for engine controllers that boost performance but cause the engine to no longer comply with environmental quality regulations. (There's also the phenomenon of "rolling coal.")

    Future autonomous car owners might modify their cars to override technical restraints on things like speed and such.

    • Mod chips were pretty big (at least in the San Diego area) even 10+ years ago. You just needed a car from the Windows 95 era or newer.

      When autonomous cars hit the market, they'll most likely only be available through a lease program until the bugs are worked out. Insurance will be a complication, so only drivers with a good insurance history will be eligible.

      That, or they'll just phase in the technology as many manufacturers have been doing. Self-driving isn't a single step but a slow series of baby steps along the way. Automatic parking and automatic braking are just two of the many steps toward a self-driving car.

      I do like the market position Daimler AG has taken with Car2Go. Having the branding and business model of point-to-point trips seems like a great way to walk intelligent travel systems into the market.

      A “Smart” car even just sounds like it should drive itself.

  2. I see none of this as an obstacle to autonomous vehicles. It's just fear-mongering. "Your car could KILL YOU! Tune in at 11."

    How is the software update any different than the need for an oil change? The mechanic will take care of it.

    Security is already a problem that needs to be addressed, and it's not specific to autonomous vehicles. Researchers are already hacking existing cars to do things like "suddenly engaging the brakes of the Prius, yanking its steering wheel, or causing it to accelerate. On the Escape, they can disable the brakes when the SUV is driving slowly."
    http://arstechnica.com/security/2013/07/disabling-a-cars-brakes-and-speed-by-hacking-its-computers-a-new-how-to/

    Environmental hacks by gear-heads are also not specific to autonomous vehicles. They’ve existed as long as environmental restrictions have.

  3. I don't know how big the security/privacy issues are, but it strikes me that they look a lot different depending on who owns the vehicle. If there's centralized ownership (with people temporarily renting them, like taxis), updates and security aren't so scary. The cars could just be pulled into the garage on a regular service cycle and checked for malware or unauthorized tampering. For in-service issues, cars could have a simple kill switch accessible to the passenger, like the emergency brake cords that run through trains: pull it and the car automatically pulls over to a safe spot, and a new vehicle is dispatched to take its place. Privacy expectations are different too when using someone else's vehicle vs. your own (in ways that people may not be quick to embrace).
