In October, Tesla brought joy to autonomous car enthusiasts with the announcement that from now on, all of its cars will ship with the hardware required to operate in full autopilot mode. Below the fold, but by no means hidden in this announcement, was a clause stating that "using a self-driving Tesla for car sharing and ride hailing for friends and family is fine, but doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year."
Or as Ars Technica helpfully translated: don't plan on using your autonomous Tesla to earn money with Uber or Lyft. Instead, Tesla vehicle owners will have to contend with whatever "Tesla Network" ridesharing option the automaker rolls out.
But what's the precedent here? There's no real analogue with conventional (which is to say mechanical) vehicles, which will drive wherever you point the steering wheel. But with modern cars running over 100 million lines of code, licensing issues which have previously been confined to PCs, music systems or games consoles are starting to have an impact on how and where we drive.
"It's not clear yet how Tesla intends to enforce its restrictions."
Bryant Walker Smith, assistant professor of law at the University of South Carolina, and a specialist in the emerging legal frameworks for autonomous vehicles, says that this is part of a shift by Tesla and other automakers towards framing the vehicles they make as services rather than products.
Digital rights management (DRM) controls like these were at the heart of a successful petition filed last year by the digital-rights advocacy group the Electronic Frontier Foundation before the US Librarian of Congress. The EFF sought an exemption allowing researchers to access and tinker with car software systems, circumventing the manufacturer's security settings, which would otherwise violate section 1201 of the Digital Millennium Copyright Act (DMCA), passed in 1998. The Librarian ended up granting a research exemption to this part of the law.
Kit Walsh, staff attorney for the EFF, frames this as part of a long trend of manufacturers trying to restrict unauthorised use of their hardware through a mixture of code and copyright law (as happened in the early 2000s, when printer maker Lexmark sued a company that reverse-engineered its toner cartridges). Ultimately it comes down to legal challenges such as Lexmark v. Static Control Components, or the EFF petition, to tip the scales back in favour of openness and consumer access.
"It's not clear yet how Tesla intends to enforce its restrictions," Walsh said, "But if they're codebase restrictions - if they write software that somehow is able to physically restrict you from using your car for certain lawful purposes - then without the exemption we won, it would be unlawful to even look at that software, let alone modify it to give you the freedoms you would otherwise enjoy."
Thanks to the terms of the section 1201 exemption, says Walsh, modifying car software is permissible so long as it is done for a lawful end – implying that if you're legally entitled to take part in a rideshare scheme, you have the right to override Tesla's diktat, in much the same way that you can legally root your phone.
More details on the legality of Tesla's proposed restrictions are sure to come to light when the Tesla Network becomes operational. In the meantime, the first person to jailbreak a car will certainly make for an interesting test case.
I reached out to Tesla via the company's press email for comment on this story, but I didn't receive any response in time for publication.