Tech by VICE

Uber Fingerprinting Users Shows the Danger of Thinking All Technology Is Magic

When "always be hustling" catches up with you.

by Alasdair Allan
Apr 24 2017, 5:30pm

Image: igorstevanovic/Shutterstock

The thing that surprised me about the latest scandal brewing around Uber is that anybody is surprised. Accused of "fingerprinting" phones—assigning a persistent identity to the hardware and then associating this with a user of its service—its real crime is the attempt to disguise the practice from Apple using geo-fencing. Because the only reason Apple has rules about fingerprinting phones is that, in the past, it was far more commonplace than you may have realized.
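The geo-fencing trick is simple in principle: check the device's location, and if it falls within a radius of a sensitive area, switch the questionable code path off. A minimal Python sketch of the idea follows; the coordinates (Apple's Cupertino headquarters) and the radius are purely illustrative, since the actual boundary Uber reportedly drew has never been published.

```python
import math

# Illustrative values only: the real center and radius Uber used are not public.
APPLE_HQ = (37.3318, -122.0312)  # approximate Cupertino coordinates
GEOFENCE_RADIUS_KM = 25.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def fingerprinting_enabled(lat, lon):
    """Only run the fingerprinting code path outside the geofenced area."""
    return haversine_km(lat, lon, *APPLE_HQ) > GEOFENCE_RADIUS_KM
```

Run from inside the fence (say, Cupertino itself), `fingerprinting_enabled` returns `False` and the app behaves itself; run from almost anywhere else, it returns `True`. The fragility of the trick is equally obvious: an Apple reviewer testing the app from outside the fence sees everything.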

For the first few years of the iPhone's life Apple even provided a method call in their Software Development Kit (SDK)—the software used to build apps for the phone—to help developers map unique hardware addresses to real names and phone numbers. Apple did this because uniquely mapping users to specific hardware simplifies a lot of backend management for app developers.

This method survived in the SDK for a number of years, and when it was finally deprecated back in 2011 there was a huge rush by developers to figure out how to generate a unique hardware fingerprint via other methods. Apple even created a drop-in replacement method to create a unique identifier when an app started for the first time, but this identity wasn't unique to the hardware—if a user deleted the app, and then reinstalled, a different unique identity was generated—so developers hated it.
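Why developers hated the replacement is easy to demonstrate. The following toy Python model (the class and storage here are hypothetical stand-ins, not Apple's actual API) captures the behavior: the identifier lives in the app's own sandboxed storage, so it survives relaunches but not a delete-and-reinstall.

```python
import uuid

class AppInstall:
    """Toy model of one app install. The identifier is kept in the app's
    own sandbox, so deleting the app deletes the identifier with it."""

    def __init__(self):
        self._sandbox = {}  # stands in for the app's private storage

    def identifier(self):
        # Generate a random UUID on first launch, then reuse it thereafter.
        if "install_id" not in self._sandbox:
            self._sandbox["install_id"] = str(uuid.uuid4())
        return self._sandbox["install_id"]

# Same install: the identifier is stable across launches.
install = AppInstall()
first_id = install.identifier()
assert install.identifier() == first_id

# Delete and reinstall: the sandbox is gone, so a fresh identifier appears.
reinstall = AppInstall()
assert reinstall.identifier() != first_id
```

From a backend developer's point of view the second assertion is the problem: the "same" user on the same phone now looks like a brand-new person, which is exactly what a hardware fingerprint was meant to prevent.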

So the fact that Uber worked around Apple's rules doesn't surprise me in the slightest; considering the nature of its app, doing so probably simplified the company's life enormously. Not least because it wasn't, at least on the face of things, using the hack to track its users but to combat driver fraud in markets like China. Its hubris, and the reason Travis Kalanick got a personal slap on the wrist from Tim Cook, was trying to disguise it from Apple. If Uber had been more upfront about things it may well have gotten away with it. Anecdotally at least, it wouldn't have been the first time Apple had allowed "favored partners" to break the App Store rules.

But as average people grow more distant from the underlying mechanisms of the technology they use every day, explaining how that technology works becomes ever harder.

Most people aren't particularly aware of the amount of data that just leaks from their phones, to developers, and into the environment. I used to give a talk at big data conferences about what I call "migratory data": the hidden data you carry with you all the time, the slowly growing data sets on your movements, contacts, and social interactions generated by your phone. But as average people grow more distant from the underlying mechanisms of the technology they use every day, explaining how that technology works becomes ever harder. I've stopped giving the talk, because even for people working in technology, staying on top of how everything works has become a huge burden, one only alleviated by commoditization.

As an individual technology becomes a commodity, the number of people who know how it works decreases. The obvious example to point to here, one we're all used to, is the car. Back in the 1950s pretty much every teenager worked on their own car and knew how it worked "under the hood." Today, most teenagers don't, and due to rising insurance rates—and perhaps an awareness that self-driving cars are on the horizon—a lot of teenagers aren't even learning to drive any more. Those of us approaching our middle years in Generation X are probably the last with that particular dying skill. Being able to drive will soon go the way of being able to ride a horse: something you no longer need to know, because it's been hidden by technology.

You can see the same sort of commoditization in cloud computing. The ability to run your own servers is a dying skill set amongst technologists; it has been hidden away. If you need a server, you just spin up an EC2 instance, and with "serverless" computing becoming more popular, even the knowledge of how to build and deploy an EC2 instance will be hidden behind yet another layer of technology. The very name "serverless" shows how the underlying technology of servers has been encapsulated away from the end user. Of course there are servers, but most of us don't need to understand how they work any more.

This is how the modern world works: we build something, and then we commoditize it so that it can be used by non-experts. There really isn't any way to operate in today's society without this mechanism, but it makes systems fragile. Which is why projects like the Global Village Construction Set—a set of open source designs for all the manufacturing and agricultural tools you'd need to kickstart an industrial civilization—exist. Because if you dig deep enough, eventually we all run out of knowledge. Cars, servers, microchips: it just depends where your personal technology stack runs out.

Cloud computing has spawned a whole clutch of interesting startups and tools that couldn't have been built without it, but by definition they're all things that can be implemented on top of the cloud. It's interesting, then, to speculate about which technologies haven't arrived because it's hard, or even impossible, to implement them inside the framework of higher-level concepts that now forms the basis of understanding for most developers using cloud infrastructure.

If we lose sight of the underlying workings of technology, we limit our vision to the use cases envisioned when the wrappers around it were created.

If a developer doesn't understand how things work underneath, they'll use the tools as black boxes, and using tools in that fashion makes doing anything out of the ordinary, anything the expert who built the high-level tool didn't anticipate, almost impossible. If we lose sight of the underlying workings of technology, we limit our vision to the use cases envisioned when the wrappers around it were created.

You can do a lot of interesting things by shrugging off the underlying complexity and using the black boxes other people have built. But you can do entirely different interesting things when you fundamentally understand what's inside the boxes. The next level down. Because you can make the technology do things that people working at the black box level can't.

Of course, these days the level below the black box is usually another layer of black boxes; it's pretty much black boxes all the way down. For instance, it has now become impossible to design a modern microprocessor by hand; to do that, you need a computer. Think about that for a bit in the dead of night, and about how fragile it makes us as a society.

The modern world just wouldn't be possible without the commoditization of knowledge. But you should at least try to be aware of what you don't know, and a lot of people aren't. Which, to me, is the only thing the Uber story goes to prove.

Subscribe to Science Solved It, Motherboard's new show about the greatest mysteries that were solved by science.