Technology Is Making Us Stupid
Artwork by Katie Horwich


In Nicholas Carr's The Glass Cage, the Pulitzer-shortlisted author makes a compelling argument about the ways technology has failed us—from aircraft autopilot to GPS maps—and the perils of being forever trapped in the beam of our smartphones.

Face it, you're an addict.

You can't go a minute without checking your iMessages, stalking that friend-of-a-friend you like on Facebook, or seeing who's favorited that tweet about the hilarious and highly embellished scenario involving an old lady and a dog you saw on your way to work this morning. What's weirder is that you can't remember how you got to this point. One day you're slotting the Jurassic Park VHS into your Panasonic NV-HV60 in the solitude of your living room; the next you're live-streaming your sordid little sex parties to the entire planet via the wonders of Google Glass.

We live in a world where you simply don't need to think to do anything. Opinions, facts, and locations are just an internet search away. It's a real-life dystopia where stumbling across absolute mindless shit like this is more likely than actually speaking to a real-life person on the telephone. Everything we do, online and off, is recorded, broadcast, and tracked, for the gain of world-devouring corporations who want to turn you into a battery for their gargantuan vibrator.

The Glass Cage, written by Nicholas Carr ("one of the most lucid, thoughtful and necessary thinkers alive," according to Jonathan Safran Foer), is all about this. Exploring our increasingly needy relationship with automation technology (think: Google for every question, GPS to find your own flat, taxis that drive themselves), the book presents some deeply worrying repercussions—like losing the skills to fly a plane. Or a future where everyone owns a machine that makes the perfect spaghetti Bolognese but nobody knows how to cook it themselves (I may have made that one up).

I tracked Carr down for a conversation about how we got so damn dependent and what on earth we can do to stop it.

VICE: Isn't the internet just so naughty?
Nicholas Carr: It puts convenience and expedience above everything else, but it's not naughty.

What's the effect of that?
I think it's manipulating us into becoming more impatient. We've become less capable of screening out distraction, despite becoming better at gathering information.

Is the way we interact with computers affecting the way we think?
If you're interacting with them all the time, then your brain will optimize itself for gathering and following information and stimuli through computers. It'll also begin to lose some of the subtler functions we use to interact with the actual world.

What's technology ever done to you?
What inspired the book is the recognition that in very short order we've come to rely on computers to do, well, most of the things we actually do. Human history is about striking a balance between what we look to our tools to do and what we do ourselves. Computers are involved in our lives at a level beyond any other technology that has come before.

Do you draw a line between the physical and virtual worlds?
There's an interesting experiment done on mice, actually.

God damn mice, always on their smartphones.
The scientists monitored the brains of mice as they navigated a real maze and then a computer simulation of one. What they found is that much less of the mouse's brain was active when it went through the simulation.

Where do you stand on apps like Freedom or Moment that are on your computer and your phone but are designed primarily to prevent you from using your computer or your phone?
It's looking to software to solve a problem that software has created. Our own willpower is somehow insufficient to deal with this.

We're all idiots, basically.
But on the other hand, I find it encouraging, because it reveals that there are people out there shaking their heads at their monitors, thinking, Jeez, this is going too far.

Isn't it human nature for us to invent tools to make our lives easier? Aren't we all, deep inside, a big fat guy on the great, flea-infested sofa of life?
There's an interesting theory on this called the paradox of work.

What is that?
Deep in our psyche there's a conflict: we think we'll be happier and more fulfilled if we have nothing to do, so we have an impulse to do less. But in reality that makes us miserable. It turns out that we're most satisfied when we're actually working.

Oh.
That explains, on a personal level, why we're so quick to kick back and rely on technology. Airplanes are a good example of why this is bad. There's the potential for an erosion of skill. When flight disasters occur we blame them on human error, but really it's human error provoked by an over-dependence on technology.

Right. But isn't being able to compress information and fire it out of the social media cannon to like-minded people all over the world pretty amazing?
My problem isn't with social media at all—it's that the more dependent we become on the internet to collect information, the more we train ourselves to take in information only in this manner. Ultimately, the computer reflects a perspective on human thought that resembles the way computers themselves work, which is all about the utilitarian processing of information.

So we're all becoming computers. This is upsetting me, Nick.
No, I think that most people on this planet would make pretty rubbish computers, thank goodness. What I fear is that, instead of keeping a nice balance, we're simply saying, "Let's just let the computer do it."

We now run the gamut from being able to design our own Domino's pizza on an app to Google declaring they want to make an all-knowing AI. Should we be worried?
The problem remains that the algorithms we depend on are invisible to us. Because Google is quick, we think that's sufficient. We take for granted how it manipulates our information feed.

There's this assumption that we use the technologies we use because they're the best available. But this isn't true. There are all sorts of interests—economic, social, and even military—that dictate what we end up getting.

At the beginning of automation, there were these two alternatives: one, where you create software from on high and push it down on the worker, and two, where you recognize that the worker has rich skills and intuition so you let them do the programming. Guess which we chose.

Are we all a bit naïve about the intent of a company like Google, do you think?
To a very large extent, the motivations of these companies are economic, and it's in their economic interest to keep us distracted, scrolling, and looking at ads. The faster they can get you to move from page to page, the better for their economics, because they're gathering information on you and showing you more ads.

But if their ultimate aim is to create an AI, and they're funding this by creating a system that drives us to distraction and instant gratification and then records it, what it's actually done is create an idea of man that is at least half wrong, right?
Well, now you're onto something that seems absolutely right but goes beyond what I was thinking about. The AI's understanding will be a very narrow portrait that reflects the economic interest of the creator's company and also the way we behave when we're online.

So what we're actually going to create is a system that provides us constantly with our lowest desires—porn, pizza, pop culture. What is the likelihood that a computer at Google knows my porn preferences?
Excruciatingly high.

Should we be scared of the future?
I think we should be worried about the future. We are putting ourselves passively into the hands of those who design the systems. We need to think critically about that, even as we maintain our enthusiasm for the great inventions that are happening. I'm not a Luddite. I'm not saying we should trash our laptops and run off to the woods.

We're basically living out Freud's death drive, trying our best to turn ourselves into inorganic lumps.
Even before Freud, Marx made the point that the underlying desire of technology seemed to be to create animate technology and inanimate humans. If you look at the original radios, they were transmission as well as reception devices, but before long most people just stopped transmitting and started listening.

Are we destined to invent technology that will ultimately kill us?
I'd resist that idea. The way I end The Glass Cage is by looking at technology that deepens our relationship with the world. Like the telescope, or a scythe. We're horrible grass cutters without a scythe. It's a hell of a tool. But then there's the clock…

Please don't say the clock is evil.
It's a double-edged sword. I'm always wary of pointing out the negatives of this, because I worry I'll be categorized as this person who is against clocks.

I promise that won't happen.
OK. Right. Clocks transformed time from a flow into precisely measurable units, which was absolutely essential for industrialization. But it does mean that, when you become completely acclimated to the clock, you lose that sense of being part of the natural flow of time. We become a little more routinized: we get up at X o'clock, we go to work at Y o'clock, we go to bed at Z o'clock.

OK. Are there any good modern inventions?
Er.

None at all?
A ladder? They're good.

I'm not going to be the guy who says ladders are bullshit. But I wouldn't say they're modern. What about a pogo stick?
I'd consider that alongside the telescope, sure. But if someone spent all day pogoing, then it would also have negative consequences. Something like a DJ mixset, a tool to create art. Digital recording technology, too.

Is quiet contemplation due for a comeback?
I hope so.

Follow David on Twitter.