The theme of Augmented World Expo 2015 was "Superpowers to the People," and from the banners hanging in the Santa Clara Convention Center displaying comic book-style silhouettes of heroic men and women with futuristic goggles, you'd think that augmented reality could give you the ability to fly or shoot lasers out of your eyes.
It won't, not in real life at least, but via varying combinations of head-mounted cameras, motion sensors, and personal displays, augmented reality (AR) could give a nurse the ability to see a patient's veins before taking blood, which is kind of like X-ray vision.
"Because of you, we're enabling people to practice surgeries before they go in to perform them for the first time, significantly decreasing their chance of error," said Ori Inbar, the CEO of AugmentedReality.org, which operates the event, often shortened to AWE.
Inbar was addressing the 208 speakers, 102 exhibitors, and nearly 3,000 attendees gathered in Hall B for his opening keynote and a complimentary breakfast of West Coast bagels and suspect fruit. Inbar referred to the speakers and exhibitors as The Avengers.
"Ultimately, everything you're doing is allowing us to interact with computers in a human way so I want to thank you guys for that," Inbar said. "You're really bringing superpowers to the people."
In a session about designing AR for real world interaction, Sam Haddad, an engineering manager from Design Interactive, showed the audience how the company was using Canon's MREAL mixed reality headset to train people to put out a fire on a submarine. Previously, this training was done in front of a monitor with a joystick.
With MREAL, and a special room designed with pipes mimicking those on a submarine, you could practice putting out fires with a hose and shutting off all the right valves. The hose and the pipes are there so you're interacting with physical objects, gaining muscle memory, and getting accustomed to a stressful situation, but the fire and water aren't.
At another session, DAQRI, which makes a smart helmet designed specifically for industrial applications, showed how a connected worker could solve problems she wasn't necessarily trained for. In a concept video that DAQRI's VP of product Matt Kammerait insisted the company could achieve, I saw how the smart helmet overlaid a virtual interface over a series of pipes, instructing the worker through a number of steps which had her turning valves and replacing a filter.
Basically, if your pipes need attention, augmented reality has the hook up, though the same principle applies to almost any kind of job—construction, healthcare, oil and gas, education, whatever. Augmented reality can train you for the job or help you do it better. Smart glasses will help you see things you couldn't see with your boring human eyes, or fetch the information you don't have from someone smarter.
Before playing DAQRI's concept video, Kammerait joked that all AR companies should sign a treaty agreeing not to show concept videos of anything they can't currently accomplish. Virtual reality has a similar problem with concept videos. The first-person perspective is intrinsic to both AR and VR, and the videos often show what companies aspire to rather than what they can achieve today.
Meghan Athavale, CEO at Lumo Play, a startup that makes the kind of large, interactive projections you see at the mall, gave a talk on how these videos need to walk the line between what is possible and science fiction. The rule, she said, is that concept videos should show what the company knows it can do, but hasn't yet.
"It's been our responsibility as designers, engineers, and programmers to take what's converging in technology and turn it into a vision that people can adopt and see. That's really difficult," Athavale said. "I don't know if it's offensive that I call people who don't really see that vision Muggles [a pejorative for non-magic users in _Harry Potter_], but that's kind of how I see them."
After trying on a number of hugely embarrassing headsets and smart glasses on the AWE show floor, I was ready to call myself a Muggle.
Oliver Aalami, a vascular surgeon, showed me how the ODG R-6 glasses could help a doctor in the operating room. The glasses, which look like a large pair of Oakleys, showed me an X-ray of the chest cavity when I looked down at a dummy patient, and three monitors with vital signs when I looked up. But the images didn't always appear where and when they were supposed to, forcing me to tilt my head awkwardly to see them.
The device also runs noticeably hot from all that processing power, and I know from watching television that brow-sweat is a big issue for surgeons, but Aalami insists that the R-6 is a godsend. "We're used to wearing crazy headgear with lights and loupes, so why not make it smart?" he said.
I had similarly uncomfortable demos with Epson's Moverio smart glasses, which support a portable touchpad that behaves like a mouse in augmented reality, and Sony's SmartEyeglass, which walked me through some more pipe maintenance. The Lumus DK-32 used a smartphone's screen as a mouse and allowed me to experience the nightmare scenario in which all I have to do to check Twitter is look up and to the right. The Meta AR eyewear allowed me to interact with three-dimensional objects floating in front of me with hand gestures à la Minority Report.
Some are better than others, but the results were generally the same: not very good, and light-years behind whatever bullshit concept video the company was showing. The devices felt weird on my head; the image was annoying in the periphery and not clear enough when I tried to focus on it. The virtual interrupted the physical and vice versa.
That the conference itself felt haphazard didn't help. Sessions were rescheduled, rooms were switched, and A/V equipment failed. On the show floor, Beam Pro units, which allow people to attend the conference remotely via a screen and camera on wheels, struggled to maneuver over power cables. Most of the smart glasses, I noted, weren't cleaned between demos. At some point, a 40-inch flat screen fell off a booth wall and landed on its face. On multiple occasions, people talked on the phone at full volume during sessions, not 10 feet away from the speakers.
The entire event started to seem delusional, but then I put on Technical Illusions' CastAR. It worked. The headset fit comfortably over my pair of prescription glasses, and when I looked down at a table I saw a cascading tower of virtual blocks. Someone handed me a Wii-like controller, which in my augmented reality appeared like a lightsaber I could use to push the blocks around.
As we explained shortly after CastAR exploded on Kickstarter, the glasses are fitted with a mini-projector fixed above each lens and a tiny camera in the middle. The surface, which you can roll out on the floor or a table, has RFID coils underneath it that allow the camera to scan and track your head position, so the hologram being projected to your eyes makes sense with the physical world you're looking at.
Watch the parts of this CastAR video where users interact with virtual objects on a desk. That's not far off from what I saw. I don't think I want to play games that way, but it was impressive.
CastAR also won the best smart glasses Auggie Award at AWE. That's not surprising given that the company was founded by Jeri Ellsworth and Rick Johnson, both of whom previously worked at the industry-leading video game developer Valve. Ellsworth founded Valve's research and development department, which will give us the Vive virtual reality headset later this year. When Valve lost interest in AR, it generously allowed the pair to take CastAR with them.
If CastAR can pull it off, others can follow. The question is, what do they do with it, besides tell you which pipes to fiddle with?
AR could also identify your friends' faces and tell you that it's their birthday so it'll seem like you care, a greatly under-appreciated superpower I wish I had during several AWE sessions. But while AR is a powerful idea, the average person doesn't seem to care, at least not as much as the speakers and exhibitors at AWE suggest they should.
Apple's WWDC was held in San Francisco, which is one of the most beautiful and opulent cities in America. AWE opened its doors on the same day in Santa Clara, which is neither. WWDC is more glamorous because what's announced there, mundane as it is, ultimately trickles down to a huge consumer market.
Inbar's research shows that around 1 million smart glasses will ship to tech enthusiasts by the end of 2015. He predicts a "major shake up" in 2017 that will eventually lead to more than 1 billion shipments around 2020. Apple alone, by comparison, has sold 700 million iPhones to date, so something drastic has to happen in AR for it to catch up.
An augmented reality headset that can help someone on an oil rig from losing his limbs in some piece of machinery might be useful and commendable, but it's not as profitable as an iPhone, a device we all want, need, or use anyway despite much protest. It does porn, games, and email. It's a part of our lives. It's the kind of device every tech company, including Apple itself, is trying to make again.
Google, which haunts AWE with its absence, tried making such a device with Google Glass, and it was promptly booed out of town. There are two main reasons. The first is that you look like a goddamn tool wearing any of these devices. The second is that augmented reality devices, at least the kind you wear and would theoretically use regularly, rely on an out-facing camera.
We're a culture of selfies and personal brands, we yearn for body cameras as an easy fix for police brutality, but we're paradoxically offended by the idea that someone wearing Google Glass could be recording us. We'll physically assault these glassholes.
"The face is sacred real estate," Optinvent CEO Kayvan Mirza said. "We're not going to put batteries and micro processors and displays on and walk around in public."
Mirza thinks that Optinvent's upcoming Ora X Smart AR Headset solves both problems by escaping the "paradigm prison" of smart glasses. The Ora X is shaped like a pair of bulky headphones, which are socially acceptable in his target market of 18-to-35-year-olds, and the intrusive camera and personal display that fits over the eye can be tucked away by flipping it up, much like a microphone on a gaming headset.
Most of the speakers at AWE skipped the question of whether AR will ever hit the mainstream and jumped straight to the moral implications that'll come up when it does.
Avi Bar-Zeev, who worked on the initial prototypes for Microsoft's AR device HoloLens, worried about how companies will use devices that could so easily monitor what we're looking at, what we're doing, even how we're feeling.
"We're going to have face tracking, gesture tracking, emotional sensing through pulse, pupil dilation, EEG, all these things are super cheap," Bar-Zeev said. "There are brain sensing devices already out there. They're intended for good, but depending on who's using them and what the purpose is, they could be used to exploit us, even to enslave us."
Mark Billinghurst, another pioneer in the field of AR who also spent five months working on Google Glass, pleaded with the audience to think beyond games and marketing, which according to him are already dominating revenue in the AR space.
"The face is sacred real estate."
"You guys are The Avengers," Inbar told the audience during his opening keynote. "You are leading this movement, and you know how the saying goes. 'With great power comes great responsibility.' I have one question for you: are you going to use these superpowers for good?"
After a long pause, somewhere in the back of Hall B, a single voice answered weakly, barely audible over the hum emanating from the line of Beam Pro units behind me.