Illustration by Jacob Livengood
If you asked a stranger on the street to describe what the stock market looks like, most would probably mention a bunch of sweaty white-shirted types shouting and furiously gesticulating in a Wall Street trading pit. The more erudite might include references to retired rich people playing with their money over the computer or offices full of overworked geeks glued to multiscreen terminals.
These days the reality is that the average trader has no eyes, hands, or emotions; it has only the numbers. Financial markets the world over have been hijacked by robots or, more specifically, by algorithms that can scan data and trade stocks so quickly that their meat-brained creators often can’t keep up with what they’re doing.
High-frequency trading (HFT) accounted for about half of US stock-exchange trades in 2012—approximately 1.6 billion shares a day, according to estimates cited by Bloomberg Businessweek. In many ways, these algorithms mimic human traders, buying and selling stocks among themselves, though to make trades as quickly as possible, they are equipped with only the most rudimentary analytic tools. Unlike human traders, whose actions are often undergirded by real-world data like a company’s reported quarterly profits or losses, algorithms react only to real-time market movement, and some scientists and analysts now say that all this unsupervised activity might be a problem.
In September, researchers at the University of Miami published a paper that examined the effects of the widespread use of these narrowly focused algorithms. Using data gathered by Nanex, a company specializing in software that provides real-time market information to investors, the team looked at stock trades that occurred at time scales under a second, an interval at which only robots can act. They made a startling discovery: from January 2006 to February 2011, there were more than 18,000 spikes and crashes in individual stock prices that resolved themselves almost instantaneously and that have gone unnoticed until now.
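The kind of sub-second event hunting described above can be sketched in a few lines of code. This is a simplified, illustrative version, not the study's actual method: the thresholds below (a run of at least ten same-direction ticks, a price move above 0.8 percent, all within 1.5 seconds) approximate the criteria reported for the paper, and the tick format is a stand-in.

```python
# Simplified detector for ultrafast spikes/crashes ("fractures") in tick
# data. A tick is a (timestamp_seconds, price) pair. Thresholds are
# illustrative approximations of the criteria reported for the study.

MIN_TICKS = 10      # consecutive same-direction price changes
MIN_MOVE = 0.008    # total price change of at least 0.8 percent
MAX_SPAN = 1.5      # the whole run must fit inside 1.5 seconds

def find_fractures(ticks):
    """Return (start_index, end_index, kind) for each detected event."""
    events = []
    i = 0
    while i < len(ticks) - 1:
        t0, p0 = ticks[i]
        j = i
        direction = 0
        # Extend j while the price keeps moving in one direction.
        while j + 1 < len(ticks):
            dp = ticks[j + 1][1] - ticks[j][1]
            step = (dp > 0) - (dp < 0)
            if step == 0 or (direction and step != direction):
                break
            direction = step
            j += 1
        run = j - i                          # number of one-way ticks
        span = ticks[j][0] - t0              # elapsed time of the run
        move = abs(ticks[j][1] - p0) / p0    # fractional price change
        if run >= MIN_TICKS and move >= MIN_MOVE and span <= MAX_SPAN:
            events.append((i, j, "spike" if direction > 0 else "crash"))
        i = j if j > i else i + 1
    return events
```

For example, feeding it eleven consecutive downticks of 0.1 percent spread over about half a second would flag a single "crash"; a flat or slowly drifting price series produces nothing.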
Despite the market’s being able to right itself in milliseconds, these extreme fluctuations are “huge crashes,” according to Neil Johnson, the paper’s lead author.
“Not just 10 percent of a stock or 20 percent of a stock, but almost 100 percent of the value—within a second,” he said. “Even though they’re in it for themselves, [the robots] form into groups. You get this kind of mob behavior, where a whole bunch of them have exactly the same opinion at exactly the same moment. That’s why they kick in these huge spikes and crashes that you don’t see in the human world.”
Before the release of Johnson’s paper, titled “Abrupt Rise of New Machine Ecology Beyond Human Response Time,” even the companies that sent the bots out into the world were unaware of the almost imperceptible, ultrarapid downturns and upswings left in the wake of their trading decisions. While they trade much faster than humans, algorithms also share a weakness with us: groupthink. This influences not just individual stocks but occasionally entire markets—packs of robots with similar objectives competing against one another in the subsecond market sometimes start trading in a falling-domino-like fashion that can bubble up and manifest itself in the human world in a big way.
Hence the phenomenon of “flash crashes,” which are at least partly caused by HFT. The most famous one hit on May 6, 2010, when the Dow Jones lost 9 percent of its value in five minutes and then recovered most of it 20 minutes later. Other robot-related oddities include the “mysterious algorithm” that accounted for 4 percent of the entirety of quote traffic in the US stock market one week in October 2012 and the stupid, singular glitch that resulted in Knight Capital buying and selling $7 billion worth of shares on the New York Stock Exchange in 45 minutes on August 1, 2012, which led to the firm losing $440 million (40 percent of its value).
HFT critics have said that algorithms make markets more volatile, reduce profits for regular investors who compete against these machines, and have no social utility beyond making a bunch of very rich people richer. Italy, of all places, was the first country to pass tax legislation with the specific aim of curbing HFT. Wondering how such trends might influence the global economy, and if I should start digging an underground bunker in preparation for an algorithmic financial apocalypse, I got in touch with Professor Johnson.
VICE: How worried should we be that software is causing microspikes and crashes in the stock market?
Neil Johnson: Right—if every time I look at [the market] it’s doing OK, because it’s done something nasty when I blink, is it really OK? Well, we decided to track these ultrafast spikes and crashes through the 2008 crash everybody knows about, and we were quite amazed by what we found: not only did these spikes and crashes accelerate, they started escalating up to about a year or a year and a half before the crash. And if you look at the companies whose stocks had the most severe escalation of spikes and dips, it’s Morgan Stanley, Goldman Sachs, Wells Fargo, J. P. Morgan, Bank of America, Lehman Brothers…
Was it that these algorithms foresaw that these companies were in trouble, like animals before an earthquake, or is it possible that they contributed to the crisis?
What we believe is happening is that these algorithms, which have of course been written by humans but [operate independently], worked it out that there was a weakness in the financial sector almost a year beforehand. I’m not necessarily saying they caused it, but they certainly saw the gaps starting to appear and getting wider… So those mobs were then forming more and more frequently in the buildup to the crash.
That sounds convoluted—a chicken-or-egg-type scenario. But you’re saying the machine ecology definitely helps cause the flash crashes that occasionally pop up in the stock market at scales humans can perceive, correct?
We’ve checked around these flash crashes and, again, we do see a pickup of the ultrafast spikes and dips around those. So yes, I think it’s all part of the same thing. [Economists] always come up with some kind of good reason why a certain flash crash occurred; there’s a tendency to always point the finger at one object. But an ecology gets pulled down not because of one object but because of interactions and their collective behavior. If I go out onto the [highway] and I’m a car in a huge pileup, is it my fault? Yeah, maybe—I’m there. But so is everyone else.
What do you think of the HFT tax that Italy just passed? Will it be a problem if other countries adopt it?
I think that the tax idea is a bad one. It raises all sorts of questions, like who will be taxed? Entities that trade a stock of a company which is based in that country, or the owner of an algorithm who happens to have an office in that country, or owners of algorithms whose trade is implemented on an exchange in that country? And what happens when trading fully migrates to a “cloud” form in which no one really knows where anything “is”?
It’s been reported that you are looking at how your work could apply to cyberwarfare. Does this mean that you’re supposing similar algorithms could be applied to different sets of data and that this might be a major problem?
I think there’s a distinct possibility of algorithms being used as a weapon. Instead of having one person or one program attack an infrastructure, why not send a swarm of these algorithms? If I were an infrastructure and I was being attacked by a loose group of algorithms that weren’t really connected to one another—almost like a cyberinsurgency—how do I best defend against that? And if that attack is happening at a scale of under a second, I can’t be making decisions. I’ve got to write an algorithm, or a set of algorithms that do that.
It sounds like in those circumstances, unlike in the markets, algorithms could get more complex and thorny.
Absolutely. The technology exists in computer science to write what’s called genetic algorithms—it’s very easy to have an algorithm, when something hasn’t worked out for it, combine with another algorithm to produce an algorithm that is optimized when compared with the first two. You combine two pieces and you get something “better” in the new generation… Then you’ve got the problem of, if something goes wrong, you ask, “Who put that algorithm in there?” And everyone says, “Not me.” Who knows where that would head? But that’s what happens in an ecology—in the end, species combine, you get mutants.
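The recombination Johnson describes is the crossover step of a genetic algorithm. A toy sketch of the idea follows; every name here is illustrative, with a parameter vector standing in for a trading strategy and a made-up scoring function standing in for market performance.

```python
import random

random.seed(42)  # deterministic run for the example

# Each "algorithm" is just a vector of parameters; fitness is a stand-in
# score (negative squared distance to a target vector, so higher is better).
TARGET = [0.2, 0.8, 0.5, 0.1]

def fitness(genome):
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def crossover(a, b):
    # Splice two parents at a random cut point to produce a child.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.1):
    # Occasionally perturb a gene by a small Gaussian nudge.
    return [g + random.gauss(0, 0.05) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=30, generations=50):
    pop = [[random.random() for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

After a few dozen generations the surviving genome is typically much closer to the target than any random starting point, and, as Johnson notes, nothing in the final vector records which parent contributed which piece.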
If you were in charge of the stock market, what sort of regulations would you put into place?
I’d get people who did this kind of algorithmic coding, and I’d build a lab that mimics the market. Just as biologists build miniversions [of ecologies] in labs to try to understand the complex things at play, that’s what I’d do. I wouldn’t try to regulate [machine trading] out, I wouldn’t try to tax it out, I’d try to go with it and try to see one step ahead of it.