Parallel Computing Allows Us to Process Incredible Amounts of Data

Multicore processors have quickly and quietly become extraordinarily powerful, making our lives more efficient, almost without us noticing. But where is it all going?

Rapid advances in the computing power of processors are making possible scientific and engineering breakthroughs in just about every field, every day. From weather forecasting to fan design, and from traffic patterns to healthcare, this power is quietly making our lives more efficient by allowing us to collect and crunch massive amounts of raw data. The breakthroughs are coming so thick and fast that no one can really predict where it's all going.

But it's fun to try.

Motherboard asked James Reinders—an author, Intel® Software Evangelist, and expert in what's known as "parallel programming"—how this computing brawn is changing the world, and what we should expect in the future.

Motherboard: Let's start from the beginning. What is "parallel computing"?

James Reinders: Computer systems have historically done one thing at a time, but about 10 years ago, individual microprocessors started going what we call "multicore"—they had multiple processing cores in each processor. So, for the first time, they could do more than one thing at a time. It started gradually around 2005, with processors coming out with two cores in them. But today, servers commonly have parts with 2, 4, or 8 cores, and even 18-core parts are commonplace. We also have a new processor, certain to be a hit for supercomputers, coming out this year that has 72. Phones and tablets commonly have 4 or more cores in them.

MB: So in 2005, having two processing cores on one chip was a major breakthrough. Now we can have 72? That sounds insanely powerful.

JR: Yes, on one chip. It's taken off. The computational power is incredible. And that brings us to the challenge of parallel programming, because these chips can be used in several ways. One is that you can just do lots of things on them. When I use a quad-core laptop and I'm running PowerPoint, or Excel, or mail, and so on, most of those programs take one core, maybe two. But when I'm running multiple programs, I can keep my computer busy, and it's much more responsive than in the old days, when the machine would fake that you were running the programs at the same time but in reality would be switching back and forth. It's just like people: you can imagine that one person doing five tasks isn't quite as responsive and fast as, say, 72 people doing those tasks.

Parallel programming is what a software developer does to get all those cores helping do one program faster. You can think of multicore processors, or supercomputers with tens or hundreds of thousands of multicore processors, as either being able to do a bunch of things at once, or to be able to do one thing faster. When you try to do one thing faster you have to do what we call parallel programming. You have to write a program that understands the concepts of concurrency or parallelism in the application itself.
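
To make that concrete, here is a minimal sketch of the idea (our illustration, not Reinders's code): one job, summing the squares of a large range of numbers, is carved into chunks so that every core works on its own piece at the same time.

```python
# A minimal sketch of parallel programming: one job split across all cores.
# Uses only the Python standard library.
import os
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    # The work each core does independently on its own slice of the data.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = range(10_000_000)
    cores = os.cpu_count() or 1

    # Carve the data into one chunk per core.
    size = len(data) // cores + 1
    chunks = [data[i:i + size] for i in range(0, len(data), size)]

    # Each chunk runs in parallel on its own core; the results are combined.
    with ProcessPoolExecutor(max_workers=cores) as pool:
        total = sum(pool.map(partial_sum, chunks))

    print(f"{cores} cores computed {total}")
```

The serial version would be a one-line sum; the parallel version has to reason explicitly about how the work is divided and recombined, which is exactly the shift in thinking Reinders describes.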

And there's something else going on, too. I worked on the world's first TeraFlop supercomputer; "tera" meaning it was able to do a trillion math computations every second. It had over 9,000 processors and consumed about 850 kilowatts of power. To give you an idea, to run that machine, with that amount of power, would cost a little shy of a million dollars a year for just the electric bill. As of about 3 years ago, we could do that same computation—a trillion math computations a second—on a device that consumes only 300 watts, or about the power of three 100W light bulbs.
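
For a back-of-the-envelope check of that figure, assume a commercial electricity rate of roughly $0.12 per kilowatt-hour (our assumption, not his): 850 kilowatts running around the clock is 850 × 24 × 365 ≈ 7.4 million kilowatt-hours a year, which works out to about $900,000, indeed a little shy of a million dollars.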

MB: So, this changes things considerably?

JR: More than anything it opens up so much more discovery. Most of us can't afford a million-dollar-a-year electric bill, so that level of computation was inaccessible. Now that it's only 300 watts, we can all play with something like that. It has two effects. For the people who can afford the huge electric bill, they can build machines that reach farther than we ever could. For the rest of us, we can have this computing power that was out of our reach.

MB: What are some examples of what we can do with this insane amount of processing power?

JR: There are a lot of things. On one hand we can work on things like drug discovery, including analysis of what drugs might cure a certain type of cancer or disease. Or we might explore the deformation of materials: crashes, collisions, etc. Or we might analyze wind flow over a new aircraft design. There are seemingly unending applications in science and engineering fields alone, and I'm sure there are many other fields that will benefit.

One thing I find fascinating is that we've generally had to settle for "approximations." We know the world works a certain way, and we have very complex equations that explain how it works, but we haven't had computers powerful enough to do the number crunching to use the real equations. So we come up with approximations. They've helped do a lot of cool stuff leading to many breakthroughs. But today is even better, because now that computers have gotten powerful enough, we see a lot of these applications going back and using the computational power to calculate things more accurately.

Similarly, this rise in compute power has had a profound impact on weather forecasting. I think a lot of people haven't noticed, but weather forecasting has come a long way from being kind of so-so, when you were never sure if you could trust the weatherman. In the last five years, it's become really reliable. By any metric, weather predictions have gotten awfully precise, able to give you hour-by-hour precipitation and so on. It might be raining at my house, but someone who lives 10 miles from here will be told on their phone that it's not going to rain. It's not completely accurate, but it's far beyond anything we had five years ago, and that's happening because the weather models are using this high degree of computing to be much more accurate. What you've been calling an "insane" amount of compute power is making our weather predictions much more "sane": what we've wanted all along!

MB: What are some other examples?

JR: One of the little things I love is that extreme compute power is affecting every area. When I worked on the TeraFlop machine in the late '90s, the only organizations that would spend that kind of money were government interests. In that particular case, it was what they called "nuclear stockpile stewardship," which meant making sure our nukes blow up when you push the button and don't when you don't push the button. That's a complicated problem, and it was worth a lot of money to the U.S.

But now, because it is much more affordable, we can use this level of compute power for lots of things. Things like theoretical physicists wondering about the origin of the universe. Stephen Hawking's group has been analyzing data from satellites looking at cosmic background radiation, which is the earliest evidence of the Big Bang. You've got the Large Hadron Collider, the world's largest science project to date. That's enabled by immense computer power.

Or oil exploration. I think a lot of people don't know how we find oil, but it's found by doing complex numerical computations, including what they call "seismic analysis." Geologists build computer models that analyze and predict what structures are underground; certain structures tend to be more oil-bearing than others. If you're going to spend tens or hundreds of millions of dollars drilling an oil well, you want a high probability of hitting oil. Computers enable that.

One of the areas getting a lot of attention these days is something called Big Data, which is the idea that the world is full of information, and if you apply computers to sifting through it you can actually make a surprising number of new discoveries. It took humans a long time to pay attention to patterns. John Snow in England, in the mid-1800s, started putting dots on a map of London where cholera deaths were happening, and where those people lived. He found out there were clusters. And then he found out it was because the water was contaminated. No one else had figured it out; before that, people viewed disease as kind of random. It's a great example of the power of data.

We've struggled as humans to take in all the data and start to see trends. Things like: if you're a child and you consume fresh strawberries before you're two, there's a high correlation with developing childhood allergies. This has been known for a few decades now. Imagine if you had a computer looking at all these facts and trying to correlate them. What might it notice?
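
As a toy illustration of that kind of correlation hunting (with entirely made-up numbers, not real health data), here is how a computer might check whether one childhood fact lines up with another across a pile of records:

```python
# Toy correlation hunt over made-up child health records.
# Each record: (ate strawberries before age two, developed allergies).
from statistics import correlation  # Python 3.10+

records = [(1, 1), (1, 1), (1, 0), (1, 1), (0, 0),
           (0, 0), (0, 1), (0, 0), (1, 1), (0, 0)]

exposure = [e for e, _ in records]
allergies = [a for _, a in records]

# A value near +1 means the two facts tend to occur together;
# near 0 means no apparent relationship. Correlation is not causation.
print(f"correlation = {correlation(exposure, allergies):.2f}")
```

The point isn't the arithmetic, which is trivial; it's that a computer can run this check across millions of records and thousands of candidate factors at once.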

MB: There's been a lot written about the impending "Internet of Things," where sensors in everyday objects are collecting huge amounts of raw data that we can use to identify patterns and make conclusions. Do we have the computational power to make that happen now?

JR: Absolutely. The term a lot of us use is "the cloud." Some real visionaries started using that term more than a decade ago. The way I'd put it is this: You can have an incredible amount of computer power somewhere in the cloud, and we found a way now to connect ourselves to it. That means we can build these tiny little devices that power the Internet of Things, and they don't have much intelligence. They don't store huge amounts of data, but because they can talk to a computer of huge compute capability, with huge amounts of data, that little device can ask a question and we can throw an immense amount of compute power at it for a tiny amount of time and get an answer very quickly.
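
Here is a toy sketch of that division of labor (our illustration; in a real system the two sides would talk over a network): the device holds one small reading, while the simulated "cloud" holds the big dataset and does the heavy computation.

```python
# Toy sketch: a tiny "device" asks a big "cloud" to do the heavy lifting.
# The cloud is simulated in-process so the sketch runs anywhere; in
# reality the question and answer would travel over a network.

CLOUD_HISTORY = list(range(1_000_000))  # stand-in for the cloud's huge dataset

def cloud_answer(query: float) -> float:
    # Heavy number crunching the tiny device could never do itself.
    return sum(abs(x - query) for x in CLOUD_HISTORY) / len(CLOUD_HISTORY)

# The device side: one small number out, one small answer back.
sensor_reading = 21.5
print(f"cloud says: {cloud_answer(sensor_reading):.1f}")
```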

A lot of us have used Google Maps on our phones to navigate, for instance. It's collecting this immense amount of data from traffic sensors and from other people's phones to figure out where the traffic jams are, and it can compute hundreds of routes, compare them all, and say, "Well, there are three routes. This one will take you one extra minute. This one will take you three extra minutes." That's because of the compute power in the cloud. It makes our phones look really powerful. I don't want to diminish what the phones are doing, but most of what makes that experience good is the compute power in the cloud. You get the same thing with little Internet of Things sensors.
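
The final comparison step he describes is simple once the cloud has done the heavy route computation; a toy version (with made-up travel times) might look like this:

```python
# Toy version of the route-comparison step, with made-up travel times.
routes = {"freeway": 32, "highway": 35, "surface streets": 41}  # minutes

best = min(routes.values())
for name, eta in sorted(routes.items(), key=lambda kv: kv[1]):
    print(f"{name}: {eta} min (+{eta - best} min)")
```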

MB: I understand all this is developing rapidly, but what are some major breakthroughs you see coming down the pipeline?

JR: One thing is the invention of new materials and new designs. I don't know if you've ever heard of Dyson; it's a pretty famous company that makes things like vacuum cleaners. They have a fan that is bladeless. It fascinates people. It creates what is basically a jet stream—I'm not sure if that's the exact right word for it—but it's similar to how jet engines work. It's a highly engineered fan. They used supercomputing capabilities to do wind-flow analysis. It's not something someone just carved out in their workshop. We may not realize it, but we're seeing more and more products whose designs are just way beyond the ability of someone sitting in a workshop hammering something out.

And then there's medicine. We're going to see the Internet of Things bring efficiency to a lot of everyday problems we deal with, ranging from traffic congestion to pollution. But probably the biggest area is medicine. The application of supercomputer power and parallelism is going to continue to revolutionize medicine as we know it.

The medical field talks about it as individualized medicine. What they mean by that is we now have enough compute power that we can start to think about analyzing you, as opposed to just applying statistics of what happens when a certain disease hits people in general. We can start to collect the data of what that disease did to different people, and try to correlate it to their genes, their environment, their condition, to start to more accurately predict what the disease will do inside you, and what we can do about it.

MB: Or drug interactions. That stuff can get incredibly complicated.

JR: Yeah, it's amazing, because drugs behave differently in different people, the same way food does. There are some people that better not be in the same room as a peanut, but the rest of us can chow down on them at the ballpark. Medicines can be the same way.

Yes, medicines that have cured people have not been as nice to other people—the same medicine for the same disease. It's because we're all a little bit different. But we're getting to the point now where we might be able to efficiently analyze the differences in people, and get better at coming up with individualized cures. That's the next evolution of medicine, and it's being enabled by immense compute power. It's not possible any other way.

To learn more about Intel® Software Evangelists, please visit evangelists.intel.com.