Pretty cool, huh?
What you are looking at is a visualization from the Dark Sky simulations, one of the largest and most sophisticated models of how dark matter evolves that has ever been produced.
Each point traces the density of dark matter, color-coded by its instantaneous velocity—the speed and direction of the particle at that moment—and interacts with other particles according to a complex algorithm that simulates the effects of gravity.
Simulations like Dark Sky, made by a small collaboration of scientists from Los Alamos National Laboratory, Stanford, the Paris Institute of Astrophysics, the University of Illinois, and the University of Chicago, help fill in the gaps not only between particle physics and cosmology, but also between the first moments of our universe and the present day. They provide crucial insight into the mystery of dark matter, which is generally considered to be the cosmological scaffolding upon which everything else in the universe is built.
In the standard cosmological model, dark energy comprises roughly 68 percent of our universe, dark matter about 27 percent, and baryonic matter—which we might consider as "normal" or visible matter—a measly 5 percent. Clearly, dark matter and dark energy are integral to the existence of life as we know it, but despite their importance, we know remarkably little about them.
This visualization by Ralf Kaehler shows a slice of one of the Dark Sky simulations, where the particles are built up in Morton order, just as they are stored in memory and on disk.
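Morton order (also called Z-order) interleaves the bits of a particle's grid coordinates so that particles near each other in space tend to land near each other in memory. The simulation's actual implementation isn't shown in the article, but a minimal sketch of the idea looks like this:

```python
def morton3d(x, y, z, bits=10):
    """Interleave the bits of three integer grid coordinates into a
    Morton (Z-order) code. Sorting particles by this code keeps
    spatially close particles close together in memory and on disk."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)      # x bits at positions 0, 3, 6, ...
        code |= ((y >> i) & 1) << (3 * i + 1)  # y bits at positions 1, 4, 7, ...
        code |= ((z >> i) & 1) << (3 * i + 2)  # z bits at positions 2, 5, 8, ...
    return code

# Cells that are neighbors in space get nearby codes:
print(morton3d(0, 0, 0))  # → 0
print(morton3d(1, 0, 0))  # → 1
print(morton3d(1, 1, 1))  # → 7
```

This locality is why tree-based gravity codes favor the ordering: a contiguous chunk of the particle array corresponds to a compact region of space.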
This is where Dark Sky comes in, hoping to shed some light on the situation, so to speak, by rendering the interactions between dark matter and dark energy visible.
"[Dark Sky] is a statistically similar universe to our own, but it's not actually simulating our universe," said Sam Skillman, a postdoctoral researcher at Stanford's Kavli Institute for Particle Astrophysics and Cosmology. "We run simulations, each of which is representative of what a universe like ours should look like. Then we compare it to our universe and see if it's consistent."
Dark matter was first postulated by Jan Oort in 1932 based on his observation of stellar motion within the Milky Way. The measurements which led him to conclude that dark matter must exist were later discovered to be erroneous, however, so the real theoretical origins of dark matter reside with Fritz Zwicky. In 1933, Zwicky discovered that the motion of a galaxy cluster he was observing was inconsistent with the amount of mass calculated to be present in that cluster.
This "missing mass" observed by Zwicky is now referred to as dark matter, the invisible scaffolding of the universe which makes the formation of baryonic structures like galaxies possible. Understanding dark matter will provide crucial insight into early galactic formation, an area of our universal history about which we know relatively little.
The Dark Sky simulations provide a statistical baseline against which researchers can measure their observations of the universe.
It is clear that dark matter plays a crucial role in our understanding of the cosmos, but despite the first evidence for its existence being postulated nearly a century ago, scientists have yet to directly observe it. This is because dark matter, to our knowledge, neither emits nor absorbs light, which makes it impossible to see with conventional observational tools. As such, cosmologists must deduce its existence from the way light bends (gravitational lensing) and from the movement of stars around their galaxies, which is much faster than conventional gravitational models—those that do not account for dark matter—would allow.
In short, the Dark Sky simulations are providing cosmologists and astrophysicists with a crucial statistical baseline against which they can measure their observations in order to deduce just how dark matter and dark energy are interacting with baryonic matter.
According to the Dark Sky team's report, "predictions from numerical models are now critical to almost every aspect of precision studies of dark matter and dark energy." The mathematical complexity that emerges from the non-linear nature of gravity makes calculating its properties analytically, without the aid of simulations, basically impossible.
"What the simulations are giving you is a very detailed look at what gravity is doing," said Daniel Holz, a senior member of the University of Chicago's Kavli Institute for Cosmological Physics. "Then you compare the observations to the simulations, from which you can infer the properties of dark energy and dark matter. Very little of this would be possible to do with just pen and paper."
The Dark Sky Simulations are N-body simulations, which is shorthand for dynamic modeling in which particles interact with one another according to a pre-determined set of physical laws. In the case of Dark Sky, the team used data collected from the Planck and WMAP spacecraft, probes which were launched to measure temperature fluctuations in the cosmic microwave background, the leftover thermal radiation from the Big Bang. The data collected by these spacecraft helped shape a picture of the universe when it was just a few hundred thousand years old and this data was used to set the initial conditions for the Dark Sky simulation.
Once these initial conditions were determined, the team plugged in the code that accounts for gravitational force and watched how the particles interacted. Despite the fact that the team was only attempting to simulate how a single force (gravity) acted on the particles, the algorithm needed to produce a realistic simulation from this force is immensely complex.
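The core of any N-body simulation is this loop: compute the gravitational pull every particle exerts on every other, then nudge positions and velocities forward by a small time step. This is not the Dark Sky team's actual code—at a trillion particles, production codes use tree-based approximations rather than the brute-force pairwise sums shown here—but a direct-summation sketch captures the basic idea:

```python
import numpy as np

def gravity_step(pos, vel, mass, dt, G=1.0, soft=0.01):
    """One step of a direct-summation N-body integration.

    pos, vel: (N, 3) arrays of positions and velocities
    mass: (N,) array of particle masses
    soft: softening length that avoids the singularity when two
          particles get arbitrarily close

    Every particle feels every other particle, an O(N^2) cost that
    production codes tame with tree algorithms."""
    # Pairwise separation vectors: diff[i, j] = pos[j] - pos[i]
    diff = pos[None, :, :] - pos[:, None, :]
    # Softened pairwise distances (the i == j self-term contributes
    # nothing because diff[i, i] is zero)
    dist = np.sqrt((diff ** 2).sum(axis=-1) + soft ** 2)
    # Acceleration on each particle summed over all others
    acc = G * (mass[None, :, None] * diff / dist[:, :, None] ** 3).sum(axis=1)
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

# Two equal-mass particles a unit distance apart drift toward each other:
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
vel = np.zeros((2, 3))
mass = np.ones(2)
pos, vel = gravity_step(pos, vel, mass, dt=0.01)
```

Repeating this step millions of times over billions of years of simulated time is what lets the clumping Holz describes below emerge from nothing but Newtonian attraction.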
"On the largest scales, when you're talking about the universe itself, gravity is the only force that matters," Holz told me over the phone. "The equation is just gravity, but it's very hard to solve because gravity is non-linear. This means that gravity causes things to 'clump' and the more they're clumping the stronger the gravity will be, which will in turn cause them to clump more. As this process continues, it becomes very hard to describe what is happening mathematically just from first principles."
It is precisely this non-linearity of gravity which accounts for the complexity of the code that underlies the Dark Sky simulations. The first 50 terabytes of data resulting from the initial simulations were released to the public for free last year because, as one Dark Sky team member said, "there's no way we can do all the science that should be done with it."
The codes undergirding the simulation trace back to Dark Sky's lead investigator, Michael Warren of Los Alamos National Laboratory, in the late '80s, although at the time the algorithm wasn't specifically designed for Dark Sky. Warren's codes have been used in a number of cosmological simulations over the years, their complexity limited only by the computing power available.
When Warren was initially starting out, the number of particles he was using was on the order of 10^6. Keeping pace with Moore's Law, the complexity of these simulations (in terms of the number of particles) would double roughly every 18 months, so that by the time the team was ready to run their Dark Sky simulations last year, they were operating with particles on the order of 10^12.
"There's no way we can do all the science that should be done with it."
Obviously, for this many particles, they were going to need a lot of space for them to move around in. According to Skillman, when creating a simulation like Dark Sky there is always a trade-off between the volume covered and the number of particles contained within that space. Because the Dark Sky simulations were using 1 trillion particles, they had to make their "box" sufficiently large to handle the particles without messing up the gravitational dynamics. The result was a massive box which measures 8 gigaparsecs, about 25 billion light years, on each side.
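Those two numbers pin down the trade-off Skillman describes. Converting the box edge to light years (one parsec is about 3.26 light years) recovers the article's rounded figure, and dividing the edge by the cube root of the particle count gives the average spacing between particles—the effective resolution of the simulation:

```python
# Edge length of the Dark Sky box, converted to light years
PC_IN_LY = 3.2616                 # light years per parsec
edge_gpc = 8                      # box edge in gigaparsecs
edge_ly = edge_gpc * 1e9 * PC_IN_LY   # ≈ 2.6e10, i.e. ~25 billion light years

# Volume vs. particle-count trade-off: with 10^12 particles,
# the mean inter-particle spacing is edge / (particle count)^(1/3)
edge_mpc = edge_gpc * 1e3         # edge in megaparsecs
spacing_mpc = edge_mpc / 1e12 ** (1 / 3)  # ≈ 0.8 Mpc between particles

print(f"edge ≈ {edge_ly:.2e} ly, mean spacing ≈ {spacing_mpc:.2f} Mpc")
```

Shrink the box and you resolve finer structure; grow it and you capture more of the universe's large-scale statistics—with a fixed particle budget, you can't do both.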
In order to run their massive simulation, the group needed to gain access to equally massive computing power. Fortunately, the Dark Sky team managed to secure the use of Oak Ridge National Laboratory's Titan, the second most powerful supercomputer in existence, for 80 million computing hours with a grant through the US Department of Energy.
The group made use of two-thirds of Titan's nodes for the ds14 simulation above, using up 10 million CPU hours over the course of 33 hours in April of 2014. All 80 million CPU hours were exhausted by December of 2014, resulting in a total of 500 terabytes of simulation data. Although the earliest results were released on arXiv last July, the remaining 450 terabytes of data have yet to be released in any significant quantity, due to lack of time and funding for the project.
"The Dark Sky simulations and others like it are an important pillar to our current understanding of cosmology, said Skillman. "By releasing our data to the public, we enable comparison between ongoing surveys and our theoretical understanding of the universe."
The simulations run by the Dark Sky team represent a major step forward in understanding how we got here and where we are going. There is a lot to be learned from these simulations and according to the Dark Sky team, they have just barely begun to scratch the surface of the science enabled by their modeling. If the amount of science enabled by the release of just 1/10 of the entire data set is any indication, the possibilities engendered by Dark Sky are virtually endless, just like the universe they are simulating.
Correction: An earlier version of this story stated the order of magnitude for the number of particles Michael Warren worked with initially as 10^3; the sentence has been corrected to say 10^6.
Perfect Worlds is a series on Motherboard about simulations, imitations, and models. Follow along here.