Supercomputer-Powered Models Are About to Make Weather Forecasts More Accurate
The UK Met Office has an improved forecasting model, but it's already awaiting the next revolutionary supercomputer system.
The UK's Met Office has updated its forecasting systems to give more accurate predictions of every Brit's favourite topic of conversation: the weather.
The "major change" is a code with the rather ominous-sounding nickname of ENDGame (short for "Even Newer Dynamics for General atmospheric modelling of the environment), which gives meteorologists more accurate data. But while ENDGame has only been in place since the summer, it already seems that the next update will offer an even more revolutionary change to how we predict the weather.
ENDGame is an upgrade to the dynamical core underpinning the models the Met Office uses, and it essentially maps weather across the country with greater resolution than its predecessors.
The basic problem the system has to work with remains the same. "Suppose we knew exactly the state of the atmosphere today—the temperatures and velocity and moisture contents of all the fluid particles in the atmosphere—how would we use that information to predict what it's going to be like in six hours' time or two or three days' time?" said John Thuburn, a mathematician from the University of Exeter who worked on the project.
Mathematical equations can give you those answers, but they're not easy to solve—so supercomputer models are used to give the best approximations. "To keep these models running smoothly, they need a certain amount of damping," explained Thuburn.
This essentially removes some detail from the data so it can be processed faster and more stably, and it's where ENDGame represents an improvement. The new system has less of the artificial damping, which, according to a Met Office report on the model, "provides a significant improvement to our current prediction capability."
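To make the trade-off concrete, here's a minimal sketch (not the Met Office's actual scheme) of diffusion-style damping: each smoothing pass nudges every grid point toward the average of its neighbours, which keeps a numerical model stable but wipes out fine detail—the detail ENDGame preserves by needing less of this.

```python
import math

def damp(field, alpha=0.25):
    """One damping pass: nudge each point toward its neighbours' average.
    Illustrative only; real dynamical cores use more sophisticated filters."""
    n = len(field)
    return [
        field[i] + alpha * (field[(i - 1) % n] - 2 * field[i] + field[(i + 1) % n])
        for i in range(n)
    ]

# A smooth wave with a sharp grid-scale wiggle superimposed on it.
field = [math.sin(2 * math.pi * i / 32) + 0.5 * (-1) ** i for i in range(32)]
for _ in range(10):
    field = damp(field)
# The grid-scale wiggle is damped almost to nothing; the smooth,
# large-scale wave survives the passes largely intact.
```

The point of the sketch: damping is selective by scale, so the small features that make a forecast locally accurate are exactly the ones it erases first.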
But despite the hype around ENDGame, which the Met Office is already using for global predictions and intends to use in regional predictions from next year, mathematical minds are already working on the next dynamical core with an equally badass name: GungHo.
"A key limitation to the scalability of the current Unified Model is the latitude-longitude grid."
"That might well be a bit more of a revolution," said Thuburn. The GungHo project would require a whole new model that would change the way weather equations are solved.
"To solve these equations that we talked about earlier, we have to store the data that the describes the atmosphere on some sort of grid," explained Thuburn. "The obvious way to put a grid on a spherical Earth is to use a latitude-longitude grid system."
But cross-hatching the Earth like that results in a "clustering" of the grid at the poles. Thuburn said that the real bottleneck in processing the information is not the amount of data itself, but moving the information around between processors. "Getting those points near the pole to talk to each other, they've got to move a lot of data around between processors," he said.
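You can see the clustering with a few lines of arithmetic. This sketch (my illustration, not Met Office code) computes the east-west width of a one-degree grid cell at different latitudes: it shrinks with the cosine of latitude, so near the poles the same one-degree cell is a sliver, and the points packed into it all need to exchange data.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius, a standard approximation

def cell_width_km(latitude_deg, dlon_deg=1.0):
    """East-west width (km) of a cell spanning dlon_deg degrees of
    longitude at the given latitude on a latitude-longitude grid."""
    dlon_rad = math.radians(dlon_deg)
    return EARTH_RADIUS_KM * math.cos(math.radians(latitude_deg)) * dlon_rad

# At the equator, a one-degree cell is roughly 111 km wide.
print(round(cell_width_km(0), 1))
# At 89 degrees latitude, the same cell is under 2 km wide --
# the "clustering" that throttles communication between processors.
print(round(cell_width_km(89), 1))
```

That hundredfold squeeze is why quadrilateral grids hit a scaling wall at the poles, and why the alternatives below avoid singular points entirely.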
To get around this, researchers are considering a whole new way of dividing the Earth into a grid. They're looking at hexagon- and pentagon-type grids, similar to a geodesic dome, and other non-quadrilateral ways to dissect the globe. You can see some of the resulting pretty shapes in the office's 10-year roadmap study.
"They all require quite a lot of a rethink of the methods we use for solving our equations," said Thuburn, explaining they'd "really have to go back to basics" if they change that fundamental aspect of their forecasting models. But the new model will be necessary if it's going to scale onto the next generation of supercomputers and be "be efficiently solved on hundreds of thousands of processors," as one outline puts it.
"A key limitation to the scalability of the current Unified Model is the latitude-longitude grid and in particular the singular nature of that grid at the two poles," researchers Andy Brown and Nigel Wood explain in a report.
GungHo is currently slated for use in around 2020. In the meantime, there's plenty of opportunity to blame it on the weatherman—or at least on his model's dynamical core.