Researchers have released images of the Martian surface at five times greater resolution than previously seen, and they didn't need to send a new camera into space to get them.
In February, University College London researchers Yu Tao and Jan-Peter Muller published a paper in Planetary and Space Science detailing an image processing technique that compares several images of the same area of Mars taken from different angles to resolve detail down to 5cm, compared with the previous 25cm. On Tuesday, they released images taken from orbit that use the technique to focus on specific Martian objects—including what looks like the long-lost British lander Beagle 2, which was deployed from the Mars Express orbiter in 2003 but failed to make contact on landing.
"We wanted to establish whether or not we could see the surface of Mars—or any other planet, including our own—at much higher resolution by stacking a lot of images together that looked at the same [area] from different angles," Muller explained in a Skype call.
They used images from HiRISE, a high-resolution camera on NASA's Mars Reconnaissance Orbiter (MRO), which produces pretty awesome images from around 300km above the planet's surface. By "stacking" similar images, their technique essentially takes the best bits from each picture to pick out more detail.
"Every single picture you take with any kind of camera at all, there are bits and pieces of detail which are at higher resolution," Muller explained.
As the Martian surface changes very little, thanks to the planet's thin atmosphere and lack of weathering, the researchers were able to use images taken over an eight-year period. Their image processing technique, which they call Gotcha-PDE-TV, or GPT, matches pixels across images taken from different angles to pull out "sub-pixel" information and create a crisper picture.
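The researchers' GPT algorithm is far more sophisticated than anything that fits in a news story, but the basic idea behind multi-frame super-resolution—combining several slightly offset views of the same scene to recover detail finer than any single frame—can be sketched in a few lines of Python. Everything below (the toy scene, the box-average degradation model, the known pixel shifts) is invented for illustration and is not the authors' method:

```python
import numpy as np

def degrade(hr, shift, factor):
    """Simulate one low-res observation: shift the high-res scene by
    whole HR pixels, then box-average it down by `factor`."""
    shifted = np.roll(hr, shift, axis=(0, 1))
    h, w = shifted.shape
    return shifted.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def shift_and_add(frames, shifts, factor):
    """Naive 'shift-and-add' super-resolution: upsample each low-res
    frame onto the high-res grid, undo its known offset, and average."""
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    for frame, (dy, dx) in zip(frames, shifts):
        up = np.kron(frame, np.ones((factor, factor)))  # nearest-neighbour upsample
        acc += np.roll(up, (-dy, -dx), axis=(0, 1))     # re-register the frame
    return acc / len(frames)

if __name__ == "__main__":
    # A single bright point on an otherwise dark 8x8 scene.
    hr = np.zeros((8, 8))
    hr[4, 4] = 1.0
    shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
    frames = [degrade(hr, s, 2) for s in shifts]
    sr = shift_and_add(frames, shifts, 2)
    # The peak of the stacked image lands on the true HR pixel (4, 4),
    # which no single blurred frame could pin down on its own.
    print(np.unravel_index(sr.argmax(), sr.shape))
```

In this toy version the offsets between frames are known exactly; the hard part of real pipelines such as GPT is estimating those sub-pixel offsets from the imagery itself and regularising the reconstruction.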
They were able to compare some of their "super-resolution" images with images taken by NASA's Spirit rover on the surface, to verify that they had created an accurate picture. In one example, the processed image reveals the rover's tracks, which are indecipherable in the unprocessed images.
Another image shows an unusual-looking object in greater detail: the proposed Beagle 2 landing site, which was spotted last year. The researchers think their images add more evidence of the fate of the failed lander (it's not known exactly what went wrong with Beagle 2, but it's thought that its solar panels didn't fully deploy). Muller described the object as having the characteristic "Y" shape of Beagle 2 and appearing much more convincing than previously touted landing sites. "It's much more convincing that there is indeed something strange and artificial about this feature on the surface," he said. "I've never seen anything remotely like that anywhere on the surface of Mars, and I've looked at a large number of pictures."
Ultimately, the researchers claim the tool could offer a rover-quality "bird's-eye view" of planet surfaces without the costs—or risks—of actually sending a robot into space. Alternatively, it could help mission planners avoid situations like poor Beagle 2's by providing more detailed information about a landing site before deployment.
Muller also emphasised that the image processing technique could be applied to other images, for example of Earth. He suggested it could be used with images captured by Terra Bella (previously called Skybox), a Google-owned company that takes satellite images.
"If you're interested, for example, in environmental applications, or you're interested in assessing how much coal is being dug out the ground to make sure people aren't cheating on global climate targets, you need that kind of resolution," he said.