Sometime between March 2010 and May 2012, a meteor streaked across the Martian sky and broke into pieces, slamming into the planet's surface. The resulting craters were relatively small - just 13 feet (4 meters) in diameter. The smaller the features, the more difficult they are to spot using Mars orbiters. But in this case - and for the first time - scientists spotted them with a little extra help: artificial intelligence (AI).
It's a milestone for planetary scientists and AI researchers at NASA's Jet Propulsion Laboratory in Southern California, who worked together to develop the machine-learning tool that helped make the discovery. The accomplishment offers hope for both saving time and increasing the volume of findings.
Typically, scientists spend hours each day studying images captured by NASA's Mars Reconnaissance Orbiter (MRO), looking for changing surface phenomena like dust devils, avalanches, and shifting dunes. In the orbiter's 14 years at Mars, scientists have relied on MRO data to find over 1,000 new craters. They're usually first detected with the spacecraft's Context Camera, which takes low-resolution images covering hundreds of miles at a time.
Only the blast marks around an impact will stand out in these images, not the individual craters, so the next step is to take a closer look with the High-Resolution Imaging Science Experiment, or HiRISE. The instrument is so powerful that it can see details as fine as the tracks left by the Curiosity Mars rover. (The HiRISE team allows anyone, including members of the public, to request specific images through its HiWish page.)
The process takes patience, requiring 40 minutes or so for a researcher to carefully scan a single Context Camera image. To save time, JPL researchers created a tool - called an automated fresh impact crater classifier - as part of a broader JPL effort named COSMIC (Capturing Onboard Summarization to Monitor Image Change) that develops technologies for future generations of Mars orbiters.
An interactive 3D visualization of the Mars Reconnaissance Orbiter is available at NASA's Eyes on the Solar System. Image credit: NASA/JPL-Caltech
Learning the Landscape
To train the crater classifier, researchers fed it 6,830 Context Camera images, including images of locations with previously discovered impacts that had already been confirmed via HiRISE. The tool was also fed images with no fresh impacts in order to show the classifier what not to look for.
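The article does not describe the classifier's internals, but the training setup it outlines - labeled positive and negative image examples - is standard supervised binary classification. The sketch below is an illustrative assumption in PyTorch, not the JPL pipeline: random tensors stand in for labeled Context Camera image crops, and the small convolutional network, image size, and training loop are all invented for demonstration.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in data: 256x256 grayscale crops; label 1 = fresh impact, 0 = no impact.
images = torch.rand(200, 1, 256, 256)
labels = torch.randint(0, 2, (200,)).float()
loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)

# Small convolutional network producing a single "fresh impact" logit.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(3):
    for batch_images, batch_labels in loader:
        optimizer.zero_grad()
        logits = model(batch_images).squeeze(1)   # one score per image
        loss = loss_fn(logits, batch_labels)      # compare against impact labels
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

In practice the negative examples (images with no fresh impacts) matter as much as the positives: they teach the model to ignore dust devils, shadows, and other dark features that can mimic impact blast marks.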
Once trained, the classifier was deployed on the Context Camera's entire repository of about 112,000 images. Running on a supercomputer cluster at JPL made up of dozens of high-performance computers operating in concert, the AI tool needs an average of just five seconds for a task that takes a human about 40 minutes.
One challenge was figuring out how to run up to 750 copies of the classifier across the entire cluster simultaneously, said JPL computer scientist Gary Doran. "It wouldn't be possible to process over 112,000 images in a reasonable amount of time without distributing the work across many computers," Doran said. "The strategy is to split the problem into smaller pieces that can be solved in parallel."
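Doran's "split the problem into smaller pieces" strategy can be illustrated with a rough single-machine analogy. The sketch below is an assumption, not the JPL code: it divides a hypothetical list of image files into chunks and scores the chunks in parallel worker processes, standing in for the ~750 classifier copies spread across the cluster.

```python
from multiprocessing import Pool

def classify_chunk(image_paths):
    """Run the trained classifier on one chunk of Context Camera images.
    A placeholder score is used here; a real run would load the model."""
    candidates = []
    for path in image_paths:
        score = 0.0  # placeholder for the classifier's fresh-impact probability
        if score > 0.9:
            candidates.append((path, score))
    return candidates

def chunked(items, n_chunks):
    """Split a list into roughly equal chunks, one per worker."""
    size = max(1, len(items) // n_chunks)
    return [items[i:i + size] for i in range(0, len(items), size)]

if __name__ == "__main__":
    # Hypothetical file names standing in for the ~112,000-image repository.
    all_images = [f"ctx_image_{i:06d}.img" for i in range(112000)]
    # A cluster would spread ~750 workers across many nodes; 8 local processes here.
    with Pool(processes=8) as pool:
        results = pool.map(classify_chunk, chunked(all_images, 8))
    detections = [d for chunk in results for d in chunk]
    print(f"{len(detections)} candidate fresh-impact detections")
```

Because each image can be scored independently, the work parallelizes cleanly: adding more workers cuts the wall-clock time roughly in proportion, which is what makes processing the full archive feasible.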
But despite all that computing power, the classifier still requires a human to check its work.
"AI can't do the kind of skilled analysis a scientist can," said JPL computer scientist Kiri Wagstaff. "But tools like this new algorithm can be their assistants. This paves the way for an exciting symbiosis of human and AI 'investigators' working together to accelerate scientific discovery."
On Aug. 26, 2020, HiRISE confirmed that a dark smudge detected by the classifier in a region called Noctis Fossae was in fact the cluster of craters. The team has already submitted more than 20 additional candidates for HiRISE to check out.
While this crater classifier runs on Earth-bound computers, the ultimate goal is to develop similar classifiers tailored for onboard use by future Mars orbiters. Right now, scientists must sift through the data sent back to Earth to find interesting imagery, much like searching for a needle in a haystack, said Michael Munje, a Georgia Tech graduate student who worked on the classifier as an intern at JPL.
"The hope is that in the future, AI could prioritize orbital imagery that scientists are more likely to be interested in," Munje said.
Ingrid Daubar, a scientist with appointments at JPL and Brown University who was also involved in the work, is hopeful the new tool could offer a more complete picture of how often meteors strike Mars and also reveal small impacts in areas where they haven't been discovered before. The more craters that are found, the more scientists add to the body of knowledge of the size, shape, and frequency of meteor impacts on Mars.
"There are likely many more impacts that we haven't found yet," she said. "This advance shows you just how much you can do with veteran missions like MRO using modern analysis techniques."
For more information about MRO:
www.nasa.gov/mission_pages/MRO/main/index.html
JPL, a division of Caltech in Pasadena, California, manages the MRO mission for NASA's Science Mission Directorate in Washington. The University of Arizona, in Tucson, operates HiRISE, which was built by Ball Aerospace & Technologies Corp., in Boulder, Colorado. MARCI and the Context Camera were both built and are operated by Malin Space Science Systems in San Diego.