
NASA’s Perseverance Mars Rover Lands on Mars

by Fahid Safdar

After its successful landing on Mars on February 18, NASA’s Perseverance rover is relying on a variety of onboard artificial intelligence systems designed to guide it through a two-year exploratory mission on the planet.

The Perseverance rover completed a 293-million-mile interplanetary voyage from Earth and landed safely on the Martian surface at 3:55 p.m. Eastern Standard Time on February 18. The spacecraft carrying it hit the Martian atmosphere at about 12,100 miles per hour and, over roughly seven minutes, slowed to zero as the rover touched down safely on the surface.
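As a rough back-of-the-envelope illustration of that descent, the sketch below treats the deceleration as constant, which is a simplifying assumption; the real entry, descent, and landing profile is far from uniform. The speed and duration figures come from the paragraph above.

```python
# Rough average deceleration during entry, descent, and landing (EDL).
# Assumes a constant slowdown from 12,100 mph to 0 over ~7 minutes,
# which is a simplification of the real EDL profile.
MPH_TO_MS = 0.44704                 # miles per hour -> meters per second
entry_speed = 12_100 * MPH_TO_MS    # ~5,409 m/s
edl_duration = 7 * 60               # seconds

avg_decel = entry_speed / edl_duration
print(f"Entry speed: {entry_speed:,.0f} m/s")
print(f"Average deceleration: {avg_decel:.1f} m/s^2 (~{avg_decel / 9.81:.1f} g)")
```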

NASA engineers and mission controllers then checked the rover’s various systems to make sure everything was in good condition after the long voyage, before beginning the planned scientific experiments aimed at finding traces of microscopic life that may have existed on Mars billions of years ago.

Raymond Francis, a science operations engineer at NASA’s Jet Propulsion Laboratory, said that to carry out these experiments and missions, Perseverance uses more artificial intelligence functions than the previous four Mars rovers, including the Curiosity rover, which arrived on Mars on August 6, 2012, and is still in operation.

Francis said: “Compared with Curiosity, which carries out the Mars Science Laboratory (MSL) mission, the artificial intelligence systems on Perseverance have been upgraded. Some of the functions come from our improvements and upgrades to Curiosity.”

The role of artificial intelligence on the Perseverance Mars rover

Francis said that the main artificial intelligence system on board Perseverance enhanced the landing, allowing the rover to touch down safely in the 28-mile-wide Jezero Crater. The landing was very dangerous because the site contains a river delta, cliffs, dunes, boulders, and smaller craters.

He said this is where the Terrain Relative Navigation (TRN) system, which uses artificial intelligence technology, plays an important role. He said: “The Perseverance rover has a camera that takes one or more images as it descends toward the landing site. The rover carries a topographic map that it can match against the captured images to identify where it is. It then calculates where the image was taken and where it is going to land.”
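To make the matching idea concrete, the Python sketch below slides a small descent image over a stored map and keeps the best-fitting offset. This is only a toy illustration of the concept; the real TRN system matches landmarks against an orbital map and fuses the result with inertial measurements.

```python
import numpy as np

def locate_on_map(descent_image: np.ndarray, onboard_map: np.ndarray):
    """Estimate where a descent image lies within a stored terrain map.

    Toy version of the TRN matching step: slide the descent image over the
    map and keep the offset with the smallest sum of squared differences.
    """
    ih, iw = descent_image.shape
    mh, mw = onboard_map.shape
    best_score, best_pos = np.inf, (0, 0)
    for r in range(mh - ih + 1):
        for c in range(mw - iw + 1):
            patch = onboard_map[r:r + ih, c:c + iw]
            score = np.sum((patch - descent_image) ** 2)
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos  # (row, col) of the image's upper-left corner in the map

# Example: plant the "descent image" inside a synthetic map and recover it.
rng = np.random.default_rng(0)
terrain = rng.random((60, 60))
view = terrain[20:36, 30:46].copy()
print(locate_on_map(view, terrain))  # -> (20, 30)
```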

Francis said these autonomous functions are essential for the landing because, given the long distance between Earth and Mars, the spacecraft can only receive orders from mission engineers after a delay of 5 to 40 minutes, depending on the planets’ positions. This means that landing Perseverance safely in a hazardous landing zone required the Terrain Relative Navigation (TRN) system, because the rover could not rely on mission engineers executing delayed manual control commands.
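For a sense of why the delay is unavoidable, here is a minimal calculation of the light-travel time at the Earth–Mars distance extremes. The distance values are approximate figures assumed for the example, not taken from the article.

```python
# One-way and round-trip signal travel time between Earth and Mars at the
# approximate distance extremes (illustrative values).
SPEED_OF_LIGHT_KM_S = 299_792.458
closest_km = 54.6e6   # approximate minimum Earth-Mars distance
farthest_km = 401e6   # approximate maximum Earth-Mars distance

for label, d in (("closest", closest_km), ("farthest", farthest_km)):
    one_way_min = d / SPEED_OF_LIGHT_KM_S / 60
    print(f"{label}: one-way delay ~{one_way_min:.1f} min, "
          f"round trip ~{2 * one_way_min:.1f} min")
```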

He said: “If it realizes that it may come down in an unsafe place, it will automatically divert to a safe landing site as it decelerates from supersonic speed to zero.”
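The divert decision can be pictured as picking the least hazardous spot the lander can still reach. The toy sketch below assumes a simple grid of hazard scores and a fixed divert radius; the real system works under fuel and geometry constraints with a pre-built hazard map.

```python
import numpy as np

def choose_landing_cell(hazard_map: np.ndarray, current: tuple[int, int],
                        max_divert: int) -> tuple[int, int]:
    """Pick the lowest-hazard cell reachable within a divert radius.

    hazard_map: 2-D grid where higher values mean more dangerous terrain.
    current:    (row, col) the lander is currently heading toward.
    max_divert: maximum number of cells it can shift in any direction.
    """
    r0, c0 = current
    rows, cols = hazard_map.shape
    best_cell, best_hazard = current, hazard_map[r0, c0]
    for r in range(max(0, r0 - max_divert), min(rows, r0 + max_divert + 1)):
        for c in range(max(0, c0 - max_divert), min(cols, c0 + max_divert + 1)):
            if hazard_map[r, c] < best_hazard:
                best_cell, best_hazard = (r, c), hazard_map[r, c]
    return best_cell

# Example: divert away from a hazardous aim point toward the safest neighbor.
hmap = np.array([[0.9, 0.8, 0.2],
                 [0.7, 0.95, 0.3],
                 [0.6, 0.5, 0.1]])
print(choose_landing_cell(hmap, current=(1, 1), max_divert=1))  # -> (2, 2)
```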

This is similar to what NASA astronaut Neil Armstrong did during the Apollo 11 lunar landing on July 20, 1969. In that mission, Armstrong took manual control to land the world’s first crewed lunar module, Eagle, on the surface of the Moon, because the module’s automated system was guiding it toward a dangerous landing site.

Francis said: “Because there is no human at the controls, the artificial intelligence system allowed Perseverance to land in Jezero Crater, a site that was not considered safe for Curiosity. Perseverance’s landing system uses artificial intelligence technology to land safely.”

Application of Artificial Intelligence in Aiming Instruments

Artificial intelligence is also applied in Perseverance’s exploration of Mars through the Autonomous Exploration for Gathering Increased Science (AEGIS) system. AEGIS is intelligent targeting software that lets the rover aim and fire its SuperCam instrument on its own, without waiting for commands from mission engineers. Curiosity uses a ChemCam instrument with an earlier version of AEGIS; the updated version has been enhanced to work with the newer SuperCam.

Francis, the chief systems engineer behind the development of AEGIS, said: “The rover will start using it shortly after landing.”

He explained: “Curiosity’s ChemCam and Perseverance’s SuperCam are both laser spectrometers. They fire a powerful laser beam, usually at a rock within 7 meters of the rover, vaporizing part of the rock’s surface. The instrument then observes the resulting plasma to determine which elements the rock contains.”
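In outline, that analysis amounts to matching peaks in the plasma’s emission spectrum against known element lines. The sketch below uses a few illustrative, approximate reference wavelengths rather than a calibrated line library:

```python
# Minimal sketch of the idea behind laser-induced breakdown spectroscopy
# analysis: match peaks in the observed plasma spectrum to known element
# emission lines. The reference wavelengths are illustrative approximations.
REFERENCE_LINES_NM = {
    "Ca": [393.4, 396.8],
    "Na": [589.0, 589.6],
    "H":  [656.3],
}

def identify_elements(peak_wavelengths_nm, tolerance_nm=0.5):
    """Return the elements whose reference lines match observed peaks."""
    found = set()
    for peak in peak_wavelengths_nm:
        for element, lines in REFERENCE_LINES_NM.items():
            if any(abs(peak - line) <= tolerance_nm for line in lines):
                found.add(element)
    return sorted(found)

# Example: peaks extracted from a hypothetical spectrum.
print(identify_elements([393.5, 589.1, 656.2]))  # -> ['Ca', 'H', 'Na']
```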

These experiments are designed to help scientists understand the composition of volcanic rocks and, together with other measurements, determine how the rocks formed, where they came from, and other details.

Francis said: “Usually, we let scientists on Earth choose a particular rock for study. They select rocks of interest from the photos the rover sends back. But because the rover keeps moving and it takes a long time to transmit images back to Earth, the best rocks can be missed. With the onboard artificial intelligence system, we can let the rover choose the most suitable rock in its surroundings. Since exploration time on Mars is very precious, we usually let the artificial intelligence system make the choice.”
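A hypothetical, much-simplified version of that autonomous targeting step might filter candidate rocks by the instrument’s range and rank the rest with a weighted score. The features, weights, and names below are illustrative, not the flight criteria:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A rock target detected in a navigation image (toy representation)."""
    name: str
    distance_m: float   # distance from the rover
    size_score: float   # 0..1, how well its size matches the desired profile
    brightness: float   # 0..1, proxy for how distinct it is in the image

def select_target(candidates, max_range_m=7.0):
    """Filter out rocks the laser cannot reach, then rank by a simple score."""
    in_range = [c for c in candidates if c.distance_m <= max_range_m]
    if not in_range:
        return None
    return max(in_range, key=lambda c: 0.6 * c.size_score + 0.4 * c.brightness)

rocks = [
    Candidate("rock_a", 4.2, 0.8, 0.6),
    Candidate("rock_b", 9.5, 0.9, 0.9),   # beyond the 7 m laser range
    Candidate("rock_c", 6.1, 0.7, 0.8),
]
print(select_target(rocks))  # -> rock_c (rock_b is filtered out by range)
```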

Using artificial intelligence to improve autonomous navigation

Francis said that NASA’s Curiosity rover already uses an artificial intelligence autonomous navigation system, and that Perseverance’s navigation system is a major improvement over it.

He said: “We need to be able to travel faster and farther on Mars, and we use higher-performance computers to perform the calculations more quickly. On Curiosity, autonomous navigation meant stopping to take three-dimensional images, computing which features were obstacles and which were safe paths, and then driving along a safe path. But each drive segment was very short, only one or two meters.”
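That stop-compute-drive cycle can be sketched as a simple loop. In the example below, the perception step is simulated with a placeholder hazard check; the step length and structure are illustrative only.

```python
def stop_and_go_drive(goal_m: float, hazard_cycles: set[int],
                      step_m: float = 1.5) -> float:
    """Drive toward goal_m in short stop-compute-drive cycles.

    hazard_cycles marks the cycles where 'image processing' finds no safe
    path, standing in for real stereo perception and path planning.
    """
    driven, cycle = 0.0, 0
    while driven < goal_m:
        cycle += 1                  # rover stops and takes stereo images
        if cycle in hazard_cycles:  # planner finds no safe segment
            break
        driven += step_m            # drive the short segment, then repeat
    print(f"Stopped after {cycle} cycles, drove {driven:.1f} m")
    return driven

# Example: a hazard appears on the 6th cycle, halting the drive early.
stop_and_go_drive(goal_m=20.0, hazard_cycles={6})
```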

He added that these processes have now been significantly accelerated.

He said: “We have streamlined the algorithms on Perseverance and improved the overall capability so that it can drive continuously. We can take pictures and process the data while traveling, which lets us navigate autonomously faster and farther.”
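The “process while driving” idea can be illustrated by running perception in a background thread so the drive loop never has to stop and wait. The sketch below is a toy pipeline, not flight software; the timings and names are made up for the example.

```python
import queue
import threading
import time

def perception_worker(images, plans):
    """Pretend to turn each incoming frame into a drive plan."""
    while True:
        frame = images.get()
        if frame is None:                  # shutdown signal
            break
        time.sleep(0.05)                   # stand-in for terrain modelling
        plans.put(f"safe segment from frame {frame}")

def continuous_drive(num_segments: int = 5):
    images, plans = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=perception_worker, args=(images, plans))
    worker.start()
    images.put(0)                          # seed the pipeline with one frame
    for segment in range(1, num_segments + 1):
        images.put(segment)                # capture the next frame...
        plan = plans.get()                 # ...while using the previous plan
        print(f"Driving segment {segment} using: {plan}")
    images.put(None)
    worker.join()

continuous_drive()
```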

Francis said that all of these enhanced artificial intelligence functions and upcoming new features will make it easier for future probes to reach Mars and beyond.

He said: “Many people have seen autonomy and intelligent technology used to complete various tasks, such as scheduling operations on the International Space Station, so these technologies are not surprising. However, a spacecraft like the International Space Station is extremely complex, with a large number of systems and functions that have many different dependencies and must act in sync at certain points. Getting all of this done is very complicated, especially when something changes.”

He said: “This is where artificial intelligence plays an important role in helping the International Space Station complete its mission. We have done similar development for robotic missions, and I think this will be even more important in future missions, especially as they become more and more complex.”

These capabilities will also pay off because a spacecraft that keeps working without waiting for instructions from scientists becomes more capable and more productive.

Francis said: “Issuing commands from Earth to a probe on Mars is very inefficient, but it can be done without this kind of autonomy. The farther a probe travels into the solar system, though, the more the limits on communication time mean that either everything must be planned in advance or the spacecraft must have autonomous mechanisms. If a spacecraft is to respond quickly to unexpected events, it must have autonomous decision-making power. I think this will become more and more important.”
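One way to frame that decision is to compare how quickly an event must be handled with the round-trip command delay from Earth: if the event is faster than the round trip, the spacecraft needs onboard autonomy. The distances and reaction times below are illustrative assumptions.

```python
# If an event must be handled faster than a command round trip from Earth
# allows, the spacecraft needs onboard autonomy. Values are illustrative.
SPEED_OF_LIGHT_KM_S = 299_792.458

def needs_autonomy(distance_km: float, required_reaction_s: float) -> bool:
    round_trip_s = 2 * distance_km / SPEED_OF_LIGHT_KM_S
    return required_reaction_s < round_trip_s

# e.g. an event that must be handled within 10 minutes,
# with Mars roughly 225 million km away (an average figure).
print(needs_autonomy(225e6, required_reaction_s=600))  # -> True
# The same event for a spacecraft in low Earth orbit (~400 km away).
print(needs_autonomy(400, required_reaction_s=600))    # -> False
```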

He pointed out that exploring the outer reaches of the solar system, or completing space missions in harsh environments, will increasingly rely on autonomous artificial intelligence technology.

He said: “Part of the challenge is getting people to trust autonomous systems to make the right decisions for the spacecraft or to select the right scientific data. AEGIS is an example: the science team is very satisfied with it and uses it frequently because it provides them with good scientific data. So we must prove that autonomous systems can advance the science while being safe and effective for the spacecraft.”

In other space news, on February 20 HPE and Microsoft Azure sent powerful artificial intelligence, edge computing, and cloud computing tools to the International Space Station. The new artificial intelligence and other tools are part of ongoing technology experiments designed to prepare NASA for future crewed missions to Mars.

The new equipment and software include HPE’s Spaceborne Computer-2 (SBC-2), which marks the first time that a broad range of artificial intelligence and edge computing capabilities has been made available on the International Space Station.

The new hardware, software, and services were sent to the International Space Station aboard Northrop Grumman’s NG-15 spacecraft, which launched at 12:36 p.m. on February 20 from the Wallops Flight Facility on Wallops Island, Virginia, carrying supplies for the station.
