Currently, NASA uses wheeled rovers to traverse the surface of Mars and perform planetary science, but Texas A&M University researchers will test the viability of a new surface-exploration technology: walking robots.
The project is led by Feifei Qian, a WiSE Gabilan Assistant Professor at the University of Southern California Viterbi School of Engineering, and includes Ryan Ewing, Robert R. Berg Professor in the Department of Geology and Geophysics at Texas A&M, and Marion Nachon, an associate research scientist in geology and geophysics. The goal of the project is to develop and test walking, or "legged," robots that can more easily traverse frozen surfaces, crusted sand, and other difficult-to-navigate terrain, considerably improving scientists' ability to obtain data from planetary bodies.
While the Mars Exploration Rovers and other robots have been successfully sent into space, they typically operate on pre-programmed plans that require human scientists and engineers to provide detailed instructions about where to go and what to do before the robots arrive. As a result, when a robot encounters unexpected events or records intriguing measurements, its ability to adjust its strategy is limited. This can make it difficult for robots and rovers to explore unfamiliar settings and can cause them to miss scientific opportunities.
Ewing believes that a better understanding of how to integrate robotics technology with both planetary science and cognitive science will improve robot-assisted planetary exploration. The project seeks to test next-generation, high-mobility robots that can travel quickly across planetary surfaces and serve scientific exploration goals in a variety of ways.
"We will perform this research at two critical planetary analog locations that have well-defined gradients in soil types, ranging from crusted sand at White Sands Dune Field in New Mexico to ice-rock mixes at Mount Hood in Oregon," said Ewing. "Our goal is to examine the geotechnical features of these soils by combining high-mobility legged robots with integrated terrain-sensing technology and cognitive human decision models."
The project makes use of "bio-inspired" legged robots, which are designed to mimic the particular abilities that allow animals to move successfully over difficult terrain such as soft sand. Using the latest direct-drive actuator technology, these robots can "feel" the terrain (e.g., sand softness and rock shapes) through their legs. This ability enables legged robots to interact with their surroundings much as animals do, adjusting their movement as needed.
As Qian puts it, these robots are designed not only to replicate how animals look but also to capture what makes animals effective on diverse terrains. The capacity of these robots to "feel" the terrain with their legs also allows them to gather information about the environment as they move about and to modify their exploration strategies based on that knowledge.
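To make the idea concrete, the "feel and adapt" loop described above can be sketched in a few lines of code. This is purely illustrative: the function names, stiffness thresholds, and gait labels below are hypothetical assumptions for the sketch, not the team's actual controller. The idea is that a leg's measured force and penetration depth give a rough ground-stiffness estimate, which the robot can then use to adjust its gait.

```python
# Illustrative sketch only: a minimal proprioceptive terrain-sensing loop.
# All names and thresholds here are hypothetical, not the project's controller.

def estimate_stiffness(force_n: float, deflection_m: float) -> float:
    """Estimate ground stiffness (N/m) from a leg's force and penetration depth."""
    if deflection_m <= 0:
        raise ValueError("deflection must be positive")
    return force_n / deflection_m

def choose_gait(stiffness: float) -> str:
    """Pick a gait based on how firm the substrate feels underfoot."""
    if stiffness < 2_000:      # very soft, e.g. loose dry sand
        return "slow crawl"
    elif stiffness < 20_000:   # intermediate, e.g. crusted sand
        return "careful walk"
    return "trot"              # firm ground, e.g. ice-rock mix

# Example: a 50 N footfall sinking 5 cm suggests very soft ground.
print(choose_gait(estimate_stiffness(50.0, 0.05)))  # -> slow crawl
```

In a real robot this decision would run continuously at every footfall, which is what lets a legged platform change strategy mid-traverse instead of following a fixed plan.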
According to Ewing, "We'll be working to determine how surface crusts, rock-covered soils, and ice content affect the friction and erodibility of different soils. The direct-drive legged robots will be sent to record soil strength at two locations that resemble terrain on the Moon, Mars, and other worlds. At the same time, we will monitor environmental factors that influence soil strength, such as particle size and shape, soil moisture, chemical composition, and ice content."
As scientists strive to investigate planetary environments, Qian emphasizes the importance of deploying robots and rovers on early missions to gather data before sending humans.
Even in situations where it is safe to deploy astronauts, mobile robots can carry scientific instruments and help take precise measurements while moving about, according to Qian.
Scientists from the University of Pennsylvania, Georgia Institute of Technology, and NASA's Johnson Space Center are also part of the study team.
According to Qian, this is a dream team and a once-in-a-lifetime opportunity to bring all the components together in one project.