Humans Have a Hidden “Seventh Sense” That Lets Them Feel Objects Without Touching Them
A team of researchers from Queen Mary University of London and University College London has demonstrated that humans can detect buried objects in sand without direct contact, relying solely on subtle vibrations and pressure shifts. The discovery reveals a previously undocumented sensory capacity—referred to as remote touch—which challenges conventional scientific understanding of the tactile system.
The study, presented at the 2025 IEEE International Conference on Development and Learning, draws inspiration from shorebirds like sandpipers and plovers. These birds use their beaks to detect prey hidden under sediment by interpreting mechanical cues through their environment. In humans, this sensory capability had never before been observed or quantified—until now.
Using a controlled experimental setup with fine sand and concealed objects, researchers measured how effectively human fingertips could identify the location of buried shapes without making physical contact. The implications go far beyond biological curiosity. This newly revealed perceptual range could have significant applications in robotics, archaeology, and planetary exploration, especially in environments where vision or direct access is limited.
“It’s the first time that remote touch has been studied in humans and it changes our conception of the perceptual world—what is called the receptive field—in living beings, including humans,” said Dr. Elisabetta Versace, Senior Lecturer in Psychology at Queen Mary and lead author of the human study.
Study Reveals Tactile Detection up to 7 Centimeters Without Contact
In the human trial, twelve participants were instructed to move their fingertips gently across the surface of sand to detect a small hidden cube, without pressing deeply or making direct contact. Remarkably, participants correctly located the object in 70.7% of trials, at an average distance of 6.9 centimeters (2.72 inches) and a median of 2.7 centimeters, according to the study published in IEEE Xplore.
These findings aligned closely with physical models based on granular media particle interaction theory, which suggest that tactile cues can extend up to 7 centimeters through reflected displacements in the surrounding material. “This sensitivity approaches the theoretical physical threshold of what can be detected from mechanical ‘reflections’ in granular material,” the researchers wrote.
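The reported ~7 centimeter limit can be pictured with a toy calculation: a displacement reflected from a buried object weakens with distance through the sand until it drops below what a fingertip can feel. The power-law decay and the threshold below are illustrative assumptions, not the authors' granular-media model.

```python
# Toy sketch: how far could a reflected displacement remain detectable?
# All constants here are illustrative assumptions, not values from the study.

def reflected_amplitude(distance_cm, source_amplitude=1.0, decay_exponent=2.0):
    """Assume the displacement falls off as a power law with distance in sand."""
    return source_amplitude / (distance_cm ** decay_exponent)

def max_detection_distance(threshold=0.02, step=0.1):
    """Largest distance (cm) at which the signal still exceeds the threshold."""
    d = step
    while reflected_amplitude(d) >= threshold:
        d += step
    return round(d - step, 1)

print(max_detection_distance())  # → 7.0 cm with these made-up constants
```

With a sharper decay or a less sensitive receptor, the same calculation yields a much shorter range, which is why the measured human performance sitting near the theoretical limit is notable.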
As noted by Queen Mary University, this suggests human touch is far more mechanically sensitive than previously assumed, capable of decoding small environmental disturbances—much like the beak-based systems found in birds.
Robots Trained to Mimic Human Sense Fell Short on Precision
To explore how this phenomenon might inform robotic sensing, the team conducted a parallel experiment using a robotic tactile sensor equipped with a Long Short-Term Memory (LSTM) algorithm. The robot was trained on tactile patterns and tested on the same task: detecting hidden cubes within sand based on mechanical cues alone.
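An LSTM processes a stream of sensor readings one step at a time, carrying forward a memory of earlier vibrations. The study's actual architecture, sizes, and training procedure are not described here; the following is a minimal sketch of a single LSTM cell stepping through a hypothetical tactile time series, with untrained random weights.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: x is the current sensor reading, (h, c) the carried state."""
    z = W @ x + U @ h + b                      # all four gates computed at once
    n = h.size
    i = sigmoid(z[:n])                         # input gate
    f = sigmoid(z[n:2*n])                      # forget gate
    o = sigmoid(z[2*n:3*n])                    # output gate
    g = np.tanh(z[3*n:])                       # candidate cell update
    c = f * c + i * g                          # update long-term memory
    h = o * np.tanh(c)                         # new hidden state
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 3, 8                             # e.g. 3 hypothetical pressure channels
W = rng.normal(size=(4 * n_hid, n_in)) * 0.1   # untrained weights, for the sketch only
U = rng.normal(size=(4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(50, n_in)):          # 50 time steps of simulated sensor data
    h, c = lstm_step(x, h, c, W, U, b)

print(h.shape)                                 # (8,): a summary of the whole sequence
```

In a trained system, the final hidden state would feed a small classifier that decides whether a buried object is present, which is the detection task described above.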
The robot detected objects from an average distance of 7.1 centimeters, slightly farther than the human participants. However, its precision dropped to just 40%, and it frequently generated false positives. The disparity in performance, despite the robot’s greater range, highlights a key difference in how biological and artificial systems filter environmental noise.
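Precision here presumably means the fraction of the robot's reported detections that corresponded to real objects. A quick worked example, with hypothetical counts, shows what a 40% figure implies:

```python
def precision(true_positives, false_positives):
    """Fraction of reported detections that are real objects."""
    return true_positives / (true_positives + false_positives)

# Hypothetical counts: 40% precision means that for every 2 genuine
# detections, the robot also raised 3 false alarms.
print(precision(true_positives=2, false_positives=3))  # → 0.4
```

By this measure, most of the robot's alerts were false alarms even though its raw sensing range exceeded the humans'.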
“What makes this research especially exciting is how the human and robotic studies informed each other,” said Dr. Lorenzo Jamone, Associate Professor in Robotics and AI at UCL. “The human experiments guided the robot’s learning approach, and the robot’s performance provided new perspectives for interpreting the human data.”
Implications for Planetary Exploration and Assistive Robotics
The research team envisions a range of future applications for remote touch sensing—particularly in environments where vision is compromised, such as underground or extraterrestrial terrains. Robots with remote-touch capabilities could perform non-invasive archaeological digs, or explore Martian soil without relying on direct contact or visual identification.
“The discovery opens possibilities for designing tools and assistive technologies that extend human tactile perception,” said Zhengqi Chen, PhD student in the Advanced Robotics Lab at Queen Mary. “These insights could inform the development of advanced robots capable of delicate operations—for example, locating archaeological artifacts without damage, or exploring sandy or granular terrains such as Martian soil or ocean floors.”
The broader value lies in how this sensory mechanism could transform the design of robotic systems, particularly those used in search-and-rescue, planetary landers, and even prosthetic devices. By embedding subtle tactile algorithms, machines may one day navigate complex terrains without relying solely on cameras or contact sensors.
Redefining the Limits of Perception—And the Machines We Build From It
This study forces a fundamental rethinking of how sensory boundaries are defined in humans. Touch has long been classified as a proximal sense, tied exclusively to direct physical interaction. But the emergence of remote touch reframes that boundary, revealing that perception may begin at a distance, mediated by mechanical displacements rather than direct contact.
The results also present an intriguing paradox. Despite the rapid advancement of AI-driven tactile systems, human fingertips still outperform machines in interpreting fine, context-specific sensory data in dynamic environments. That performance gap—biological versus artificial—is now measurable, and it sets a new benchmark for robotic sensitivity.