Scheduled to be presented at the Conference on Robot Learning (CoRL 2024) in Munich, Germany, SonicSense enables robots to interact with objects using sound-based feedback. The system features a robotic hand equipped with contact microphones embedded in its fingertips. These microphones detect vibrations when the robot taps, grasps, or shakes an object, allowing the robot to tune out background noise and focus on the specific item it's handling.
"Robots today mostly rely on vision to interpret the world," said Jiaxun Liu, lead author of the study and a Ph.D. student at Duke. "We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to 'feel' and understand the world."
SonicSense uses the collected vibration data to analyze the object's material and shape. If the system has never encountered the object before, it may take up to 20 interactions to identify it. However, for objects stored in its database, it can make accurate identifications in as few as four interactions.
"SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects," said Boyuan Chen, professor of mechanical engineering and materials science at Duke and supervisor of the research.
The researchers demonstrated SonicSense's capabilities by performing tasks such as counting dice in a box, determining the liquid level in a bottle, and building a 3D model of an object's shape and material through taps. The system's combination of multiple fingers, contact microphones, and AI techniques enables it to outperform previous methods, especially with objects that have complex surfaces or are made from multiple materials.
A critical advantage of SonicSense is its affordability. Built from 3D-printed parts and the same contact microphones commonly used by musicians, the system costs just over $200 to construct.
Looking ahead, the research team aims to improve SonicSense by integrating object-tracking algorithms, enabling robots to handle cluttered environments. Future iterations of the system will also explore advanced robotic hands with enhanced dexterity, making robots capable of performing more nuanced tasks.
"This is only the beginning," added Chen. "We're excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions."
Research Report: SonicSense: Object Perception from In-Hand Acoustic Vibration
Related Links
Duke University