26 June 2024

SonicSense: Object Perception from In-Hand Acoustic Vibration

Conference on Robot Learning (CoRL 2024)

Jiaxun Liu, Duke University (www.jiaxunliu.com)
Boyuan Chen, Duke University (boyuanchen.com)

Overview

We introduce SonicSense, a holistic design of hardware and software that enables rich robot object perception through in-hand acoustic vibration sensing. While previous studies have shown promising results with acoustic sensing for object perception, existing solutions are constrained to a handful of objects with simple geometries and homogeneous materials, to single-finger sensing, and to evaluation protocols that train and test on the same objects. SonicSense enables container inventory status differentiation, heterogeneous material prediction, 3D shape reconstruction, and object re-identification on a diverse set of 83 real-world objects. Our system employs a simple but effective heuristic exploration policy to interact with objects, together with end-to-end learning-based algorithms that fuse vibration signals to infer object properties. Our framework underscores the significance of in-hand acoustic vibration sensing in advancing robot tactile perception.
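To give a flavor of the fusion idea described above, the sketch below shows a toy pipeline: spectral features are extracted from each finger's tap recording, averaged into a single fused descriptor, and matched against stored object descriptors for re-identification. This is an illustrative stand-in only; the function names, the binned-spectrum features, and the nearest-centroid matcher are our assumptions, not the paper's learned end-to-end models.

```python
import numpy as np


def tap_features(signal, n_bins=32):
    """Coarse log-magnitude spectrum of one tap recording.
    Illustrative only -- not the authors' feature pipeline."""
    log_spec = np.log1p(np.abs(np.fft.rfft(signal)))
    # pool the spectrum into n_bins coarse frequency bands
    return np.array([c.mean() for c in np.array_split(log_spec, n_bins)])


def fuse_taps(signals):
    """Fuse features from several taps by averaging: a simple
    hand-crafted stand-in for the paper's learned fusion network."""
    return np.mean([tap_features(s) for s in signals], axis=0)


def reidentify(query_taps, object_db):
    """Nearest-centroid object re-identification over fused features."""
    q = fuse_taps(query_taps)
    names = list(object_db)
    dists = [np.linalg.norm(q - object_db[n]) for n in names]
    return names[int(np.argmin(dists))]
```

As a usage example, one can synthesize decaying sinusoids with different resonant frequencies as surrogate tap sounds, store their fused descriptors per object, and re-identify a new recording by nearest centroid. A learned model replaces both the features and the matcher in the actual system.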

Video (Click to YouTube)


Paper

Check out our paper linked here.

Codebase

Check out our codebase at https://github.com/generalroboticslab/SonicSense

Citation

@inproceedings{liu2024sonicsense,
  title={SonicSense: Object Perception from In-Hand Acoustic Vibration},
  author={Jiaxun Liu and Boyuan Chen},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024},
  url={https://openreview.net/forum?id=CpXiqz6qf4}
}

Acknowledgment

This work is supported by the ARL STRONG program under awards W911NF2320182 and W911NF2220113, by the DARPA FoundSci program under award HR00112490372, and by the DARPA TIAMAT program under award HR00112490419.

Contact

If you have any questions, please feel free to contact Jiaxun Liu.

Categories

Multimodal Perception, Robot Learning