Building autonomous machines that sense, adapt, and connect to advance embodied intelligence, drawing inspiration from natural intelligence in humans and animals. We take a "full-stack" approach to study both the "body" and "brain" of machines.
We focus on multimodal perception systems that enable machines to understand complex environments. Combining modalities such as vision, sound, vibration, smell, and touch yields a more complete understanding of the world and a more robust perception system.
We explore how machines can learn and evolve in response to changing environments and tasks. Example work includes the study of self-aware robots, computational design of robots, and robot policy transfer and sharing.
Our research investigates how machines can effectively communicate and collaborate with humans and other machines. We develop techniques for natural interaction, knowledge sharing, and coordinated action in mixed human-machine teams.
We are excited to leverage AI and robotics to accelerate scientific discovery and enable applications across broad domains such as healthcare and materials science.
Our research is sometimes best described by brief videos and animations; here are some of them. If you are interested in more details, please see our full publication list. Click the list below to view more videos.