Yixuan Huang

PhD Student - Computer Science

Email: yixuan.huang@utah.edu
Office: MEB 2160

[ Curriculum Vitae ]

Hi there, I'm Yixuan Huang!

I'm a second-year Ph.D. student in Computer Science at the University of Utah. I am fortunate to be advised by Prof. Tucker Hermans. I am currently working on Graph Neural Networks for robot manipulation tasks. I also work closely with Prof. Alan Kuntz on medical robots. Before coming to Utah, I spent two years at UC San Diego working on sample-efficient reinforcement learning and safe autonomous driving, advised by Prof. Sicun Gao. I received my bachelor's degree in Computer Science and Engineering from Northeastern University (China) in 2020.

Stacking and unstacking objects are common tasks in daily life; for example, organizing a kitchen or a bookshelf involves many such operations. However, these tasks are challenging for robots to perform autonomously because of the difficulty of modeling the non-trivial contact dynamics and support relations among the stacked objects. In this work, we leverage Graph Neural Networks (GNNs) to reason about object interactions in stacking and unstacking tasks, and we show that our approach generalizes both to a variable number of objects and to objects of different sizes, thanks to the relational inductive bias of GNNs. To our knowledge, we are the first to learn graph latent dynamics for block stacking and unstacking tasks and to plan with multiple objects, multiple action primitives, and multiple goal relations.
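To illustrate why a GNN transfers across object counts, here is a minimal sketch of one round of message passing over a fully connected object graph. Everything here (function names, dimensions, the plain-NumPy MLPs) is an illustrative assumption, not the actual architecture used in this work:

```python
# Hedged sketch: one message-passing layer over an object graph.
# Names and dimensions are illustrative, not the paper's model.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def message_passing_step(node_feats, W_msg, W_upd):
    """One GNN layer: each object node aggregates messages from all
    other nodes, then updates its own latent state.

    node_feats: (N, D) per-object features (e.g. pose, size).
    W_msg:      (2*D, D) weights of the edge/message function.
    W_upd:      (2*D, D) weights of the node-update function.
    """
    n, d = node_feats.shape
    messages = np.zeros((n, d))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Message from sender j to receiver i.
            pair = np.concatenate([node_feats[i], node_feats[j]])
            messages[i] += relu(pair @ W_msg)
    # Update each node from its own features plus aggregated messages.
    upd_in = np.concatenate([node_feats, messages], axis=1)
    return relu(upd_in @ W_upd)

# The same edge and node weights are shared across all nodes, so the
# layer applies unchanged to any number of objects -- the relational
# inductive bias behind the generalization claim above.
rng = np.random.default_rng(0)
D = 8
W_msg = rng.standard_normal((2 * D, D)) * 0.1
W_upd = rng.standard_normal((2 * D, D)) * 0.1
for num_objects in (3, 5):
    feats = rng.standard_normal((num_objects, D))
    out = message_passing_step(feats, W_msg, W_upd)
    print(out.shape)
```

Because the weights are tied across nodes and edges rather than to a fixed scene size, the same trained layer can be run on scenes with more (or fewer) blocks than were seen in training.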

Project page

Tendon-driven robots, a type of continuum robot, have the potential to reduce the invasiveness of surgery by enabling access to difficult-to-reach anatomical targets. In the future, the automation of surgical tasks for these robots may help reduce surgeon strain in the face of a rapidly growing patient population. However, directly encoding surgical tasks and their associated context for these robots is infeasible. In this work, we take steps toward a system that learns to successfully perform context-dependent surgical tasks directly from a set of expert demonstrations. We present three models trained on the demonstrations, each conditioned on a vector encoding the context of the demonstration. We then use these models to plan and execute motions for the tendon-driven robot, similar to the demonstrations, for novel contexts not seen in the training set. We demonstrate the efficacy of our method on three surgery-inspired tasks.
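The core idea of conditioning a learned model on a context vector can be sketched as follows. This is a toy sketch under stated assumptions (a tiny NumPy MLP, made-up dimensions, a hypothetical `rollout` helper); it is not one of the three models presented in the paper:

```python
# Hedged sketch: a context-conditioned step model for learning from
# demonstration. All names and dimensions here are illustrative.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class ContextConditionedPolicy:
    """Tiny two-layer MLP whose input concatenates the robot state
    (e.g. a tendon configuration) with a context vector (e.g. an
    encoding of the anatomical scene)."""

    def __init__(self, state_dim, context_dim, hidden, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.standard_normal((state_dim + context_dim, hidden)) * 0.1
        self.W2 = rng.standard_normal((hidden, out_dim)) * 0.1

    def __call__(self, state, context):
        # Conditioning by concatenation: the same weights serve every
        # context, so a new context vector yields a new behavior.
        x = np.concatenate([state, context])
        return relu(x @ self.W1) @ self.W2

    def rollout(self, state, context, steps):
        """Plan a short motion by iterating the learned step model
        under a fixed context."""
        traj = [state]
        for _ in range(steps):
            state = self(state, context)
            traj.append(state)
        return np.stack(traj)

policy = ContextConditionedPolicy(state_dim=4, context_dim=3, hidden=16, out_dim=4)
traj = policy.rollout(np.zeros(4), np.ones(3), steps=5)
print(traj.shape)
```

Training such a model on (state, context, next-state) tuples from expert demonstrations lets it produce demonstration-like motions for context vectors it has never seen, which is the generalization this project targets.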


- "Toward Learning Context-Dependent Tasks from Demonstration for Tendon-Driven Surgical Robots", Yixuan Huang, Michael Bentley, Tucker Hermans, and Alan Kuntz, International Symposium on Medical Robotics (ISMR), 2021. Best Paper Award Finalist, Best Student Paper Award Finalist

Project page