~~NOCACHE~~
~~NOTOC~~
====== Research Projects ======
===== In-Hand Manipulation =====
{{project:in_grasp.jpeg?nolink&200}}
We investigate the problem of a robot autonomously moving an object relative to its own hand. This project focuses on multi-fingered, in-hand manipulation of novel objects. We validate our approaches on objects from the YCB dataset using the Allegro robotic hand, and introduce benchmarking protocols to enable comparison with other methods.
[[project:in_hand_manipulation|Project page]]
===== Multi-fingered Grasp Planning =====
{{:grasp_mustard.png?400|}}
We propose a novel approach to multi-fingered grasp planning leveraging learned deep neural network models. We train a convolutional neural network to predict grasp success as a function of both visual information of an object and the grasp configuration. We can then formulate grasp planning as inferring the grasp configuration which maximizes the probability of grasp success. Our work is the first to directly plan high-quality multi-fingered grasps in configuration space using a deep neural network, without the need for an external planner. We validate our inference method by performing both multi-finger and two-finger grasps on real robots.
* [[project:grasp_inference|Grasp Inference Project page]]
* [[project:grasp_type|Grasp Learning with Grasp Types]]
* [[https://sites.google.com/view/reconstruction-grasp/|Learning Continuous 3D Reconstructions for Geometrically Aware Grasping]]
* [[https://arxiv.org/abs/2006.05264|Multi-Fingered Active Grasp Learning]]
* [[https://arxiv.org/abs/2001.09242|Multi-Fingered Grasp Planning via Inference in Deep Neural Networks]]
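The planning-as-inference idea above can be sketched in a few lines: hold the trained network's weights and the object's visual features fixed, and ascend the predicted success probability with respect to the grasp configuration. The sketch below is illustrative only — `predict_success` is a hypothetical stand-in for the trained CNN (a smooth bump peaked at a made-up "good" configuration), and the finite-difference gradient is one simple choice, not the method used in the papers.

```python
import numpy as np

# Hypothetical stand-in for a trained success predictor p(success | phi, g):
# a logistic bump peaked at a fabricated "good" grasp configuration.
# In the real system this would be a trained CNN over image + grasp config.
GOOD_GRASP = np.array([0.3, -0.1, 0.5])  # hypothetical optimum, for illustration

def predict_success(visual_feat, grasp_config):
    """Toy surrogate for grasp-success probability (not the papers' model)."""
    d = grasp_config - GOOD_GRASP
    return 1.0 / (1.0 + np.exp(d @ d - 1.0))

def plan_grasp(visual_feat, g_init, lr=0.1, iters=200, eps=1e-5):
    """Grasp planning as inference: gradient ascent on predicted success
    w.r.t. the grasp configuration, with network weights held fixed.
    Finite differences are used so any black-box predictor works here."""
    g = g_init.copy()
    for _ in range(iters):
        grad = np.zeros_like(g)
        for i in range(len(g)):
            g_hi, g_lo = g.copy(), g.copy()
            g_hi[i] += eps
            g_lo[i] -= eps
            grad[i] = (predict_success(visual_feat, g_hi)
                       - predict_success(visual_feat, g_lo)) / (2 * eps)
        g += lr * grad  # ascend the success probability
    return g

g0 = np.zeros(3)          # initial grasp configuration
g_star = plan_grasp(None, g0)
```

In the actual work the gradient comes from backpropagation through the network rather than finite differences, but the structure — optimizing the input configuration against a frozen learned model — is the same.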
===== Tactile Perception and Manipulation =====
{{::project:tactile-mirror.jpg?175&nolink|}}
We are interested in enabling robots to use the sense of touch (tactile sensing) to improve both their perception and manipulation of the world.
Publications:
  * [[http://www.cs.utah.edu/~thermans/papers/yi-iros2016-gp-active-touch.pdf|"Active Tactile Object Exploration with Gaussian Processes"]]
* [[http://www.cs.utah.edu/~thermans/papers/hoelscher_ichr2015.pdf|"Evaluation of Tactile Feature Extraction for Interactive Object Recognition"]]
* [[http://www.cs.utah.edu/~thermans/papers/veiga-iros2015-slip-control.pdf|"Stabilizing Novel Objects by Learning to Predict Tactile Slip"]]
* [[http://www.cs.utah.edu/~thermans/papers/vanhoof-ichr2015-in-hand-learning.pdf|"Learning Robot In-Hand Manipulation with Tactile Features"]]
===== Magnetic Manipulation =====
{{::project:katieintestinestest1.png?175&nolink|}}
In cooperation with [[https://www.telerobotics.utah.edu/index.php/People/JakeAbbott|Dr. Jake Abbott]] and the [[https://www.telerobotics.utah.edu/index.php|Utah TeleRobotics Lab]], we investigate how advances in autonomous robot manipulation can be applied to the domain of magnetic manipulation for medical devices, micro-robots, and other applications.
* [[http://www.cs.utah.edu/~thermans/papers/Popek_ICRA_2017-capsule-closed-loop-control.pdf|"First Demonstration of Simultaneous Localization and Propulsion of a Magnetic Capsule in a Lumen using a Single Rotating Magnet"]]
===== Manipulation of Object Collections =====
{{::project:multi-push-good0.jpg?175&nolink}}
{{::project:multi-push-good1.jpg?175&nolink}}
The research goal of this project is to enable robots to manipulate and reason about groups of objects en masse. Our hypothesis is that treating object collections as single entities enables data-efficient, self-supervised learning of contact locations for pushing and grasping grouped objects. The project investigates a novel neural network architecture for self-supervised manipulation learning: a convolutional neural network takes as input sensory data and a robot hand configuration, and learns to predict a manipulation quality score for those inputs. When presented with a novel scene, the robot performs manipulation inference by fixing the current sensory data and directly optimizing the predicted manipulation score over different hand configurations. The project also supports the development of experimental protocols and the collection of associated data for dissemination, to stimulate research activity in manipulation of object collections.
[[https://matwilso.github.io/projects/object_collections|Learning to Manipulate Object Collections]]
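The "optimize the predicted score over hand configurations" step can also be done without gradients, by sampling. The sketch below uses a cross-entropy-method search as one illustrative choice — `quality_score` is a hypothetical stand-in for the learned quality model (a peaked quadratic around a made-up best configuration), and all names and dimensions here are assumptions for illustration, not the project's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the learned model mapping (scene features,
# hand configuration) -> manipulation quality score. The real project
# uses a trained CNN; this quadratic is for illustration only.
BEST_CONFIG = np.array([0.2, 0.4])  # fabricated best hand configuration

def quality_score(scene_feat, hand_config):
    """Toy quality model; higher is better (peaked at BEST_CONFIG)."""
    return -np.sum((hand_config - BEST_CONFIG) ** 2, axis=-1)

def infer_hand_config(scene_feat, dim=2, pop=64, elites=8, iters=20):
    """Cross-entropy-method search over hand configurations:
    sample candidates from a Gaussian, score them with the learned
    model, refit the Gaussian to the top scorers, and repeat."""
    mean, std = np.zeros(dim), np.ones(dim)
    for _ in range(iters):
        samples = mean + std * rng.standard_normal((pop, dim))
        scores = quality_score(scene_feat, samples)
        top = samples[np.argsort(scores)[-elites:]]   # keep the elites
        mean, std = top.mean(axis=0), top.std(axis=0) + 1e-6
    return mean

config = infer_hand_config(scene_feat=None)
```

Sampling-based optimizers like this are a common choice when the score model is treated as a black box; gradient-based refinement through the network is the natural alternative when gradients are available.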
===== Non-prehensile Manipulation =====
{{::project:composite-push-vector-square.png?175&nolink |}}
While grasping is the dominant form of manipulation for robots, pushing provides an interesting alternative in many contexts. In particular, when interacting with previously unseen objects, pushing is less likely than grasping to cause dramatic failures such as dropping or breaking the object.
* [[http://www.cs.utah.edu/~thermans/papers/hermans-ichr2013.pdf|"Learning Contact Locations for Pushing and Orienting Unknown Objects"]]
* [[http://www.cs.utah.edu/~thermans/papers/hermans-icra2013.pdf|"Decoupling Behavior, Perception, and Control for Autonomous Learning of Affordances"]]
* [[http://www.cs.utah.edu/~thermans/papers/hermans-iros2012.pdf|"Guided Pushing for Object Singulation"]]
* [[http://www.cs.utah.edu/~thermans/papers/cosgun-iros2011.pdf|"Push Planning for Object Placement on Cluttered Table Surfaces"]]