~~NOTOC~~ ~~NOCACHE~~
====== In-Hand Manipulation ======

===== Summary =====
Solving the general in-hand manipulation problem with real-world robotic hands requires a variety of manipulation skills. We focus on one task that can be solved using in-hand manipulation: in-hand object reposing. We explore methods to repose an object relative to the palm without dropping it.

===== Publications =====
  - {{:project:sundaralingam_rss_2017-in-grasp-opt.pdf|"Relaxed-Rigidity Constraints: In-Grasp Manipulation using Purely Kinematic Trajectory Optimization"}}, Balakumar Sundaralingam and Tucker Hermans, [[http://roboticsconference.org|Robotics: Science and Systems]] (RSS), 2017.
  - {{:project:sundaralingam_grasp_gaits.pdf|"Geometric In-Hand Regrasp Planning: Alternating Optimization of Finger Gaits and In-Grasp Manipulation"}}, Balakumar Sundaralingam and Tucker Hermans, ICRA, 2018.
  - {{:project:sundaralingam_auro_2018_in-grasp.pdf|"Relaxed-Rigidity Constraints: Kinematic Trajectory Optimization and Collision Avoidance for In-Grasp Manipulation"}}, Balakumar Sundaralingam and Tucker Hermans, Autonomous Robots, 2019.

===== Source code & Data =====
Please cite our RSS paper when using the source code or data. Here is the corresponding BibTeX entry:
<code>
@INPROCEEDINGS{Sundaralingam-RSS-17,
  AUTHOR    = {Balakumar Sundaralingam AND Tucker Hermans},
  TITLE     = {Relaxed-Rigidity Constraints: In-Grasp Manipulation using Purely Kinematic Trajectory Optimization},
  BOOKTITLE = {Proceedings of Robotics: Science and Systems},
  YEAR      = {2017},
  ADDRESS   = {Cambridge, Massachusetts},
  MONTH     = {July},
  DOI       = {10.15607/RSS.2017.XIII.015}
}
</code>
  - Source code: [[https://bitbucket.org/robot-learning/relaxed_rigidity_in_grasp/overview|git]]
  - Dataset download link: [[http://robot-learning.cs.utah.edu/data_archive/in_grasp_rss2017.tar.gz|tar]]
The dataset associated with this work consists of [[http://wiki.ros.org/rosbag|bag]] files containing the following topics:

^ Topic ^ Description ^
| /tf | Rigid-body transformations between the palm, object, and camera frames |
| /allegro_hand_right/joint_states | The current state of the robot joints |
| /allegro_hand_right/joint_cmd | The commanded joint angles of the robot |
| /camera/rgb/camera_info | The Asus Xtion Pro camera's meta information |
| /camera/rgb/image_raw | RGB video of the experiment |

The dataset has the following internal structure:
<code>
/bag_files
    /pc.tar.gz
    /relaxed-position.tar.gz
    /relaxed-position-orientation.tar.gz
    /relaxed-rigidity.tar.gz
/trajectories
    /pc.tar.bz2
    /relaxed-position.tar.bz2
    /relaxed-position-orientation.tar.bz2
    /relaxed-rigidity.tar.bz2
    /ik-rigid.tar.bz2
    /*.csv
</code>

To run a trajectory, use ''relaxed_rigidity_in_grasp/scripts/run_trajectory.py'' from the source code repository. Each .csv file in the trajectories folder contains information about the experiments. The data can be read using the ''TrajList'' class in ''relaxed_rigidity_in_grasp/scripts/object_list.py''; an example is in the ''run_trajectory.py'' script.

The names of the bag files are formatted as follows:
<code>
[object_name]_[traj_number]_[trial_number].bag
</code>
The bag file names also appear in the csv files to allow for easy matching.

===== Video =====
{{ youtube>Gn-yMRjbmPE?640x360&rel=0 }}

===== Experiment Protocol =====
Objects from the [[http://www.ycbbenchmarks.com/|YCB]] dataset were chosen to test our planner. Trajectories are run on an Allegro hand. We quantify the position error and orientation error between the desired pose from the planner and the pose reached in our experiments. To account for variance and validate robustness, five trials are run for each trajectory.

{{ :project:objects.png?nolink&400 |}}

===== Results =====
/*Convergence plot: {{:project:in_grasp_convergence_err.png?nolink&500|}}*/
Errors from running the generated trajectories on the Allegro hand:
{{ :project:in_grasp_position_err.png?nolink&750 |}}
{{ :project:in_grasp_orientation_err.png?nolink&750 |}}
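The bag-file naming scheme described above can be parsed with a few lines of Python. This is a minimal sketch (the function name and the example object name ''sugar_box'' are illustrative, not taken from the dataset); it splits the two trailing numeric fields off from the right, since object names may themselves contain underscores:

```python
def parse_bag_name(filename):
    """Split '[object_name]_[traj_number]_[trial_number].bag' into its parts.

    Object names may contain underscores, so the two trailing numeric
    fields are split off from the right with rsplit.
    """
    stem = filename[:-len(".bag")] if filename.endswith(".bag") else filename
    object_name, traj_number, trial_number = stem.rsplit("_", 2)
    return object_name, int(traj_number), int(trial_number)
```

The returned tuple can then be matched against the rows of the .csv files in the trajectories folder.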
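The position and orientation errors reported above compare a desired pose against a reached pose. A minimal sketch using only the Python standard library (function names are illustrative, and the exact metrics used in the paper may differ): position error as Euclidean distance, orientation error as the smallest rotation angle between two unit quaternions.

```python
import math

def position_error(p_desired, p_reached):
    """Euclidean distance between desired and reached positions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p_desired, p_reached)))

def orientation_error(q_desired, q_reached):
    """Smallest rotation angle (radians) between two unit quaternions (w, x, y, z).

    The absolute value of the dot product handles the double cover of
    rotations (q and -q represent the same orientation).
    """
    dot = abs(sum(a * b for a, b in zip(q_desired, q_reached)))
    return 2.0 * math.acos(min(1.0, dot))
```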