Title: Multimodal Haptic Perception during Incidental Contact
Date: Wednesday, September 21st, 2016
Time: 10:30 AM to 12:30 PM (EDT)
Location: MiRC 102B
Robotics Ph.D. Student
Department of Biomedical Engineering
Georgia Institute of Technology & Emory University
Committee:
Dr. Charles C. Kemp (Advisor), Biomedical Engineering, Georgia Institute of Technology & Emory University
Dr. James M. Rehg, School of Interactive Computing, Georgia Institute of Technology
Dr. Lena H. Ting, Biomedical Engineering, Georgia Institute of Technology & Emory University
Dr. C. Karen Liu, School of Interactive Computing, Georgia Institute of Technology
Dr. Henrik I. Christensen, Computer Science and Engineering, University of California San Diego
Robotics research has often focused on avoiding contact with the environment, except at well-modeled locations. Most haptics research has used information from contact during deliberate exploration, where the actions are optimized for sensing. However, during manipulation in unstructured environments, many opportunities arise to exploit incidental contact. By incidental contact, we mean contact that is not central to the robot’s current actions and may occur unexpectedly or unintentionally. We are developing methods for haptic perception that use force, motion, thermal, and visual sensing during incidental contact. We plan to evaluate our perception methods during reaching tasks in cluttered environments and in assistive scenarios. In similar scenarios, data-driven methods for haptic perception have shown promise, but suitable training data are lacking. In addition, generalizing the performance of haptic perception to new situations is challenging because contact conditions vary.
To address the shortage of data, we are developing physics-based models to generate synthetic data, simplified robots to collect data from real objects, and a portable, handheld, human-operated device to collect data from objects in their natural settings. Using data-driven methods and physics-based models, our algorithms have successfully categorized various objects into relevant haptic categories.
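To give a sense of the data-driven approach, the following is a minimal, purely illustrative sketch (not the authors' actual pipeline): synthetic force-time traces are generated from a toy linear-spring contact model, reduced to two simple features (peak force and mean rise slope), and sorted into assumed haptic categories ("rigid" vs. "soft") with a nearest-centroid classifier. All function names, parameters, and category labels here are hypothetical.

```python
# Illustrative sketch: categorizing contact force-time signals into
# haptic categories using synthetic, physics-inspired training data.
# The spring model, features, and labels are all assumptions.
import math
import random

def synthetic_force_trace(stiffness, n=50, dt=0.01, noise=0.02, rng=None):
    """Force while pressing into a linear spring at constant probe velocity."""
    rng = rng or random.Random(0)
    v = 0.05  # assumed probe velocity (m/s)
    return [stiffness * v * dt * i + rng.gauss(0.0, noise) for i in range(n)]

def features(trace, dt=0.01):
    """Two simple haptic features: peak force and mean rise slope."""
    peak = max(trace)
    slope = (trace[-1] - trace[0]) / (dt * (len(trace) - 1))
    return (peak, slope)

def nearest_centroid(train):
    """train maps label -> list of feature tuples; returns a classifier."""
    centroids = {
        label: tuple(sum(f[i] for f in feats) / len(feats) for i in range(2))
        for label, feats in train.items()
    }
    def classify(feat):
        return min(centroids, key=lambda lbl: math.dist(centroids[lbl], feat))
    return classify

rng = random.Random(42)
train = {
    "rigid": [features(synthetic_force_trace(800.0, rng=rng)) for _ in range(20)],
    "soft": [features(synthetic_force_trace(80.0, rng=rng)) for _ in range(20)],
}
classify = nearest_centroid(train)
# A stiff test object should land in the "rigid" category.
print(classify(features(synthetic_force_trace(700.0, rng=rng))))
```

In practice the cited work uses far richer signals (force, motion, thermal, visual) and models, but the sketch captures the basic idea of training a categorizer on physics-generated data.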
To address the challenge of generalizing the performance of haptic perception, we are developing methods to extend our algorithms’ performance to different incidental contact conditions that vary in robot-arm stiffness, robot-arm velocity, and time-scale of interaction. We are also developing strategies to simplify the perception problem by having the robot distinguish among a small number of task-relevant categories, such as tactile foreground vs. background.
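One way to see why varying robot-arm stiffness complicates generalization is through a toy series-spring model: the measured contact force depends on both the arm and the object, so a feature learned at one arm stiffness does not transfer directly to another. A stiffness-normalized quantity, however, can recover an object property that is the same across arm configurations. The model and numbers below are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch: raw contact force changes with arm stiffness,
# but inverting a toy series-spring model recovers the object's own
# stiffness regardless of the arm setting. All values are assumptions.
def contact_force(k_arm, k_obj, x):
    """Arm and object modeled as two springs in series with total deflection x."""
    k_eff = k_arm * k_obj / (k_arm + k_obj)
    return k_eff * x

def estimated_object_stiffness(k_arm, f, x):
    """Invert the series-spring model given a force sample at deflection x."""
    k_eff = f / x
    return k_arm * k_eff / (k_arm - k_eff)

for k_arm in (200.0, 500.0, 2000.0):  # three hypothetical arm stiffness settings (N/m)
    f = contact_force(k_arm, k_obj=300.0, x=0.01)
    # Raw force differs per setting, but the recovered object stiffness is ~300 N/m.
    print(round(f, 3), round(estimated_object_stiffness(k_arm, f, 0.01), 1))
```

Real incidental contact involves far more than a linear spring, which is why the proposal treats generalization across contact conditions as a research problem rather than a closed-form correction.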