dc.contributor.advisor: Kemp, Charles C.
dc.contributor.author: Asoka Kumar Shenoi, Ashwin Kumar
dc.date.accessioned: 2016-05-27T13:24:31Z
dc.date.available: 2016-05-27T13:24:31Z
dc.date.created: 2016-05
dc.date.issued: 2016-05-02
dc.date.submitted: May 2016
dc.identifier.uri: http://hdl.handle.net/1853/55027
dc.description.abstract: We consider the problem of enabling a robot to efficiently obtain a dense haptic map of its visible surroundings using the complementary properties of vision and tactile sensing. Our approach assumes that visible surfaces that look similar to one another are likely to have similar haptic properties. In our previous work, we introduced an iterative algorithm that enabled a robot to infer dense haptic labels across visible surfaces in an RGB-D image when given a sequence of sparse haptic labels. In this work, we describe how dense conditional random fields (CRFs) can be applied to this same problem and present results from evaluating a dense CRF's performance in simulated trials with idealized haptic labels. We evaluated our method using several publicly available RGB-D image datasets with indoor cluttered scenes pertinent to robot manipulation. In these simulated trials, the dense CRF substantially outperformed our previous algorithm, correctly assigning haptic labels to an average of 93% (versus 76% in our previous work) of all object pixels in an image given the highest number of contact points per object. Likewise, the dense CRF correctly assigned haptic labels to an average of 81% (versus 63% in our previous work) of all object pixels in an image given a low number of contact points per object. We also compared the performance of a dense CRF using a uniform prior with that of a dense CRF using a prior obtained from the visible scene by a fully convolutional network (FCN) trained for visual material recognition; the use of the convolutional network further improved the performance of the algorithm. Finally, we performed experiments with the humanoid robot DARCI reaching in a cluttered foliage environment while using our algorithm to create a haptic map. The algorithm correctly assigned labels to 82.52% of the scenes with trunks and leaves after 10 reaches into the environment.
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Tactile
dc.subject: Vision
dc.subject: Haptic mapping
dc.subject: CNN
dc.subject: CRF
dc.title: A CRF that combines tactile sensing and vision for haptic mapping
dc.type: Thesis
dc.description.degree: M.S.
dc.contributor.department: Electrical and Computer Engineering
thesis.degree.level: Masters
dc.contributor.committeeMember: Vela, Patricio A.
dc.contributor.committeeMember: Hays, James
dc.date.updated: 2016-05-27T13:24:31Z
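
The abstract above describes dense CRF inference that propagates sparse haptic contact labels across visually similar surfaces in an RGB-D image. As a rough illustration only, not the thesis code, the Python sketch below shows how such label propagation could be set up with the open-source pydensecrf library. The function name dense_haptic_map, all parameter values, and the RGB-only pairwise term are illustrative assumptions; the thesis works with full RGB-D data.

    # Hypothetical sketch (not the thesis implementation): propagate sparse
    # haptic contact labels across an RGB image with a fully connected CRF,
    # using the pydensecrf library. The thesis also exploits depth.
    import numpy as np
    import pydensecrf.densecrf as dcrf
    from pydensecrf.utils import unary_from_labels

    def dense_haptic_map(rgb, sparse_labels, n_labels):
        """rgb: HxWx3 uint8 image; sparse_labels: HxW int32 array where
        0 = unobserved and 1..n_labels mark haptic classes at contact points."""
        h, w = sparse_labels.shape
        crf = dcrf.DenseCRF2D(w, h, n_labels)

        # Unary term: confident only at the sparse contact points;
        # zero_unsure=True gives unlabeled pixels a uniform prior.
        unary = unary_from_labels(sparse_labels, n_labels,
                                  gt_prob=0.9, zero_unsure=True)
        crf.setUnaryEnergy(unary)

        # Pairwise terms: nearby pixels and similar-looking pixels are
        # encouraged to share the same haptic label.
        crf.addPairwiseGaussian(sxy=3, compat=3)
        crf.addPairwiseBilateral(sxy=60, srgb=10,
                                 rgbim=np.ascontiguousarray(rgb), compat=10)

        # Approximate mean-field inference; q holds per-pixel label scores.
        q = crf.inference(10)
        return np.argmax(q, axis=0).reshape(h, w).astype(np.int32)

In the FCN variant described in the abstract, the uniform prior at unlabeled pixels would instead come from the network's per-pixel material probabilities; that substitution is not shown here.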

