
    A multimodal execution monitor for assistive robots

    View/Open
    PARK-DISSERTATION-2018.pdf (24.18 MB)
    Date
    2018-03-16
    Author
    Park, Daehyung
    Abstract
    Assistive robots have the potential to serve as caregivers, providing assistance with activities of daily living to people with disabilities. Monitoring when something has gone wrong could help assistive robots operate more safely and effectively around people. However, the complexity of interacting with people and objects in human environments makes such monitoring challenging. By monitoring multimodal sensory signals, an execution monitor can perform a variety of roles, such as detecting success, determining when to switch behaviors, and otherwise exhibiting more common sense. The purpose of this dissertation is to introduce a multimodal execution monitor that improves the safety and success of assistive manipulation services. To accomplish this goal, we make three main contributions. First, we introduce a data-driven anomaly detector, a part of the monitor, that reports anomalous task executions online from multimodal sensory signals. Second, we introduce a data-driven anomaly classifier that recognizes the type and cause of common anomalies with an artificial neural network after fusing multimodal features. Lastly, as the main testbed for the monitoring system, we introduce a robot-assisted feeding system for people with disabilities that uses a general-purpose mobile manipulator (a PR2 robot). We evaluate the monitoring system with haptic, visual, auditory, and kinematic sensing during household tasks and human-robot interactive tasks, including feeding assistance. We show that multimodality improves the performance of monitoring methods by detecting and classifying a broader range of anomalies. Overall, our research demonstrates that the multimodal execution monitoring system helps an assistive manipulation system provide safe and successful assistance for people with disabilities.
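    The abstract's second contribution, an anomaly classifier that fuses multimodal features and passes them to an artificial neural network, can be pictured with a minimal sketch. The code below is an illustrative PyTorch example only, not the dissertation's actual model; the modality feature dimensions, hidden size, and number of anomaly classes are hypothetical placeholders.

    ```python
    # Minimal sketch (assumed architecture, not the dissertation's model):
    # concatenate per-modality feature vectors and classify the anomaly type
    # with a small feed-forward network.
    import torch
    import torch.nn as nn

    class MultimodalAnomalyClassifier(nn.Module):
        def __init__(self, haptic_dim=10, audio_dim=8, visual_dim=16,
                     kinematic_dim=7, hidden_dim=64, n_anomaly_types=5):
            super().__init__()
            fused_dim = haptic_dim + audio_dim + visual_dim + kinematic_dim
            self.net = nn.Sequential(
                nn.Linear(fused_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, n_anomaly_types),
            )

        def forward(self, haptic, audio, visual, kinematic):
            # Fuse modalities by concatenating their feature vectors.
            fused = torch.cat([haptic, audio, visual, kinematic], dim=-1)
            return self.net(fused)  # logits over anomaly types

    if __name__ == "__main__":
        model = MultimodalAnomalyClassifier()
        batch = 4
        logits = model(torch.randn(batch, 10), torch.randn(batch, 8),
                       torch.randn(batch, 16), torch.randn(batch, 7))
        print(logits.argmax(dim=-1))  # predicted anomaly type per sample
    ```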
    URI
    http://hdl.handle.net/1853/59860
    Collections
    • College of Computing Theses and Dissertations [1071]
    • Georgia Tech Theses and Dissertations [22401]
    • School of Interactive Computing Theses and Dissertations [106]
