
dc.contributor.author: LaViers, Amy
dc.contributor.author: Egerstedt, Magnus
dc.date.accessioned: 2014-10-07T20:02:51Z
dc.date.available: 2014-10-07T20:02:51Z
dc.date.issued: 2014-04
dc.identifier.citation: A. LaViers and M. Egerstedt. Style-based Abstractions for Human Motion Classification. International Conference on Cyber-Physical Systems, CPSWEEK, Berlin, Germany, April 2014, pp. 84-91. [en_US]
dc.identifier.isbn: 978-1-4799-4931-1
dc.identifier.uri: http://hdl.handle.net/1853/52412
dc.description: © 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. [en_US]
dc.description: Presented at the International Conference on Cyber-Physical Systems, CPSWEEK, April 14-17, 2014, Berlin, Germany.
dc.description: DOI: 10.1109/ICCPS.2014.6843713
dc.description.abstract: This paper presents an approach to motion analysis for robotics in which a quantitative definition of "style of motion" is used to classify movements. In particular, we present a method for generating a "best match" signal for empirical data via a two-stage optimal control formulation. The first stage consists of the generation of trajectories that mimic empirical data. In the second stage, an inverse problem is solved in order to obtain the "stylistic parameters" that best recreate the empirical data. This method is amenable to human motion analysis in that it not only produces a matching trajectory but, in doing so, classifies its quality. This classification allows for the production of additional trajectories, between any two endpoints, in the same style as the empirical reference data. The method not only enables robotic mimicry of human style but can also provide insights into genres of stylized movement, equipping cyberphysical systems with a deeper interpretation of human movement. [en_US]
dc.language.iso: en_US [en_US]
dc.publisher: Georgia Institute of Technology [en_US]
dc.subject: Cyberphysical systems [en_US]
dc.subject: Human motion [en_US]
dc.subject: Motion analysis [en_US]
dc.subject: Robotics [en_US]
dc.subject: Trajectories [en_US]
dc.title: Style-based Abstractions for Human Motion Classification [en_US]
dc.type: Pre-print [en_US]
dc.type: Proceedings [en_US]
dc.contributor.corporatename: Georgia Institute of Technology. School of Electrical and Computer Engineering [en_US]
dc.contributor.corporatename: Georgia Institute of Technology. Institute for Robotics and Intelligent Machines [en_US]
dc.contributor.corporatename: University of Virginia. Systems and Information Engineering [en_US]
dc.publisher.original: Institute of Electrical and Electronics Engineers
dc.identifier.doi: 10.1109/ICCPS.2014.6843713
dc.embargo.terms: null [en_US]
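
Note: The following is a minimal, hypothetical sketch of the two-stage formulation summarized in dc.description.abstract above, not the paper's actual model. The sinusoidal trajectory parameterization, least-squares mismatch cost, and all function names are illustrative assumptions; the sketch only shows the shape of the procedure the abstract describes: generate trajectories from "stylistic parameters," then solve an inverse problem for the parameters that best recreate empirical data.

# Hypothetical illustration of the two-stage idea from the abstract; the
# parameterization and cost below are assumptions, not the paper's method.
import numpy as np
from scipy.optimize import minimize

def generate_trajectory(style_params, start, end, n_steps=50):
    """Stage 1 (illustrative): generate a trajectory between two endpoints
    whose shape is modulated by a small set of stylistic parameters. Here the
    parameters are the amplitude and frequency of a sinusoidal deviation from
    the straight-line path, standing in for the paper's optimal control stage."""
    amp, freq = style_params
    t = np.linspace(0.0, 1.0, n_steps)
    straight = np.outer(1.0 - t, start) + np.outer(t, end)
    deviation = amp * np.sin(np.pi * freq * t)[:, None]
    return straight + deviation

def fit_style(empirical, start, end):
    """Stage 2 (illustrative): the inverse problem -- find the stylistic
    parameters whose generated trajectory best matches the empirical data."""
    def mismatch(params):
        traj = generate_trajectory(params, start, end, n_steps=len(empirical))
        return np.sum((traj - empirical) ** 2)
    result = minimize(mismatch, x0=np.array([0.1, 1.0]), method="Nelder-Mead")
    return result.x

if __name__ == "__main__":
    # Synthetic "empirical" data produced with known parameters, then recovered.
    start, end = np.array([0.0, 0.0]), np.array([1.0, 1.0])
    reference = generate_trajectory((0.3, 2.0), start, end)
    recovered = fit_style(reference, start, end)
    print("recovered stylistic parameters:", recovered)
    # The recovered parameters can then generate new trajectories, between any
    # two endpoints, "in the same style" as the reference data.
    new_motion = generate_trajectory(recovered, np.array([0.0, 1.0]), np.array([2.0, 0.0]))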

