Show simple item record

dc.contributor.author: Ye, Yuting (en_US)
dc.date.accessioned: 2013-06-15T02:32:27Z
dc.date.available: 2013-06-15T02:32:27Z
dc.date.issued: 2012-02-23 (en_US)
dc.identifier.uri: http://hdl.handle.net/1853/47540
dc.description.abstract: The goal of this thesis is to synthesize believable motions of a character interacting with its surroundings and manipulating objects through physical contacts and forces. Human-like autonomous avatars are in increasing demand in areas such as entertainment, education, and health care. Yet modeling the basic human motor skills of locomotion and manipulation remains a long-standing challenge in animation research. The seemingly simple tasks of navigating an uneven terrain or grasping cups of different shapes involve planning with complex kinematic and physical constraints as well as adaptation to unexpected perturbations. Moreover, natural movements exhibit unique personal characteristics that are difficult to model. Although motion capture technologies allow virtual actors to use recorded human motions in many applications, the recorded motions are not directly applicable to tasks involving interactions, for two reasons. First, the acquired data cannot be easily adapted to new environments or different task goals. Second, acquiring accurate data for fine-scale object manipulation remains a challenge. In this work, we use data to create natural-looking animations and mitigate data deficiency with physics-based simulations and numerical optimizations. We develop algorithms based on a single reference motion for three types of control problems. The first problem focuses on motions without contact constraints. We use joint torque patterns identified from the captured motion to simulate responses and recovery in the same style under unexpected pushes. The second problem focuses on locomotion with foot contacts. We use contact forces to control an abstract dynamic model of the center of mass, which sufficiently describes the locomotion task in the input motion. Simulating the abstract model under unexpected pushes or anticipated changes in the environment produces responses consistent with both the laws of physics and the style of the input. The third problem focuses on fine-scale object manipulation tasks, in which accurate finger motions and contact information are not available. We propose a sampling method that discovers contact relations between the hand and the object from only the gross motion of the wrists and the object. We then use the abundant contact constraints to synthesize detailed finger motions. The algorithm creates finger motions of various styles for a diverse set of object shapes and tasks, including ones not present at capture time. Together, the three algorithms control an autonomous character with dexterous hands that interacts naturally with a virtual world. Our methods are general and robust across character structures and motion contents when tested on a wide variety of motion capture sequences and environments. The work in this thesis brings the motor skills of a virtual character closer to those of its human counterpart. It provides computational tools for the analysis of human biomechanics and can potentially inspire the design of novel control algorithms for humanoid robots. (en_US)
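The abstract mentions controlling an abstract dynamic model of the center of mass with contact forces and simulating it under unexpected pushes. As a minimal sketch of that idea only (a point mass tracked by a PD-style contact force, with assumed mass, gains, and push magnitude; not the dissertation's actual controller), the simulation loop might look like:

```python
# Hedged illustration: a 1-D point-mass COM model steered by a PD-style
# contact force, disturbed by an unexpected push partway through the
# simulation. All constants below are assumptions for this sketch.

GRAVITY = -9.8   # m/s^2
MASS = 60.0      # kg, hypothetical character mass
DT = 0.01        # integration step, s

def contact_force(y, vy, target_y, kp=400.0, kd=40.0):
    """PD tracking of a target COM height, plus gravity compensation."""
    return MASS * (kp * (target_y - y) - kd * vy) - MASS * GRAVITY

def simulate(steps=500, push=300.0, push_step=100):
    y, vy = 1.0, 0.0          # COM starts 1 m above the ground
    target_y = y
    for i in range(steps):
        f = contact_force(y, vy, target_y)
        if i == push_step:    # unexpected push, as in the abstract
            f += push
        a = f / MASS + GRAVITY
        vy += a * DT          # explicit Euler integration
        y += vy * DT
    return y

final_height = simulate()     # COM recovers toward the 1 m target
```

With these (critically damped) gains, the push is absorbed and the COM settles back to its target height, which is the qualitative behavior the abstract describes: responses consistent with physics while returning to the input motion's task.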
dc.publisher: Georgia Institute of Technology (en_US)
dc.subject: Physics-based simulation (en_US)
dc.subject: Motion capture (en_US)
dc.subject: Optimal control (en_US)
dc.subject.lcsh: Avatars (Virtual reality)
dc.subject.lcsh: Computer animation
dc.subject.lcsh: Computer simulation
dc.subject.lcsh: Interactive computer graphics
dc.subject.lcsh: Computer graphics
dc.title: Simulation of characters with natural interactions (en_US)
dc.type: Dissertation (en_US)
dc.description.degree: PhD (en_US)
dc.contributor.department: Computing (en_US)
dc.description.advisor: Committee Chair: Liu, C. Karen; Committee Member: Christensen, Henrik I.; Committee Member: Ting, Lena; Committee Member: Turk, Greg; Committee Member: Zordan, Victor B. (en_US)

