
dc.contributor.author: Kwatra, Vivek
dc.date.accessioned: 2005-09-16T15:12:53Z
dc.date.available: 2005-09-16T15:12:53Z
dc.date.issued: 2005-07-19
dc.identifier.uri: http://hdl.handle.net/1853/7214
dc.description.abstract: This thesis explores synthesis by example as a paradigm for rendering real-world phenomena. In particular, phenomena that can be visually described as texture are considered. We exploit, for synthesis, the self-repeating nature of the visual elements constituting these texture exemplars. Techniques for unconstrained as well as constrained/controllable synthesis of both image and video textures are presented. For unconstrained synthesis, we present two robust techniques that can perform spatio-temporal extension, editing, and merging of image as well as video textures. In one of these techniques, large patches of input texture are automatically aligned and seamlessly stitched with each other to generate realistic-looking images and videos. The second technique is based on iterative optimization of a global energy function that measures the quality of the synthesized texture with respect to the given input exemplar. We also present a technique for controllable texture synthesis. In particular, it allows for the generation of motion-controlled texture animations that follow a specified flow field. Animations synthesized in this fashion maintain structural properties such as local shape, size, and orientation of the input texture even as they move according to the specified flow. We cast this problem into an optimization framework that tries to simultaneously satisfy the two (potentially competing) objectives of similarity to the input texture and consistency with the flow field. This optimization is a simple extension of the approach used for unconstrained texture synthesis. A general framework for example-based synthesis and rendering is also presented. This framework provides a design space for constructing example-based rendering algorithms. The goal of such algorithms would be to use texture exemplars to render animations for which certain behavioral characteristics need to be controlled. Our motion-controlled texture synthesis technique is an instantiation of this framework where the characteristic being controlled is motion represented as a flow field.
dc.format.extent: 32446095 bytes
dc.format.extent: 141769454 bytes
dc.format.extent: 89126156 bytes
dc.format.extent: 51089412 bytes
dc.format.mimetype: application/pdf
dc.format.mimetype: video/quicktime
dc.format.mimetype: video/mpeg
dc.format.mimetype: video/mpeg
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.subject: Flow visualization
dc.subject: Texture animation
dc.subject: Energy minimization
dc.subject: Markov random fields
dc.subject: Video-based rendering
dc.subject: Image-based rendering
dc.subject: Texture synthesis
dc.subject: Natural phenomena
dc.subject: Image and video processing
dc.title: Example-based Rendering of Textural Phenomena
dc.type: Dissertation
dc.description.degree: Ph.D.
dc.contributor.department: Computing
dc.description.advisor: Committee Chair: Bobick, Aaron; Committee Co-Chair: Essa, Irfan; Committee Member: Raskar, Ramesh; Committee Member: Rossignac, Jarek; Committee Member: Seitz, Steve; Committee Member: Turk, Greg
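
The abstract's second unconstrained-synthesis technique is described as iterative optimization of a global energy that measures similarity between the output and the exemplar. The sketch below is a minimal illustration of that style of loop, not code from the thesis: it alternates a brute-force nearest-neighbor match of output neighborhoods against the exemplar with a least-squares update that averages the matched neighborhoods wherever they overlap. All names, the window size, the stride, and the random initialization are assumptions made for clarity.

```python
# Illustrative sketch only: an alternating-minimization texture-optimization
# loop of the kind the abstract describes. Window size, stride, and the
# brute-force matching step are assumptions, not the thesis's implementation.
import numpy as np

def extract_patches(img, w, step):
    """Collect all w x w neighborhoods of a grayscale image at the given stride."""
    h_max, w_max = img.shape[0] - w + 1, img.shape[1] - w + 1
    coords = [(i, j) for i in range(0, h_max, step) for j in range(0, w_max, step)]
    patches = np.stack([img[i:i + w, j:j + w].ravel() for i, j in coords])
    return coords, patches

def synthesize(exemplar, out_shape, w=8, step=4, iters=5, seed=None):
    rng = np.random.default_rng(seed)
    # Candidate neighborhoods from the input exemplar (dense sampling).
    _, ex_patches = extract_patches(exemplar, w, 1)
    # Start from noise drawn from the exemplar's value range.
    out = rng.uniform(exemplar.min(), exemplar.max(), size=out_shape)
    for _ in range(iters):
        coords, out_patches = extract_patches(out, w, step)
        # Matching step: for each output neighborhood, find the closest
        # exemplar neighborhood under squared pixel-value distance.
        d = ((out_patches[:, None, :] - ex_patches[None, :, :]) ** 2).sum(-1)
        nearest = ex_patches[d.argmin(axis=1)]
        # Update step: the least-squares solution averages the matched
        # neighborhoods wherever they overlap in the output.
        acc = np.zeros(out_shape)
        cnt = np.zeros(out_shape)
        for (i, j), patch in zip(coords, nearest):
            acc[i:i + w, j:j + w] += patch.reshape(w, w)
            cnt[i:i + w, j:j + w] += 1
        out = np.where(cnt > 0, acc / np.maximum(cnt, 1), out)
    return out

if __name__ == "__main__":
    exemplar = np.random.default_rng(0).random((32, 32))  # stand-in texture
    result = synthesize(exemplar, (64, 64), seed=1)
    print(result.shape)
```

The motion-controlled synthesis the abstract mentions would, under the same framing, add a second quadratic term to this objective that pulls each frame toward the previous frame warped along the specified flow field, so the two competing goals (exemplar similarity and flow consistency) are traded off inside one optimization.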

