Show simple item record

dc.contributor.author: Martinson, Eric Beowulf
dc.date.accessioned: 2008-02-07T18:13:09Z
dc.date.available: 2008-02-07T18:13:09Z
dc.date.issued: 2007-11-12
dc.identifier.uri: http://hdl.handle.net/1853/19724
dc.description.abstract: With the growth of successes in pattern recognition and signal processing, mobile robot applications today are increasingly equipping their hardware with microphones to improve the set of available sensory information. However, if the robot, and therefore the microphone, ends up in a poor location acoustically, then the data will remain noisy and potentially useless for accomplishing the required task. This is compounded by the fact that there are many bad acoustic locations through which a robot is likely to pass, so the results from auditory sensors often remain poor for much of the task. The movement of the robot, though, can also be an important tool for overcoming these problems, a tool that has not been exploited in the traditional signal processing community. Robots are not limited to a single location, as traditionally placed microphones are, nor are they powerless over where they will be moved, as wearable computers are. If a better location is available for performing its task, a robot can navigate to that location under its own power. Furthermore, when deciding where to move, robots can develop complex models of the environment. Using an array of sensors, a mobile robot can build models of sound flow through an area, picking from those models the paths most likely to improve performance of an acoustic application. In this dissertation, we address the question of how to exploit robotic movement. Using common sensors, we present a collection of tools for gathering information about the auditory scene and incorporating that information into a general framework for acoustical awareness. Thus equipped, robots can make intelligent decisions regarding control strategies to enhance their performance on the underlying acoustic application.
dc.publisher: Georgia Institute of Technology
dc.subject: Sound fields
dc.subject: Evidence grids
dc.subject: Stealthy approach
dc.subject.lcsh: Mobile robots
dc.subject.lcsh: Detectors
dc.subject.lcsh: Sound
dc.subject.lcsh: Motion
dc.title: Acoustical Awareness for Intelligent Robotic Action
dc.type: Dissertation
dc.description.degree: Ph.D.
dc.contributor.department: Computing
dc.description.advisor: Committee Chair: Arkin, Ronald; Committee Member: Anderson, Dave; Committee Member: Balch, Tucker; Committee Member: Dellaert, Frank; Committee Member: Starner, Thad

