
dc.contributor.author: Drayer, Gregorio E. [en_US]
dc.contributor.author: Howard, Ayanna M. [en_US]
dc.date.accessioned: 2013-07-18T20:31:06Z
dc.date.available: 2013-07-18T20:31:06Z
dc.date.issued: 2012-07
dc.identifier.citation: G. Drayer, A. Howard, “A Granular Multi-Sensor Data Fusion Method for Life Support Systems that Enhances Situation Awareness,” 42nd International Conference on Environmental Systems (ICES), 15-19 July 2012, San Diego, California. [en_US]
dc.identifier.uri: http://hdl.handle.net/1853/48467
dc.description: ©2012 by the American Institute of Aeronautics and Astronautics, Inc. [en_US]
dc.description: Presented at the 2012 Global Space Exploration Conference, 22-24 May 2012, Washington DC, USA. [en_US]
dc.description: DOI: 10.2514/6.2012-3434 [en_US]
dc.description.abstract: Slow-changing characteristics of controlled environmental systems and the increasing availability of data from sensors and measurements offer opportunities for the development of computational methods to enhance situation observability, decrease human workload, and support real-time decision making. Multi-sensor data fusion, which combines observations and measurements from different sources to provide a complete description of a system and its environment, can be used in user-centered interfaces to support situation awareness and observability. Situation observability enables humans to perceive and comprehend the state of the system at a given instant, and helps human operators decide what actions to take at any given time that may affect the projection of that state into the near future. This paper presents a multi-sensor data fusion method that collects discrete human inputs and measurements to generate a granular perception function that supports situation observability. These human inputs are situation-rich, meaning they combine measurements defining the operational condition of the system with a subjective assessment of its situation. As a result, the perception function produces situation-rich signals that may be employed in user interfaces or in adaptive automation. The perception function is a fuzzy associative memory (FAM) composed of a number of granules equal to the number of situations that may be detected by human experts; its development is based on their interaction with the system. The human-input data sets are transformed into a granular structure by an adaptive method based on particle swarms. The paper describes the multi-sensor data fusion method and its application to a ground-based aquatic habitat operating as a small-scale environmental system. [en_US]
dc.language.iso: en_US [en_US]
dc.publisher: Georgia Institute of Technology [en_US]
dc.subject: Data fusion [en_US]
dc.subject: Situation observability [en_US]
dc.subject: Life support systems [en_US]
dc.subject: Fuzzy associative memory [en_US]
dc.title: A Granular Multi-Sensor Data Fusion Method for Situation Observability in Life Support Systems [en_US]
dc.type: Proceedings [en_US]
dc.type: Post-print [en_US]
dc.contributor.corporatename: Georgia Institute of Technology. Human-Automation Systems Lab [en_US]
dc.contributor.corporatename: Georgia Institute of Technology. School of Electrical and Computer Engineering [en_US]
dc.contributor.corporatename: Georgia Institute of Technology. Center for Robotics and Intelligent Machines [en_US]
dc.publisher.original: American Institute of Aeronautics and Astronautics, Inc. [en_US]
dc.identifier.doi: 10.2514/6.2012-3434
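The abstract describes a perception function built as a fuzzy associative memory whose granules each correspond to an expert-labeled situation. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: granule centers, widths, sensor names (temperature and pH for an aquatic habitat), and situation labels are all illustrative assumptions.

```python
import math

# Hypothetical granules for an aquatic habitat: each granule is a
# Gaussian membership function over two fused measurements, tagged
# with a situation label a human expert would assign. The values
# below are illustrative, not taken from the paper.
GRANULES = [
    # (center_temp_C, center_pH, width, situation label)
    (25.0, 7.0, 1.0, "nominal"),
    (29.0, 6.4, 1.0, "overheating"),
    (25.0, 5.5, 1.0, "acidification"),
]

def membership(temp_c, ph, cx, cy, width):
    """Gaussian membership of the fused reading (temp_c, ph) in one granule."""
    d2 = ((temp_c - cx) ** 2 + (ph - cy) ** 2) / (width ** 2)
    return math.exp(-d2)

def perceive(temp_c, ph):
    """Fuse two sensor readings into a situation-rich signal: the label
    of the granule with the highest activation."""
    activations = [(membership(temp_c, ph, cx, cy, w), label)
                   for cx, cy, w, label in GRANULES]
    return max(activations)[1]

print(perceive(25.2, 6.9))  # reading near the "nominal" granule
```

In the paper's method the granule parameters are not hand-set as above but adapted from situation-rich human-input data via a particle-swarm-based method; this sketch only illustrates how a granular FAM can map fused measurements to a situation label.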

