dc.contributor.author | Drayer, Gregorio E. | en_US |
dc.contributor.author | Howard, Ayanna M. | en_US |
dc.date.accessioned | 2013-07-18T20:31:06Z | |
dc.date.available | 2013-07-18T20:31:06Z | |
dc.date.issued | 2012-07 | |
dc.identifier.citation | G. Drayer, A. Howard, “A Granular Multi-Sensor Data Fusion Method for Life Support Systems that Enhances Situation Awareness,” 42nd International Conference on Environmental Systems (ICES), 15-19 July 2012, San Diego, California. | en_US |
dc.identifier.uri | http://hdl.handle.net/1853/48467 | |
dc.description | ©2012 by the American Institute of Aeronautics and Astronautics, Inc. | en_US |
dc.description | Presented at the 2012 Global Space Exploration Conference, 22-24 May 2012, Washington DC, USA. | en_US |
dc.description | DOI: 10.2514/6.2012-3434 | en_US |
dc.description.abstract | Slow-changing characteristics of controlled environmental systems and the increasing availability of data from sensors and measurements offer opportunities for the development of computational methods to enhance situation observability, decrease human workload, and support real-time decision making. Multi-sensor data fusion, which combines observations and measurements from different sources to provide a complete description of a system and its environment, can be used in user-centered interfaces to support situation awareness and observability. Situation observability enables humans to perceive and comprehend the state of the system at a given instant, and helps human operators decide what actions to take at any given time that may affect the projection of such state into the near future. This paper presents a multi-sensor data fusion method that collects discrete human inputs and measurements to generate a granular perception function that supports situation observability. These human inputs are situation-rich, meaning they combine measurements defining the operational condition of the system with a subjective assessment of its situation. As a result, the perception function produces situation-rich signals that may be employed in user interfaces or in adaptive automation. The perception function is a fuzzy associative memory (FAM) composed of a number of granules equal to the number of situations that may be detected by human experts; its development is based on their interaction with the system. The human-input data sets are transformed into a granular structure by an adaptive method based on particle swarms. The paper describes the proposed multi-sensor data fusion method and its application to a ground-based aquatic habitat working as a small-scale environmental system. | en_US |
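The granular perception function the abstract describes can be illustrated with a minimal sketch: a fuzzy associative memory in which each granule is a membership function over the measurement space tagged with a situation label, and perception returns the most activated granule. The Gaussian granule shape, the granule centers and widths, and the situation labels below are hypothetical illustrations, not taken from the paper.

```python
# Minimal sketch of a granular perception function as a fuzzy
# associative memory (FAM). Granule parameters and situation labels
# are hypothetical examples, not the paper's actual granules.
import math

def gaussian_membership(x, center, width):
    """Degree of membership of measurement vector x in one granule."""
    dist_sq = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-dist_sq / (2.0 * width ** 2))

def perceive(x, granules):
    """Return the situation label and activation of the most activated granule."""
    best_label, best_act = None, -1.0
    for label, center, width in granules:
        act = gaussian_membership(x, center, width)
        if act > best_act:
            best_label, best_act = label, act
    return best_label, best_act

# Hypothetical granules for an aquatic habitat, one per situation that a
# human expert might label. Measurements: (temperature degC, dissolved O2 mg/L).
granules = [
    ("nominal",  (25.0, 7.0), 1.0),
    ("hypoxia",  (25.0, 3.0), 1.0),
    ("overheat", (32.0, 6.0), 1.0),
]

label, activation = perceive((25.5, 6.8), granules)  # near the "nominal" granule
```

In the paper the granule structure is not hand-written as above but adapted from situation-rich human-input data sets via a particle-swarm method; this sketch only shows how the resulting FAM would map a measurement vector to a situation-rich signal.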
dc.language.iso | en_US | en_US |
dc.publisher | Georgia Institute of Technology | en_US |
dc.subject | Data fusion | en_US |
dc.subject | Situation observability | en_US |
dc.subject | Life support systems | en_US |
dc.subject | Fuzzy associative memory | en_US |
dc.title | A Granular Multi-Sensor Data Fusion Method for Situation Observability in Life Support Systems | en_US |
dc.type | Proceedings | en_US |
dc.type | Post-print | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. Human-Automation Systems Lab | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. School of Electrical and Computer Engineering | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. Center for Robotics and Intelligent Machines | en_US |
dc.publisher.original | American Institute of Aeronautics and Astronautics, Inc. | en_US |
dc.identifier.doi | 10.2514/6.2012-3434 | |