A Granular Multi-Sensor Data Fusion Method for Situation Observability in Life Support Systems
Drayer, Gregorio E.
Howard, Ayanna M.
Slow-changing characteristics of controlled environmental systems and the increasing availability of data from sensors and measurements offer opportunities to develop computational methods that enhance situation observability, decrease human workload, and support real-time decision making. Multi-sensor data fusion, which combines observations and measurements from different sources to provide a complete description of a system and its environment, can be used in user-centered interfaces in support of situation awareness and observability. Situation observability enables humans to perceive and comprehend the state of the system at a given instant, and helps human operators decide what actions to take at any given time that may affect the projection of that state into the near future. This paper presents a multi-sensor data fusion method that collects discrete human inputs and measurements to generate a granular perception function that supports situation observability. These human inputs are situation-rich, meaning they combine measurements defining the operational condition of the system with a subjective assessment of its situation. As a result, the perception function produces situation-rich signals that may be employed in user interfaces or in adaptive automation. The perception function is a fuzzy associative memory (FAM) composed of a number of granules equal to the number of situations that may be detected by human experts; its development is based on their interaction with the system. The human-input data sets are transformed into a granular structure by an adaptive method based on particle swarms. The paper describes this multi-sensor data fusion method and its application to a ground-based aquatic habitat serving as a small-scale environmental system.
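The core idea of a granular perception function can be illustrated with a minimal sketch, assuming Gaussian granules over the measurement space, one per expert-labeled situation. All names here (`Granule`, `perceive`, the example sensor channels and situation labels) are illustrative assumptions, not the paper's actual implementation, which adapts the granules with a particle-swarm method:

```python
import math

class Granule:
    """One granule of the FAM: a fuzzy region of the measurement space
    labeled with the situation a human expert associated with it.
    (Hypothetical structure, not the paper's exact formulation.)"""

    def __init__(self, label, center, width):
        self.label = label    # situation label supplied by the human expert
        self.center = center  # prototype measurement vector for this situation
        self.width = width    # spread of the granule

    def membership(self, x):
        # Gaussian membership: 1.0 at the center, decaying with distance.
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, self.center))
        return math.exp(-d2 / (2.0 * self.width ** 2))

def perceive(granules, x):
    """Map a raw measurement vector to a 'situation-rich' signal:
    a normalized degree of match for each known situation."""
    degrees = [(g.label, g.membership(x)) for g in granules]
    total = sum(d for _, d in degrees) or 1.0
    return [(label, d / total) for label, d in degrees]

# Illustrative example: two situations over (temperature, dissolved-oxygen)
# readings of a small aquatic habitat. Values are made up for the sketch.
fam = [
    Granule("nominal", center=(25.0, 7.5), width=1.0),
    Granule("hypoxic", center=(27.0, 4.0), width=1.0),
]
signal = perceive(fam, (26.5, 4.5))
print(signal)
```

A downstream user interface or adaptive-automation layer could then act on the dominant situation label, while the graded memberships preserve the uncertainty of the assessment. In the paper's method the granule parameters would not be set by hand as above, but fitted to the situation-rich human-input data sets by the particle-swarm adaptation step.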