Abstract
Cross-modal data analytics, in which data can be rendered for experience
through vision, hearing, and touch, poses a fundamental challenge to
designers. Non-linguistic sonification is a well-researched means of
non-visual pattern recognition, but higher-density datasets remain a
challenge. Because human hearing is optimized for localizing sounds on
the horizontal plane, our approach exploits this ability by placing
auditory icons on an immersive binaural horizontal plane. Two case
studies demonstrate our approach: a sonic translation of a map and a
sonic translation of a computational fluid dynamics simulation.
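To make the idea of placing auditory icons on a horizontal plane concrete, the following minimal Python sketch pans a mono signal to a given azimuth using interaural time and level differences. All names, constants, and the Woodworth-style delay model are illustrative assumptions, not details taken from this paper; a full binaural renderer would instead convolve the signal with head-related transfer functions (HRTFs).

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s
    HEAD_RADIUS = 0.0875     # m, approximate human head radius

    def pan_horizontal(mono, sample_rate, azimuth_deg):
        """Place a mono auditory icon on the horizontal plane using
        interaural time and level differences.
        azimuth_deg: 0 = front, +90 = right, -90 = left."""
        az = np.radians(azimuth_deg)
        # Interaural time difference (s), Woodworth approximation.
        itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (az + np.sin(az))
        shift = int(round(abs(itd) * sample_rate))
        # Interaural level difference via constant-power panning.
        gain_l = np.cos((az + np.pi / 2) / 2)
        gain_r = np.sin((az + np.pi / 2) / 2)
        # Delay the ear farther from the source by `shift` samples.
        left = np.pad(mono, (shift if itd > 0 else 0, 0)) * gain_l
        right = np.pad(mono, (shift if itd < 0 else 0, 0)) * gain_r
        n = max(len(left), len(right))
        left = np.pad(left, (0, n - len(left)))
        right = np.pad(right, (0, n - len(right)))
        return np.stack([left, right], axis=1)  # (n, 2) stereo array

For example, pan_horizontal(icon, 44100, -45.0) would place an icon 45 degrees to the listener's left. The ITD/ILD approximation is only a sketch of horizontal-plane localization cues; it does not capture the spectral filtering that HRTF-based rendering provides.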