Show simple item record

dc.contributor.author: Seldess, Zachary
dc.contributor.author: Yamaoka, So
dc.contributor.author: Kuester, Falko
dc.date.accessioned: 2014-05-21T16:44:19Z
dc.date.available: 2014-05-21T16:44:19Z
dc.date.issued: 2011-06
dc.identifier.citation: Proceedings of the 17th International Conference on Auditory Display (ICAD2011), Budapest, Hungary. 20-23 June, 2011. International Community for Auditory Display, 2011.
dc.identifier.uri: http://hdl.handle.net/1853/51766
dc.description: Presented at the 17th International Conference on Auditory Display (ICAD2011), 20-23 June, 2011 in Budapest, Hungary.
dc.description.abstract: We present “Sonnotile”, a multi-modal rendering framework to enhance scientific data exploration, representation, and analysis within tiled-display visualization environments. Sonnotile aims to assist researchers in the customization and embedding of sound objects within their data sets. These sound objects may act as way-finding markers within a media space, as well as allow researchers to attach and recall various sonic descriptions or representations of an arbitrary number of regions within a data set. In designing the software, our initial efforts have been centered on the challenges of sound “annotation” within large-scale pyramidal TIFF files.
dc.language.iso: en_US
dc.subject: Auditory display
dc.subject: Multimodal interface
dc.subject: Sonification
dc.title: Sonnotile: Audio Annotation and Sonification for Large Tiled Audio/Visual Display Environments
dc.type: Proceedings
dc.contributor.corporatename: King Abdullah University of Science and Technology. Visualization Lab
dc.contributor.corporatename: University of California, San Diego. Calit2 Center of Graphics Visualization and Virtual Reality
dc.publisher.original: International Community for Auditory Display
dc.embargo.terms: null

