Collaborative Exploration and Sensemaking of Big Environmental Sound Data
Many ecologists use acoustic monitoring to study animals and the health of ecosystems. Technological advances mean that acoustic recording of nature can now be done at relatively low cost, with minimal disturbance, and over long periods of time. The vast amounts of data gathered yield environmental soundscapes that require new forms of visualization and interpretation. Recently, a novel visualization technique was designed that represents soundscapes as dense visual summaries of acoustic patterns. However, little is known about how this visualization tool can be employed to make sense of soundscapes. Understanding how the technique can best be used and developed requires collaboration between interface designers, algorithm designers, and ecologists. We empirically investigated the practices and needs of ecologists using acoustic monitoring technologies. In particular, we investigated the use of the soundscape visualization tool by teams of ecologists researching endangered species detection, species behaviour, and the monitoring of ecological areas using long-duration audio recordings. Our findings highlight the opportunities and challenges that ecologists face in making sense of large acoustic datasets through patterns of acoustic events. We reveal the characteristic processes by which situated accounts of natural places are collaboratively generated from soundscapes using visualization, and we discuss the biases inherent in the approach. Big data from nature has characteristics different from those of the social and informational data sources that comprise much of the World Wide Web. We conclude with design implications for visual interfaces that facilitate collaborative exploration and discovery through soundscapes.