Interactional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery

dc.contributor.authorO’Hara, Kenton
dc.contributor.authorGonzalez, Gerardo
dc.contributor.authorPenney, Graeme
dc.contributor.authorSellen, Abigail
dc.contributor.authorCorish, Robert
dc.contributor.authorMentis, Helena
dc.contributor.authorVarnavas, Andreas
dc.contributor.authorCriminisi, Antonio
dc.contributor.authorRouncefield, Mark
dc.contributor.authorDastur, Neville
dc.contributor.authorCarrell, Tom
dc.date.accessioned2020-06-06T13:06:47Z
dc.date.available2020-06-06T13:06:47Z
dc.date.issued2014
dc.description.abstractWhile surgical practices are increasingly reliant on a range of digital imaging technologies, the ability for clinicians to interact with and manipulate these digital representations in the operating theatre using traditional touch-based interaction devices is constrained by the need to maintain sterility. To overcome these sterility concerns, a number of researchers have been developing ways of enabling interaction in the operating theatre using touchless techniques, such as gesture and voice, that allow clinicians to control these systems. While there have been important technical strides in the area, there has been little work on understanding how these touchless systems are used in practice. With this in mind, we present a touchless system developed for use during vascular surgery. We deployed the system in the endovascular suite of a large hospital for use in the context of real procedures. We present findings from a study of the system in use, focusing on how, with touchless interaction, the visual resources were embedded and made meaningful in the collaborative practices of surgery. In particular, we discuss the importance of direct and dynamic control of the images by the clinicians in the context of talk and of other artefact use, as well as the work performed by members of the clinical team to make themselves sensable by the system. We discuss the broader implications of these findings for how we think about the design, evaluation and use of these systems.
dc.identifier.doi10.1007/s10606-014-9203-4
dc.identifier.pissn1573-7551
dc.identifier.urihttp://dx.doi.org/10.1007/s10606-014-9203-4
dc.identifier.urihttps://dl.eusset.eu/handle/20.500.12015/3885
dc.publisherSpringer
dc.relation.ispartofComputer Supported Cooperative Work (CSCW): Vol. 23, No. 3
dc.relation.ispartofseriesComputer Supported Cooperative Work (CSCW)
dc.subjectcollaborative practices of surgery
dc.subjectgestural interaction
dc.subjectoperating theatre
dc.subjectsterility
dc.subjecttouchless interaction
dc.subjectwork practice
dc.titleInteractional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery
dc.typeText/Journal Article
gi.citation.endPage337
gi.citation.startPage299
gi.citations.count31
gi.citations.elementElishiah Miller, Zheng Li, Helena Mentis, Adrian Park, Ting Zhu, Nilanjan Banerjee (2020): RadSense: Enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar, In: Smart Health, doi:10.1016/j.smhl.2019.100089
gi.citations.elementAmbreen Zaman, Anup Roy, Kanij Fatema, Nusrat Jahan Farin, Tanja Doring, Rainer Malaka (2019): Explore Voice and Foot-based Interaction Techniques to Navigate 2D Radiological Images in the Virtual Reality Operation Theatre, In: 2019 22nd International Conference on Computer and Information Technology (ICCIT), doi:10.1109/iccit48885.2019.9038175
gi.citations.elementKatja Krug, Ricardo Langner, Konstantin Klamka (2023): Discussing Facets of Hybrid User Interfaces for the Medical Domain, In: 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), doi:10.1109/ismar-adjunct60411.2023.00059
gi.citations.elementBenjamin Hatscher, Christian Hansen (2018): Hand, Foot or Voice, In: Proceedings of the 20th ACM International Conference on Multimodal Interaction, doi:10.1145/3242969.3242971
gi.citations.elementWang Shi Hui, Weihong Huang, Jianzhong Hu, Kun Tao, Yonghong Peng (2020): A New Precise Contactless Medical Image Multimodal Interaction System for Surgical Practice, In: IEEE Access, doi:10.1109/access.2019.2946404
gi.citations.elementDebaleena Chattopadhyay, Francesca Salvadori, Kenton O’Hara, Sean Rintel (2017): Beyond Presentation: Shared Slideware Control as a Resource for Collocated Collaboration, In: Human–Computer Interaction 5-6(33), doi:10.1080/07370024.2017.1388170
gi.citations.elementKenton O’Hara, Abigail Sellen, Juan Wachs (2016): Introduction to Special Issue on Body Tracking and Healthcare, In: Human–Computer Interaction 3-4(31), doi:10.1080/07370024.2016.1151712
gi.citations.elementHelena M. Mentis (2017): Collocated Use of Imaging Systems in Coordinated Surgical Practice, In: Proceedings of the ACM on Human-Computer Interaction CSCW(1), doi:10.1145/3134713
gi.citations.elementAnke Verena Reinschluessel, Joern Teuber, Marc Herrlich, Jeffrey Bissel, Melanie van Eikeren, Johannes Ganser, Felicia Koeller, Fenja Kollasch, Thomas Mildner, Luca Raimondo, Lars Reisig, Marc Ruedel, Danny Thieme, Tobias Vahl, Gabriel Zachmann, Rainer Malaka (2017): Virtual Reality for User-Centered Design and Evaluation of Touch-free Interaction Techniques for Navigating Medical Images in the Operating Room, In: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, doi:10.1145/3027063.3053173
gi.citations.elementKenton O’Hara, Cecily Morrison, Abigail Sellen, Nadia Bianchi-Berthouze, Cathy Craig (2016): Body Tracking in Healthcare, In: Synthesis Lectures on Assistive, Rehabilitative, and Health-Preserving Technologies, doi:10.2200/s00702ed1v01y201602arh009
gi.citations.elementFlorian Heinrich, Kai Bornemann, Laureen Polenz, Kai Lawonn, Christian Hansen (2023): Clutch & Grasp: Activation gestures and grip styles for device-based interaction in medical spatial augmented reality, In: International Journal of Human-Computer Studies, doi:10.1016/j.ijhcs.2023.103117
gi.citations.elementSeán Cronin, Gavin Doherty (2018): Touchless computer interfaces in hospitals: A review, In: Health Informatics Journal 4(25), doi:10.1177/1460458217748342
gi.citations.elementSean Cronin, Euan Freeman, Gavin Doherty (2022): Investigating Clutching Interactions for Touchless Medical Imaging Systems, In: CHI Conference on Human Factors in Computing Systems, doi:10.1145/3491102.3517512
gi.citations.elementFlorian Heinrich, Kai Bornemann, Laureen Polenz, Kai Lawonn, Christian Hansen (2022): Clutch & Grasp: Activation Gestures and Grip Styles for Device-Based Interaction in Medical Spatial Augmented Reality, In: SSRN Electronic Journal, doi:10.2139/ssrn.4163379
gi.citations.elementDuncan Stevenson, Henry Gardner, Wendell Neilson, Edwin Beenen, Sivakumar Gananadha, James Fergusson, Phillip Jeans, Peter Mews, Hari Bandi (2016): Evidence from the surgeons: gesture control of image data displayed during surgery, In: Behaviour & Information Technology 12(35), doi:10.1080/0144929x.2016.1203025
gi.citations.elementKarina Yukari Kimura, Karoline Harummy Romero Moriya, Guilherme Corredato Guerino, Heloise Manica Paris Teixeira (2021): Gesture-based Interaction Systems in Hospital Critical Environment, In: Proceedings of the XX Brazilian Symposium on Human Factors in Computing Systems, doi:10.1145/3472301.3484350
gi.citations.elementHelena M. Mentis, Kenton O'Hara, Gerardo Gonzalez, Abigail Sellen, Robert Corish, Antonio Criminisi, Rikin Trivedi, Pierre Theodore (2015): Voice or Gesture in the Operating Room, In: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, doi:10.1145/2702613.2702963
gi.citations.elementAzin Semsar, Hannah McGowan, Yuanyuan Feng, H. Reza Zahiri, Adrian Park, Andrea Kleinsmith, Helena Mentis (2019): How Trainees Use the Information from Telepointers in Remote Instruction, In: Proceedings of the ACM on Human-Computer Interaction CSCW(3), doi:10.1145/3359195
gi.citations.elementCecily Morrison, Kit Huckvale, Bob Corish, Jonas Dorn, Peter Kontschieder, Kenton O’Hara, ASSESS MS Team, Antonio Criminisi, Abigail Sellen (2016): Assessing Multiple Sclerosis With Kinect: Designing Computer Vision Systems for Real-World Use, In: Human–Computer Interaction 3-4(31), doi:10.1080/07370024.2015.1093421
gi.citations.elementRegina Bernhaupt, Katherine Isbister, Sara de Freitas (2015): Introduction to this Special Issue on HCI and Games, In: Human–Computer Interaction 3-4(30), doi:10.1080/07370024.2015.1016573
gi.citations.elementJosefine Schreiter, Florian Heinrich, Benjamin Hatscher, Danny Schott, Christian Hansen (2024): Multimodal human–computer interaction in interventional radiology and surgery: a systematic literature review, In: International Journal of Computer Assisted Radiology and Surgery 4(20), doi:10.1007/s11548-024-03263-3
gi.citations.elementTorgeir K. Haavik (2015): Keep your coats on: augmented reality and sensework in surgery and surgical telemedicine, In: Cognition, Technology & Work 1(18), doi:10.1007/s10111-015-0353-z
gi.citations.elementCecily Morrison, Marcus D'Souza, Kit Huckvale, Jonas F Dorn, Jessica Burggraaff, Christian Philipp Kamm, Saskia Marie Steinheimer, Peter Kontschieder, Antonio Criminisi, Bernard Uitdehaag, Frank Dahlke, Ludwig Kappos, Abigail Sellen (2015): Usability and Acceptability of ASSESS MS: Assessment of Motor Dysfunction in Multiple Sclerosis Using Depth-Sensing Computer Vision, In: JMIR Human Factors 1(2), doi:10.2196/humanfactors.4129
gi.citations.elementDominic Furniss, Aisling Ann O’Kane, Rebecca Randell, Aisling Ann O’Kane, Svetlena Taneva, Helena Mentis, Ann Blandford (2015): Fieldwork for Healthcare, In: Synthesis Lectures on Assistive, Rehabilitative, and Health-Preserving Technologies, doi:10.2200/s00606ed1v02y201410arh007
gi.citations.elementNikola Nestorov, Peter Hughes, Nuala Healy, Niall Sheehy, Neil O'Hare (2016): Application of Natural User Interface Devices for Touch-Free Control of Radiological Images During Surgery, In: 2016 IEEE 29th International Symposium on Computer-Based Medical Systems (CBMS), doi:10.1109/cbms.2016.20
gi.citations.elementYuanyuan Feng, Hannah McGowan, Azin Semsar, Hamid R. Zahiri, Ivan M. George, Timothy Turner, Adrian Park, Andrea Kleinsmith, Helena M. Mentis (2018): A virtual pointer to support the adoption of professional vision in laparoscopic training, In: International Journal of Computer Assisted Radiology and Surgery 9(13), doi:10.1007/s11548-018-1792-9
gi.citations.elementSheena Visram, Laura Potts, Neil J Sebire, Yvonne Rogers, Emma Broughton, Linda Chigaru, Pratheeban Nambyiah (2021): Making the invisible visible: New perspectives on the intersection of human-environment interactions of clinical teams in intensive care, doi:10.1101/2021.05.10.21256688
gi.citations.elementFernando Alvarez-Lopez, Marcelo Fabián Maina, Francesc Saigí-Rubió (2019): Use of Commercial Off-The-Shelf Devices for the Detection of Manual Gestures in Surgery: Systematic Literature Review, In: Journal of Medical Internet Research 5(21), doi:10.2196/11925
gi.citations.elementAleksandra Sarcevic, Ivan Marsic, Randall S. Burd (2018): Dashboard Design for Improved Team Situation Awareness in Time-Critical Medical Work, In: Designing Healthcare That Works, doi:10.1016/b978-0-12-812583-0.00007-9
gi.citations.elementStuart Reeves, Martin Porcheron, Joel E. Fischer, Heloisa Candello, Donald McMillan, Moira McGregor, Robert J. Moore, Rein Sikveland, Alex S. Taylor, Julia Velkovska, Moustafa Zouinar (2018): Voice-based Conversational UX Studies and Design, In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, doi:10.1145/3170427.3170619
gi.citations.elementTudor Ilies, Nicholas Camic, Aditya Tadinada (2023): Ease of use in accessing electronic dental records with a touchless interface compared with a conventional mouse, In: JADA Foundational Science, doi:10.1016/j.jfscie.2023.100024