A Comparison of Usage Evaluation and Inspection Methods for Assessing Groupware Usability

Publisher

Association for Computing Machinery

Abstract

Many researchers believe that groupware can only be evaluated by studying real collaborators in their real contexts, a process that tends to be expensive and time-consuming. Others believe that it is more practical to evaluate groupware through usability inspection methods. Deciding between these two approaches is difficult, because it is unclear how they compare in a real evaluation situation. To address this problem, we carried out a dual evaluation of a groupware system, with one evaluation applying user-based techniques, and the other using inspection methods. We compared the results from the two evaluations and concluded that, while the two methods have their own strengths, weaknesses, and trade-offs, they are complementary. Because the two methods found overlapping problems, we expect that they can be used in tandem to good effect, e.g., applying the discount method prior to a field study, with the expectation that the system deployed in the more expensive field study has a better chance of doing well because some pertinent usability problems will have already been addressed.

Description

Steves, Michelle Potts; Morse, Emile; Gutwin, Carl; Greenberg, Saul (2001): A Comparison of Usage Evaluation and Inspection Methods for Assessing Groupware Usability. In: Proceedings of the 2001 ACM International Conference on Supporting Group Work, Boulder, Colorado, USA. New York, NY, USA: Association for Computing Machinery. pp. 125–134. DOI: 10.1145/500286.500306

Keywords

usage evaluation techniques, groupware usability, inspection evaluation techniques, evaluation


Number of citations to item: 33

  • Carmelo Ardito, Rosa Lanzilotti, Marcin Sikorski, Igor Garnik (2014): Can Evaluation Patterns Enable End Users to Evaluate the Quality of an e-learning System? An Exploratory Study, In: Lecture Notes in Computer Science, doi:10.1007/978-3-319-07440-5_18
  • C.T. Blake, L. Rapanotti (2004): Usability evaluation of distributed groupware in distance learning, In: Proceedings of the Fifth International Conference on Information Technology Based Higher Education and Training (ITHET 2004), doi:10.1109/ithet.2004.1358224
  • Steven R. Haynes, Sandeep Purao, Amie L. Skattebo (2009): Scenario-Based Methods for Evaluating Collaborative Systems, In: Computer Supported Cooperative Work (CSCW) 4(18), doi:10.1007/s10606-009-9095-x
  • (2010): Bibliography, In: Interacting with Geospatial Technologies, doi:10.1002/9780470689813.biblio
  • Paolo Davoli, Matteo Monari, Kerstin Severinson Eklundh (2008): Peer activities on Web-learning platforms—Impact on collaborative writing and usability issues, In: Education and Information Technologies 3(14), doi:10.1007/s10639-008-9080-x
  • Gregorio Convertino, Dennis C. Neale, Laurian Hobby, John M. Carroll, Mary Beth Rosson (2004): A laboratory method for studying activity awareness, In: Proceedings of the third Nordic conference on Human-computer interaction, doi:10.1145/1028014.1028063
  • Jacques Wainer, Claudia Barsottini (2007): Empirical research in CSCW — a review of the ACM/CSCW conferences from 1998 to 2004, In: Journal of the Brazilian Computer Society 3(13), doi:10.1007/bf03192543
  • Pedro Antunes, Marcos R. S. Borges, Jose A. Pino, Luis Carriço (2006): Analytic Evaluation of Groupware Design, In: Lecture Notes in Computer Science, doi:10.1007/11686699_4
  • Kevin Baker, Saul Greenberg, Carl Gutwin (2002): Empirical development of a heuristic evaluation methodology for shared workspace groupware, In: Proceedings of the 2002 ACM conference on Computer supported cooperative work, doi:10.1145/587078.587093
  • Matthias Krauß, Kai Riege, Marcus Winter, Lyn Pemberton (2009): Remote Hands-On Experience: Distributed Collaboration with Augmented Reality, In: Lecture Notes in Computer Science, doi:10.1007/978-3-642-04636-0_22
  • Dalma Geszten, Balázs Péter Hámornik, Károly Hercegfi (2024): Team usability testing: development and validation of a groupware usability evaluation method, In: Cognition, Technology & Work 3(26), doi:10.1007/s10111-024-00759-5
  • Lu Liang, Yong Tang, Na Tang (2006): Determinants of Groupware Usability for Community Care Collaboration, In: Lecture Notes in Computer Science, doi:10.1007/11610113_45
  • Anders Bruun, Peter Gull, Lene Hofmeister, Jan Stage (2009): Let your users do the testing, In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, doi:10.1145/1518701.1518948
  • Rahat Iqbal, Janienke Sturm, Olga Kulyk, Jimmy Wang, Jacques Terken (2005): User-centred design and evaluation of ubiquitous services, In: Proceedings of the 23rd annual international conference on Design of communication: documenting & designing for pervasive information, doi:10.1145/1085313.1085346
  • David Pinelle, Carl Gutwin (2007): Evaluating teamwork support in tabletop groupware applications using collaboration usability analysis, In: Personal and Ubiquitous Computing 3(12), doi:10.1007/s00779-007-0145-4
  • Kristin Dew, Anne M. Turner, Loma Desai, Nathalie Martin, Katrin Kirchhoff (2015): Evaluating Groupware Prototypes with Discount Methods, In: Proceedings of the 18th ACM Conference Companion on Computer Supported Cooperative Work & Social Computing, doi:10.1145/2685553.2699002
  • Minseok Lee, Jungpil Hahn, Gavriel Salvendy (2010): Developing and validating a methodology for discount usability evaluation of collaboration technologies, In: Theoretical Issues in Ergonomics Science 3(11), doi:10.1080/14639220802613434
  • James Tam, Saul Greenberg (2004): A Framework for Asynchronous Change Awareness in Collaboratively-Constructed Documents, In: Lecture Notes in Computer Science, doi:10.1007/978-3-540-30112-7_7
  • Natalia Sales Santos, Lidia Silva Ferreira, Raquel Oliveira Prates (2012): An Overview of Evaluation Methods for Collaborative Systems, In: 2012 Brazilian Symposium on Collaborative Systems, doi:10.1109/sbsc.2012.29
  • Claudio Sapateiro, Nelson Baloian, Pedro Antunes, Gustavo Zurita (2009): Developing collaborative peer-to-peer applications on mobile devices, In: 2009 13th International Conference on Computer Supported Cooperative Work in Design, doi:10.1109/cscwd.2009.4968091
  • Pedro Antunes, Valeria Herskovic, Sergio F. Ochoa, Jose A. Pino (2012): Structuring dimensions for collaborative systems evaluation, In: ACM Computing Surveys 2(44), doi:10.1145/2089125.2089128
  • Pedro Antunes, Claudio Sapateiro, Gustavo Zurita, Nelson Baloian (2010): Development of a Mobile Situation Awareness Tool Supporting Disaster Recovery of Business Operations, In: Annals of Information Systems, doi:10.1007/978-1-4419-7406-8_17
  • David Pinelle, Carl Gutwin (2002): Groupware walkthrough, In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, doi:10.1145/503376.503458
  • Purvi Saraiya, Chris North, Karen Duca (2010): Comparing benchmark task and insight evaluation methods on timeseries graph visualizations, In: Proceedings of the 3rd BELIV'10 Workshop: BEyond time and errors: novel evaLuation methods for Information Visualization, doi:10.1145/2110192.2110201
  • M.P. Steves, E. Morse (2001): Looking at the whole picture: a case study of analyzing a virtual workplace, In: Proceedings Tenth IEEE International Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE 2001), doi:10.1109/enabl.2001.953396
  • Soraia Reis, Raquel Prates (2012): An initial analysis of communicability evaluation methods through a case study, In: CHI '12 Extended Abstracts on Human Factors in Computing Systems, doi:10.1145/2212776.2223845
  • Steven R. Haynes, Sandeep Purao, Amie L. Skattebo (2004): Situating evaluation in scenarios of use, In: Proceedings of the 2004 ACM conference on Computer supported cooperative work, doi:10.1145/1031607.1031624
  • Zheng (Eric) Chang, Songnian Li (2012): Geo‐Social Model: A Conceptual Framework for Real‐time Geocollaboration, In: Transactions in GIS 2(17), doi:10.1111/j.1467-9671.2012.01352.x
  • Raquel Oliveira Prates, Alberto Barbosa Raposo (2006): Desafios para testes de usuários em sistemas colaborativos - lições de um estudo de caso [Challenges for user testing in collaborative systems: lessons from a case study], In: Proceedings of VII Brazilian symposium on Human factors in computing systems, doi:10.1145/1298023.1298051
  • Cláudio Miguel Sapateiro, Sérgio Grosso (2010): Capturing Distributed Contributions to an Informal Work Process, In: Handbook of Research on Developments in E-Health and Telemedicine, doi:10.4018/978-1-61520-670-4.ch054
  • Chris North, Purvi Saraiya, Karen Duca (2011): A comparison of benchmark task and insight evaluation methods for information visualization, In: Information Visualization 3(10), doi:10.1177/1473871611415989
  • R. Lanzilotti, C. Ardito, M.F. Costabile, A. De Angeli (2011): Do patterns help novice evaluators? A comparative study, In: International Journal of Human-Computer Studies 1-2(69), doi:10.1016/j.ijhcs.2010.07.005
  • Dalma Geszten, Balázs Péter Hámornik, Károly Hercegfi (2020): Empirical study of Team Usability Testing: a laboratory experiment, In: Cognition, Technology & Work 4(23), doi:10.1007/s10111-020-00647-8
Please note: Providing information about citations is only possible thanks to the open metadata APIs provided by crossref.org and opencitations.net. These lists may be incomplete due to unavailable citation data. Source: opencitations.net, crossref.org