Soft peer review. Social software and distributed scientific evaluation

Document type

Text

Additional Information

Date

2008

Publisher

ACM Press

Abstract

The debate on the prospects of peer review in the Internet age and the increasing criticism leveled against the dominant role of impact factor indicators are calling for new measurable criteria to assess scientific quality. Usage-based metrics offer a new avenue to scientific quality assessment but face the same risks as first-generation search engines that used unreliable metrics (such as raw traffic data) to estimate content quality. In this article I analyze the contribution that social bookmarking systems can provide to the problem of usage-based metrics for scientific evaluation. I suggest that collaboratively aggregated metadata may help fill the gap between traditional citation-based criteria and raw usage factors. I submit that bottom-up, distributed evaluation models such as those afforded by social bookmarking will challenge more traditional quality assessment models in terms of coverage, efficiency, and scalability. Services aggregating user-related quality indicators for online scientific content will come to occupy a key function in the scholarly communication system.
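
As an illustration of the kind of aggregation the abstract alludes to, the sketch below is a minimal, hypothetical example (not drawn from the paper) of how per-article bookmark counts and user-supplied tags from a social bookmarking service might be blended with citation counts into a single composite indicator. The record structure, function names, and weights are all assumptions introduced for illustration only.

    from collections import Counter
    from dataclasses import dataclass, field

    # Hypothetical record of how one article appears in a social bookmarking
    # service: who bookmarked it and which tags they attached.
    @dataclass
    class ArticleRecord:
        doi: str
        citations: int                                 # traditional citation count
        bookmarks: list = field(default_factory=list)  # list of (user, [tags])

    def composite_score(article: ArticleRecord,
                        w_citations: float = 0.5,
                        w_bookmarks: float = 0.3,
                        w_tag_consensus: float = 0.2) -> float:
        """Blend citation-based and usage-based signals into one indicator.

        The weights are arbitrary placeholders, not values proposed in the paper.
        """
        n_bookmarks = len(article.bookmarks)
        # Tag consensus: fraction of bookmarking users who agree on the most
        # common tag -- a crude proxy for collaboratively aggregated metadata.
        all_tags = Counter(tag for _, tags in article.bookmarks for tag in tags)
        consensus = (all_tags.most_common(1)[0][1] / n_bookmarks) if all_tags else 0.0
        return (w_citations * article.citations
                + w_bookmarks * n_bookmarks
                + w_tag_consensus * consensus)

    # Example usage with made-up data.
    record = ArticleRecord(
        doi="10.1000/example",
        citations=12,
        bookmarks=[("alice", ["peer-review", "metrics"]),
                   ("bob", ["metrics"]),
                   ("carol", ["metrics", "web2.0"])],
    )
    print(round(composite_score(record), 3))

Any real service would of course need to weight such signals against gaming and sampling bias, which is precisely the risk the abstract raises for raw usage data.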

Description

Taraborelli, Dario (2008): Soft peer review. Social software and distributed scientific evaluation. In: From CSCW to Web 2.0: European Developments in Collaborative Design. Selected Papers from COOP08 (Full Papers), Carry-le-Rouet, France, 2008-05-20. ACM Press, pp. 99-110.
