Panel discussion does not improve reliability of peer review for medical research grant proposals

Research output: Contribution to journal › Review Article › Scientific › peer-review

Details

Original language: English
Pages (from-to): 47-52
Number of pages: 6
Journal: Journal of Clinical Epidemiology
Volume: 65
Issue number: 1
DOIs
Publication status: Published - Jan 2012
Publication type: A2 Review article in a scientific journal

Abstract

Objective: Peer review is the gold standard for evaluating scientific quality. Compared with studies on inter-reviewer variability, research on panel evaluation is scarce. To appraise the reliability of panel evaluations in grant review, we compared the scores given by two expert panels reviewing the same grant proposals. Our main interest was whether panel discussion improves reliability.

Methods: Thirty reviewers were randomly allocated to one of two panels. Sixty-five grant proposals in the fields of clinical medicine and epidemiology were reviewed by both panels. Each reviewer received 5-12 proposals, and each proposal was evaluated by two reviewers using a six-point scale. The reliability of reviewer and panel scores was evaluated using Cohen's kappa with linear weighting. Reliability was also evaluated for panel mean scores (the mean of the two reviewer scores was used as the panel score).

Results: The proportion of large differences (at least two points) was 40% for reviewers in panel A, 36% for reviewers in panel B, 26% for the panel discussion scores, and 14% when the means of the two reviewer scores were used. The kappa for the panel score after discussion was 0.23 (95% confidence interval: 0.08, 0.39). Using the mean of the reviewer scores, the interpanel coefficient was likewise 0.23 (0.00, 0.46).

Conclusion: Reliability between panel scores was higher than between individual reviewer scores. The similar interpanel reliability obtained with the final panel score and with the mean of reviewer scores indicates that panel discussion per se did not improve the reliability of the evaluation.
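The reliability statistic named in the abstract, Cohen's kappa with linear weighting, can be illustrated with a minimal sketch. This is not the authors' code; the function name linear_weighted_kappa and the example score vectors are hypothetical, and only the six-point scale is taken from the abstract.

    # Minimal sketch of Cohen's kappa with linear weights for two raters
    # scoring the same items on a six-point scale (assumed integer scores 1-6).
    import numpy as np

    def linear_weighted_kappa(a, b, n_categories=6):
        """Linearly weighted Cohen's kappa for two raters' integer scores."""
        a = np.asarray(a) - 1          # shift scores to 0-based category indices
        b = np.asarray(b) - 1
        k = n_categories

        # Observed joint distribution of the two raters' scores (proportions).
        observed = np.zeros((k, k))
        for i, j in zip(a, b):
            observed[i, j] += 1
        observed /= len(a)

        # Expected joint distribution under independence of the raters,
        # built from the marginal score distributions.
        expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))

        # Linear disagreement weights: |i - j| / (k - 1), zero on the diagonal.
        idx = np.arange(k)
        weights = np.abs(idx[:, None] - idx[None, :]) / (k - 1)

        # Weighted kappa: 1 minus the ratio of observed to expected
        # weighted disagreement.
        return 1.0 - (weights * observed).sum() / (weights * expected).sum()

    # Hypothetical scores from two panels for the same ten proposals.
    panel_a = [4, 5, 2, 6, 3, 4, 1, 5, 3, 2]
    panel_b = [5, 5, 3, 4, 3, 5, 2, 4, 2, 2]
    print(round(linear_weighted_kappa(panel_a, panel_b), 2))

With linear weights, near-misses (e.g., a 4 versus a 5) count as partial agreement rather than full disagreement, which suits ordinal rating scales such as the six-point scale used here.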

Keywords

  • Consistency, Funding, Inter-reviewer reliability, Interpanel reliability, Peer review, Quality assurance