
Inter-rater reliability in systematic review methodology: exploring variation in coder decision-making

Belur, J; Tompson, L; Thornton, A; Simon, M; (2019) Inter-rater reliability in systematic review methodology: exploring variation in coder decision-making. Sociological Methods and Research (In press). Green open access

Text: Accepted version of paper July 2018.pdf - Accepted Version (125kB)
Available under licence: see the attached licence file.

Abstract

A methodologically sound systematic review is characterized by transparency, replicability and clear inclusion criteria. However, little attention has been paid to reporting the details of inter-rater reliability (IRR) when multiple coders are used to make decisions at various points in the screening and data extraction stages of a study. Prior research has noted the paucity of information on IRR, including how many coders were involved, at what stages IRR tests were conducted and how, and how disagreements were resolved. This paper examines and reflects on the human factors that affect decision-making in systematic reviews by reporting on three IRR tests, conducted at three different points in the screening process, for two distinct reviews. Results of the two studies are discussed in the context of inter-rater and intra-rater reliability in terms of the accuracy, precision and reliability of the coding behaviour of multiple coders. Findings indicated that coding behaviour changes both between and within individuals over time, emphasising the importance of conducting regular and systematic inter- and intra-rater reliability tests, especially when multiple coders are involved, to ensure consistency and clarity at the screening and coding stages. Implications for good practice when screening and coding for systematic reviews are discussed.
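As a concrete illustration of the kappa statistic named in the keywords below, the following minimal Python sketch computes Cohen's kappa, a chance-corrected agreement measure, for two coders' include/exclude screening decisions. The coder data and function are hypothetical and illustrative; they are not drawn from the paper's studies.

    # Minimal sketch: Cohen's kappa for two coders' screening decisions.
    # The data below are hypothetical, not taken from the paper.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: chance-corrected agreement between two raters."""
        n = len(rater_a)
        # Observed proportion of items on which the two coders agree
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected agreement by chance, from each coder's label frequencies
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        expected = sum(counts_a[label] * counts_b[label]
                       for label in set(counts_a) | set(counts_b)) / n ** 2
        return (observed - expected) / (1 - expected)

    # Hypothetical screening decisions for ten abstracts (1 = include, 0 = exclude)
    coder_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
    coder_2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
    print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # kappa = 0.58

On the widely used Landis and Koch benchmarks, a kappa between 0.41 and 0.60 is read as only moderate agreement; a result like this is the kind that would prompt the recoding, discussion and repeated IRR testing the abstract recommends.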

Type: Article
Title: Inter-rater reliability in systematic review methodology: exploring variation in coder decision-making
Open access status: An open access version is available from UCL Discovery
Publisher version: http://journals.sagepub.com/toc/smr/0/0
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
Keywords: Inter-rater reliability, systematic review, screening, coding, decision-making, kappa statistic
UCL classification: UCL
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Security and Crime Science
URI: https://discovery-pp.ucl.ac.uk/id/eprint/10052482