Background: Medical practitioners have unmet information needs. Health care research dissemination suffers from both "supply" and "demand" problems. One possible solution is to develop methodologic search filters ("hedges") to improve the retrieval of clinically relevant and scientifically sound study reports from bibliographic databases. To develop and test such filters, a hand search of the literature was required to determine directly which articles should be retrieved and which should not. Objective: To determine the extent to which 6 research associates can agree on the classification of articles according to explicit research criteria when hand searching the literature.
Design: Blinded, inter-rater reliability study.
Setting: Health Information Research Unit, McMaster University, Hamilton, Ontario, Canada.
Participants: 6 research associates with extensive training and experience in research methods for health care research.
Main outcome measure: Inter-rater reliability measured using the kappa statistic for multiple raters.
Results: After one year of intensive calibration exercises, research staff were able to attain a kappa of at least 0.80 (chance-corrected agreement at least 80% of the way from chance to perfect agreement) for all classes of articles.
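The kappa statistic for multiple raters compares observed agreement with the agreement expected by chance. A minimal sketch of Fleiss' kappa, a standard chance-corrected agreement measure for more than two raters (the article counts below are invented for illustration, not the study's data):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for multiple raters.

    counts[i][j] = number of raters who assigned article i to category j.
    Every row must sum to the same number of raters n.
    """
    N = len(counts)        # number of articles rated
    n = sum(counts[0])     # raters per article
    k = len(counts[0])     # number of categories

    # Mean per-article agreement: proportion of agreeing rater pairs
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement from marginal category proportions
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)


# Hypothetical example: 6 raters classifying 4 articles as
# "pass" or "fail" against explicit methodologic criteria.
ratings = [
    [6, 0],  # all 6 raters: pass
    [5, 1],  # 5 pass, 1 fail
    [0, 6],  # all 6 raters: fail
    [6, 0],
]
print(round(fleiss_kappa(ratings), 3))
```

A kappa of 0 indicates agreement no better than chance, and 1 indicates perfect agreement; values of 0.80 and above are conventionally read as excellent agreement.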
Conclusion: With extensive training, multiple raters can attain a high level of agreement when classifying articles in a hand search of the literature.