Clinician-selected Electronic Information Resources do not Guarantee Accuracy in Answering Primary Care Physicians' Information Needs
Abstract
A review of:
McKibbon, K. Ann, and Douglas B. Fridsma. “Effectiveness of Clinician-selected Electronic Information Resources for Answering Primary Care Physicians’ Information Needs.” Journal of the American Medical Informatics Association 13.6 (2006): 653-9.
Objective – To determine if electronic information resources selected by primary care physicians improve their ability to answer simulated clinical questions.
Design – An observational study utilizing hour-long interviews and think-aloud protocols.
Setting – The offices and clinics of primary care physicians in Canada and the United States.
Subjects – 25 primary care physicians, of whom 4 were women, 17 were from Canada, 22 were family physicians, and 24 were board certified.
Methods – Participants provided responses to 23 multiple-choice questions. Each physician then chose two of these questions and searched for the answers using information resources of their own choosing. The search processes, resources chosen and search times were recorded. These data were analyzed along with the accuracy of the answers and the physicians' pre-search certainty about the answer to each clinical question.
Main results – Twenty-three physicians sought answers to 46 simulated clinical questions. Using only electronic information resources, physicians spent a mean of 13.0 (SD 5.5) minutes searching for answers, averaging 7.3 (SD 4.0) minutes for the first question and 5.8 (SD 2.2) minutes for the second. On average, 1.8 resources were used per question. Resources that summarize information, such as the Cochrane Database of Systematic Reviews, UpToDate and Clinical Evidence, were favored 39.2% of the time, MEDLINE (Ovid and PubMed) 35.7%, and Internet resources including Google 22.6%. Almost 50% of the search and retrieval strategies were keyword-based, while MeSH, subheadings and limits were used less frequently. Before searching, physicians answered an average of 10 of the 23 questions (43.5%) accurately. For the questions searched using clinician-selected electronic resources, 18 of the 46 answers (39.1%) were accurate before searching, and 19 (41.3%) were accurate after searching. This net gain of one correct answer reflected 5 answers (10.9%) changing from correct to incorrect and 6 answers (13.0%) changing from incorrect to correct. The ability to provide correct answers differed among the various resources: Google and Cochrane yielded correct answers about 50% of the time, while PubMed, Ovid MEDLINE, UpToDate, Ovid Evidence Based Medicine Reviews and InfoPOEMs were more likely to be associated with incorrect answers. Physicians also seemed unable to determine when they needed to search for information in order to make an accurate decision.
Conclusion – Clinician-selected electronic information resources did not guarantee accurate answers to simulated clinical questions. At times the use of these resources led physicians to change self-determined correct answers to incorrect ones. The authors attribute this possibly to factors such as poor choice of resources, ineffective search strategies, time constraints and automation bias. Library and information practitioners have an important role to play in identifying and advocating for appropriate information resources to be integrated into the electronic medical record systems provided by health care institutions, so as to support evidence-based health care delivery.