Electronic Journal of Academic and Special Librarianship

v.8 no.2 (Summer 2007)


Evidence-Based Anything: Priorities for Librarians

T. P. Hutchinson
Centre for Automotive Safety Research, University of Adelaide, Australia
paul@casr.adelaide.edu.au

A. J. Meier
Library, ARRB Group Ltd, Victoria, Australia

Abstract

This paper discusses the key ideas that librarians need to know about the movement for “evidence-based” policy and practice. The most important is the methodological quality of research, particularly the importance of randomized controlled experimentation in estimating the effectiveness of interventions. In the early stages of the spread of evidence-based ideas into a new area, librarians will have clients who have a limited appreciation of these ideas, and who may not be clear about the different interpretations of the phrase (e.g., specifically referring to the pre-eminence of randomization, versus a general wish for some empirical data). It may also be the case that randomized experiments are largely absent, that those which have been conducted are not locatable because of inadequate indexing, and that there is no consensus on what weight to give to “low quality” research. Librarians in other disciplines will probably seek to learn from those in medicine; there are existing courses for librarians on evidence-based methods in medicine that could serve as a basis.

Introduction

We know that a treatment works if the data come from a randomized experiment. We know that it doesn’t work if the data come from a randomized experiment. We don’t know much at all if the data don’t come from a randomized experiment.

Exaggerating a little, that is the message from the proponents of “evidence-based” medicine. It is a message that is familiar to medical and health librarians, but largely novel to other librarians. But the concepts behind “evidence-based” medicine are spreading into other disciplines, and are being promoted to the public. Taken seriously, they imply major changes in the way that research is conducted and later used in making decisions. This has important implications for libraries, whether the majority of their clients are researchers or policy-makers or the public.

In the early stages of the spread of evidence-based ideas into a discipline, librarians will have clients who have only a limited appreciation of these ideas: they will have heard of the concept of high quality evidence and may think that finding it is a simple matter. It may not be quite fair to the librarian for the client to turn up with their thoughts incompletely formulated. If this happens, there are a few key ideas that will enable librarians to help them (and to help them help themselves). In the present paper, we first sketch why the methodological quality of research --- that is, exactly how it was done --- has become such a concern, and then turn to the implications for librarians and information scientists.

Importance of research methods in evidence-based disciplines

A currently popular phrase in medicine and some other disciplines is “evidence-based”. Its most distinctive features are systematic reviewing of previous research on a topic, giving much greater weight to studies that adhere to high methodological standards than to those that do not, and attempting to conduct new research to high standards, especially preferring randomized trials over observational studies (because bias is so common in non-randomized studies). Meta-analysis refers to an extensive search for relevant previous research, making full use of computerised databases and software to interrogate them, and attempting to locate “grey” literature and unpublished studies; the results found are then averaged in some numerical fashion rather than reviewed narratively. High methodological standards are advocated in the conduct of research, including obtaining a large enough sample size to answer the question of interest, randomized allocation of the experimental units (in medicine, usually patients) to treatment or control groups, and masking of both the units and the experimenter to the allocation. Other features of evidence-based medicine include the practitioner keeping up to date with research and considering the individual patient in the light of that research, and a degree of attention to quantitative methods in diagnosis and decision-making, but randomized experimentation and meta-analysis are especially prominent. The emphasis, then, is on objective characteristics of how the research was conducted. This corresponds to the methods of evaluating evidence discussed at pp. 184-185 of Clyde (2006). (Clyde also discusses --- skeptically! --- evaluating research by evaluating the journal it is published in, and by collecting the opinions of experts.)
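The phrase “averaged in some numerical fashion” can be made concrete with a small example. The sketch below is our own minimal illustration, assuming the simplest fixed-effect, inverse-variance weighting of per-study results; it is not taken from any of the works cited, and real meta-analyses involve considerably more (random-effects models, heterogeneity statistics, checks for publication bias).

# A minimal sketch (our own illustration, not from the works cited) of the
# kind of numerical averaging used in meta-analysis, assuming a simple
# fixed-effect, inverse-variance weighted mean of per-study effect estimates.

import math

def pooled_effect(estimates, std_errors):
    """Weight each study by the inverse of its variance, so that large,
    precise studies count for more than small, imprecise ones."""
    weights = [1.0 / se ** 2 for se in std_errors]
    total = sum(weights)
    pooled = sum(w * y for w, y in zip(weights, estimates)) / total
    return pooled, math.sqrt(1.0 / total)  # pooled estimate and its standard error

# Hypothetical effect estimates (say, log odds ratios) from three trials.
print(pooled_effect([-0.40, -0.10, -0.25], [0.20, 0.15, 0.30]))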

The Cochrane Collaboration (http://www.cochrane.org/index.htm) is an international organisation under whose guidance many meta-analyses and systematic reviews in medicine and health are performed. The Campbell Collaboration (http://www.campbellcollaboration.org) is a similar body in the social sciences. The justification is expressed by Fitz-Gibbon (2004) as follows: “Many examples illustrate that guessing and good intentions are not a basis for effective action.... we must check our theories and hypotheses”. Writings by advocates of randomized experiments and meta-analysis (e.g., Chalmers, 2003) emphasize humility: there are many instances in medicine, criminology, education, social welfare, and so on where no one --- not the experts, the administrators, the politicians, the pressure groups --- really knows what the best course of action is. Plausibility, recommendation by experts, and empirical support from studies of low methodological quality are not enough.

Evidence-based librarianship itself has achieved some prominence recently; see, for example, Lerdal (2006), which includes an annotated bibliography, and there has been a series of conferences (http://conferences.alia.org.au/ebl2005/index.html, and see Missingham, 2005). The key idea is to raise the standard of library research, especially via randomized experiments, of which some have already been conducted (e.g., that of Cheng, 2003, on the effectiveness of information training provided to clinicians). In the short term, though, librarians seem more likely to need an understanding of the concept when a client refers to it than for the purposes of library research itself. Our own field of road safety, being so close to epidemiology and preventive medicine, is likely to be strongly influenced by evidence-based ideas in the near future (Hutchinson and Meier, 2004). Public works research may be influenced both from that direction and from the direction of engineering laboratory experimentation (Hutchinson and Meier, 2005). Lerdal (2006) also noted that evidence-based methods may become influential in law as well as in law librarianship.

Weak and strong meanings of “evidence-based”

“Evidence-based” is already a catchphrase in political and managerial contexts. Researchers in medicine and social welfare take the ideas very seriously. Thus the incursion of these ideas into other fields, when it comes, may be very rapid. A librarian who hears the phrase one day may be expected to be an expert on it the next. The client may be hardly any more knowledgeable, and the first task of the librarian may be to advise on different interpretations that are given to this phrase. In particular, is the client’s intent to use the phrase weakly or strongly?

For example, we found it striking that the conference presentation by Head (2006), though having the title of “Evidence-based policy”, makes no reference to the technicalities of methodology necessary for the evidence to be of high quality. The same goes for “Evidence-based planning” (Davoudi, 2006).

In fields where the phrase is new, the client may not be clear about the distinction, and the librarian may need to do some education. The more sophisticated clients will themselves vary in the relative emphasis they give to “high quality” versus “best available” evidence: on many topics, the best available evidence may be of quite low quality. The librarian also needs to be cautious, and should not assume that a document with a phrase like “evidence-based” in its title, published by a respectable source, is the last word on a topic: the phrase may have been used in the weak sense, or with a low threshold of quality for inclusion, and the document may represent one opinion among a wide variety strongly held by different experts.

The meaning of the phrase “evidence-based” needs to be sorted out, because it has enormous implications for the librarian’s work. If the meaning is the relatively strong one used in medicine, health, and social welfare, the implications will include the following.

In addition, for many questions in many fields, it will be found at the end that there is no, or very little, evidence-based knowledge in the sense of randomized experiments (or so we suspect). A plan needs to be in place for this situation. Will the client like this answer? How can a librarian prove a negative? Is it proper to base a meta-analysis or systematic review on controlled but non-randomized studies because those are the best that were done? The differences from conventional information search may lie not so much in an emphasis on randomized experiments (which may not exist), but in being very thorough and systematic about the search and documenting it carefully, in paying attention to grades of quality of evidence, even when these are different shades of poor quality, and in educating the client about the advantages and limitations of true experiments. Moody (2003) attempts to apply the medical model to information systems practice; to our eyes, he glosses over the differences that result from the lack of randomized experiments in the latter field. In librarianship itself, Booth (2006) makes the point that “systematic reviews expend extensive resources and should be pursued only if there is a reasonable expectation of furnishing some ‘answer’ and hence of achieving ‘closure’.... There is a very strong likelihood that we will continue to witness the production of well-conducted, well-written systematic reviews where the bottom line is that there is no bottom line”.

Implications for librarians

Librarians practising in medicine and health are familiar with researchers being interested in the methodology of interventions (Fowler, 2000). Indeed, they may have a much wider role than is traditional, including appraising evidence as well as locating it (Rader and Gagnon, 2000; Sladek, 2000; Cheng, 2001; Lappa, 2004). Our view is that in the near future, librarians in other disciplines will need to learn from them. Courses for librarians in this area already exist (Palmer, 2000; Wathen and Leckie, 2005; and see http://sils.unc.edu/programs/courses/special_topics.html for the School of Information and Library Science, University of North Carolina at Chapel Hill). These would not be perfect for those faced with evidence-based ideas coming into new fields, but they would be a good base. Another area where evidence-based ideas are influential is social welfare. As part of the What Works for Children project, an “implementation officer” was employed to work with practitioners to identify where research findings might be helpful, and to facilitate access to the research literature and understanding of it (Stevens et al., 2005; Liabo, 2005). The person concerned seems to have been more of a researcher than anything else, but the role could well have been filled by a librarian.

The importance of the indexing and retrieval of past research may mean that some librarians will be qualified to take the lead in making the ideas accessible to the particular discipline that they are serving. But though librarians are well placed to take the lead in extending the appreciation of evidence-based methods to new fields, it is likely that only a minority will wish to be proactive in this way. Many will need to be able to react to, and interact with, a client. Thus it is important that the librarian is clear about the strengths of different research methods. And about the weaknesses, too --- as well as practical difficulties, there are difficulties of principle associated with randomized experimentation that may be underemphasised by enthusiasts for high quality methodology. The circumstances in which the intervention took place are an example: while it is the client's responsibility, not the librarian's, to judge whether (for example) crime prevention results obtained in an area of high employment and high marriage rates will transfer to an area of low employment and low marriage rates (we are taking this example from p. 19 of Tilley and Laycock, 2002), issues like this may affect the search filter that the librarian constructs. In some quarters there is strong opposition to evidence-based methods: see Oakley (2006) for some flavour of the debate, though that paper is pro-randomization rather than middle-of-the-road.

As already mentioned, “evidence-based” and similar phrases usually refer to a specific real-world intervention, and to a concern with high quality methodology (having a control group, randomization of experimental units, perhaps masking from the participants which group they are in, and so on). The client and the librarian need to discuss these concepts. Is the client concerned only with a particular intervention, rather than with a whole subject area? What intervention or range of interventions? What is the client's standard of quality? (Is the minimum standard the use of a carefully-chosen control group, or is the higher standard of randomization essential, or is the lower standard of a before-and-after comparison with no control group sufficient?)

Even when the client's needs become clear, the librarian's tools may not be very helpful. If library acquisitions have not been indexed using subject headings for methods as well as for topic, the catalogue will not be sufficient. Databases may be of assistance for recently published material where the abstract mentions the methodology, but not for items indexed before this was stressed. (Material published decades ago can be quite important if the methods employed were of a high standard.) Much grey literature may be hidden in the report databases of research institutes: because of a fear that published studies may be a biased sample of the totality of studies (for example, biased towards those in which a statistically significant difference was found), meta-analysis emphasises searching the grey literature as well as that published in journals. It is unlikely that much of the expertise in search filters that has been gained in medicine will be easily transferable to other fields, again because of the rarity with which methodology has been indexed.
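To make the notion of a search filter concrete, here is a minimal sketch, on our own assumptions, of how topic terms might be combined with methodology terms in a boolean query. The syntax is loosely modelled on PubMed-style searching and the terms are invented for the illustration; this is not a validated filter of the kind developed in medicine.

# A minimal sketch (illustrative terms only, not a validated medical filter)
# of combining a topic search with a methodology filter in boolean form.

topic_terms = ['"driver education"', '"road safety"']
method_terms = [
    '"randomized controlled trial"',
    '"randomised controlled trial"',
    '"random allocation"',
    '"controlled trial"',
]

topic_block = "(" + " OR ".join(topic_terms) + ")"
method_block = "(" + " OR ".join(method_terms) + ")"
query = topic_block + " AND " + method_block
print(query)
# In a field where methodology is rarely indexed or mentioned in abstracts,
# the method_block will fail to match many relevant studies, which is
# precisely the difficulty described above.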

Improving the ability of librarians to serve the “evidence-based” needs of their clients in areas other than medicine is an issue that has been raised before. Roberts et al. (2001) called for wide-scale retrospective indexing of road safety material, in order to make retrievable studies that are at present insufficiently indexed (with regard to evidence-based issues) in citation databases. This is quite a long-term idea, though.

Publicising the growing importance of evidence-based issues may be a more immediately achievable goal. Where and when might this occur? One answer is as an element of continuing professional education --- short specialised courses have already been mentioned above. But earlier is probably even better, presumably under the heading of research methods when teaching prospective librarians. It would lead to graduates knowing the importance of indexing the methodology of research as well as its subject matter, and being able to oversee this in practice, in whatever field they may find themselves employed. They would also have the knowledge to deal more satisfactorily with a client who has an evidence-based reference query but perhaps little familiarity with the concepts behind it. Education in the pros and cons of randomization, and in what these mean for literature review and research synthesis, will also strengthen the case of information specialists that they have a place that cannot be taken by a Google search. (We might note that there is some discussion in the literature of the present training of librarians in research methods. Morris (2006) recently surveyed library courses in the U.K.; judging by the list she gives of topics under the heading of research methods, it seems unlikely that there is any substantial coverage of the evidence-based movement. In the case of the requirements of Master of Library Science programs in North America, Park (2003) found that students are not usually required to pass a course in research methods. Park regarded this as placing library and information science with the humanities and education, rather than with science, social science, and business, and viewed it as regrettable.)
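As an illustration of what indexing the methodology of research as well as its subject matter might mean in practice, the sketch below shows a hypothetical record with a methodology field alongside subject headings. The field names and headings are invented for the example; real schemes (such as the publication-type indexing used in medical databases) differ in detail.

# A hypothetical record (field names invented for illustration) in which
# methodology is indexed alongside subject matter, so that a search can
# filter on study design as well as on topic.

record = {
    "title": "Educational workshop improved information-seeking skills ...",
    "subject_headings": ["information-seeking skills", "hospital clinicians"],
    "methodology": ["randomized controlled trial"],
    "year": 2003,
}

# A client asking for high quality evidence can then be served by filtering
# on the methodology field, not only on the subject headings.
is_high_quality = "randomized controlled trial" in record["methodology"]
print(is_high_quality)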

Discussion

In the above, we have principally had in mind the academic librarian, who routinely works with researchers. But much is relevant also to the special or public librarian, who may have a wider variety of enquirers. Some enquiries refer to controversies --- mixed-ability versus selective school systems, say, or controlling mosquitoes with land drainage or with insecticides. It is possible, for example, that a parent might come across the What Works Clearinghouse, and want to follow up what it says on its Website (http://www.whatworks.ed.gov), from which the following quotations are taken:

“The WWC aims to promote informed education decision making through a set of easily accessible databases and user-friendly reports that provide education consumers with ongoing, high-quality reviews of the effectiveness of replicable educational interventions (programs, products, practices, and policies) that intend to improve student outcomes.”

“The current nationwide emphasis on ensuring that all students and schools meet high standards has increased the demand for evidence of ‘what works’ in education.”

“Scientifically based research.... makes claims of causal relationships only in random-assignment experiments or other designs (to the extent such designs substantially eliminate plausible competing explanations for the obtained results).”

No doubt appropriate books can be found to aid the enquirer. Should the librarian attempt to go further, and help to educate the enquirer about different levels of evidence that may be found in the books? It seems to us that if the librarian is handling an appreciable number of such enquiries, he or she will want to be able to hold an intelligent conversation on this. The library may be an enquirer’s first point of contact with the scientific world. It would be unthinkable to withhold information, and let the enquirer come to possibly erroneous conclusions and perhaps later be unexpectedly criticized over the quality of evidence --- and criticism may indeed come, as in the case of driver education (Cochrane Injuries Group Driver Education Reviewers, 2001). Of course, we presume that the enquirer is willing to receive the information, and that the librarian does not have more pressing duties.

The librarian thus needs to be able to guide the enquirer to some discussion of laboratory versus real-world experiments, choice of criterion, cross-sectional versus longitudinal research designs, randomization into treatment and control groups, generalization of findings to other circumstances, what “statistical significance” means and what it doesn’t mean, the distinction between exploratory and hypothesis-testing research, the limitations of meta-analysis, and so on. And ideally, the librarian will know of a course on this at the local university (whether in statistics or public health or psychology or some other department). The challenge as regards technical knowledge of research methods is obvious. There is also a challenge as regards mind-set and interaction with the enquirer: the enquirer may perceive the librarian as an authority and expect an answer, whereas the movement for evidence-based everything tends to question expert opinion, preferring the supposedly impartial message from high quality experiments. In the field of health information, selection by experts (including librarians) is important to consumers of health services, who may have quite a strong desire for certainty in the information they are given (Marshall and Williams, 2006).

Raising the issue of levels of evidence with library clients will need to be handled with sensitivity. It is common experience that there are “one answer, single item” clients and “full picture, many items” clients, and everything in between. Introducing the issue of quality of evidence to someone near the “one answer, single item” end of the scale could easily lead to confusion and dissatisfaction, but those clients looking for a full picture will appreciate knowing about the importance of research methodology. Of course there can be no clear-cut answer to when and how one should introduce evidence-based issues to a client, but the relevant skills are ones that librarians are taught and familiar with. Firstly, they are instructed in structuring a reference query interview and targeting information to suit a client's needs; introducing evidence-based concepts should be a reasonably natural extension. Secondly, librarianship programs do introduce the issue of liabilities connected with information provision, possible requirements for disclaimers, and so on; assessing and handling this, and judging how much to say about evidence-based concepts, are not dissimilar. So librarianship syllabuses already include the skills required to handle competently the introduction of evidence-based issues to clients. What librarians lack is a background in evidence-based anything. In short, though in principle there may be a danger of overwhelming a library client, this danger is small compared with that of not discussing a concept vital to assessing the information provided to them.

In conclusion, we might draw together the major issues that have come up in this paper.

It is reasonably clear what evidence-based methods mean in medicine, and that appropriate courses for librarians can be developed. The principles are generic, and can readily be transferred to other disciplines. Undoubtedly there are real difficulties: a large body of randomized experimental research may not exist, such randomized experiments as have been conducted may not be locatable because of inadequate indexing, and there may be no consensus on what weight to give to “low quality” research when randomized experiments are lacking. But only the second of these is the responsibility of librarians, and certainly the principles are well enough developed that a librarian can interact intelligently with a subject-matter specialist.

Acknowledgements

The Centre for Automotive Safety Research, University of Adelaide, receives core funding from the Motor Accident Commission (South Australia) and the Department for Transport, Energy and Infrastructure (South Australia). The views expressed in this paper are those of the authors, not necessarily of these institutions.

References

Booth, A. (2006), “The unteachable in pursuit of the unreadable?”, Evidence Based Library and Information Practice, Vol. 1 No. 2, pp. 51-56.

Chalmers, I. (2003), “Trying to do more good than harm in policy and practice: The role of rigorous, transparent, up-to-date evaluations”, Annals of the American Academy of Political and Social Science, No. 589, pp. 22-40.

Cheng, G. (2001), “The shifting information landscape: Re-inventing the wheel or a whole new frontier for librarians”, New Library World, Vol. 102 No. 1160/1161, pp. 26-33.

Cheng, G.Y.T. (2003), “Educational workshop improved information-seeking skills, knowledge, attitudes and the search outcome of hospital clinicians: A randomized controlled trial”, Health Information and Libraries Journal, Vol. 20 No. s1, pp. 22-33.

Clyde, L.A. (2006), “The basis for evidence-based practice: Evaluating the research evidence”, New Library World, Vol. 107 No. 1224/1225, pp. 180-192.

Cochrane Injuries Group Driver Education Reviewers (2001), “Evidence based road safety: The Driving Standards Agency’s schools programme”, The Lancet, Vol. 358 No. 9277, pp. 230-232.

Davoudi, S. (2006), “Evidence-based planning. Rhetoric and reality”, DISP, Vol. 42 No. 165, pp. 14-24.

Fitz-Gibbon, C. (2004), “Editorial: The need for randomized trials in social research”, Journal of the Royal Statistical Society, Series A, Vol. 167 No. 1, pp. 1-4.

Fowler, G. (2000), “Evidence based health care: Diverse career opportunities for librarians”, presented at the Conference of the Australian Library and Information Association. http://conferences.alia.org.au/alia2000/proceedings/greg.fowler.html.

Head, B. (2006), “Evidence-based policy”, presented at the Conference of the Public Policy Network. http://www.jcipp.curtin.edu.au/local/docs/Paper-Head.PPN2006.ppt.

Hutchinson, T.P. and Meier A.J. (2004), “Evidence-based road safety policy? Evidence-based transport policy? A discussion of randomized experimentation and meta-analysis in the evaluation of interventions”, in Taylor, M.A.P. and Tisato, P.M. (Eds), Papers of the 27th Australasian Transport Research Forum, Transport Systems Centre, University of South Australia, Adelaide.

Hutchinson, T.P. and Meier, A.J. (2005), “If you do not know whether a real-world intervention will work, consider a randomized controlled experiment”, in Proceedings of the International Conference of the Institute of Public Works Engineering Australia, All Occasions Management, Thebarton, South Australia.

Lappa, E. (2004), “Clinical librarianship (CL): A historical perspective”, Electronic Journal of Academic and Special Librarianship, Vol. 5 No. 2-3. http://southernlibrarianship.icaap.org/content/v05n02/lappa_e01.htm

Lerdal, S.N. (2006), “Evidence-based librarianship: Opportunity for law librarians?”, Law Library Journal, Vol. 98 No. 1, pp. 33-60.

Liabo, K. (2005), “What works for children and what works in research implementation? Experiences from a research and development project in the United Kingdom”, Social Policy Journal of New Zealand, No. 24, pp. 185-198.

Marshall, L.A. and Williams, D. (2006), “Health information: Does quality count for the consumer? How consumers evaluate the quality of health information materials across a variety of media”, Journal of Librarianship and Information Science, Vol. 38 No. 3, pp. 141-156.

Missingham, R. (2005), “Conference report”, Australian Academic and Research Libraries, Vol. 36 No. 4, pp. 244-247.

Moody, D.L. (2003), “Using the World Wide Web to connect research and professional practice: Towards evidence-based practice”, Informing Science Vol. 6, pp. 31-48.

Morris, A. (2006), “Provision of research methods teaching in UK LIS departments”, New Library World, Vol. 107 No. 1222/1223, pp. 116-126.

Oakley, A. (2006), “Resistances to ‘new’ technologies of evaluation: Education research in the U.K. as a case study”, Evidence and Policy, Vol. 2 No. 1, pp. 63-87.

Palmer, J. (2000), “Schooling and skilling health librarians for an evidence-based culture”, Advances in Librarianship, Vol. 23, pp. 145-167.

Park, S. (2003), “Research methods as a core competency”, Journal of Education for Library and Information Science, Vol. 44 No. 1, pp. 17-25.

Rader, T. and Gagnon, A.J. (2000), “Expediting the transfer of evidence into practice: Building clinical partnerships”, Bulletin of the Medical Library Association, Vol. 88 No. 3, pp. 247-250.

Roberts, I., Bunn, F. and Wentz, R. (2001), “How can we discover what works in the prevention of road traffic crashes?”, in Peden, M. (Ed), Proceedings of WHO Meeting to Develop a 5-Year Strategy for Road Traffic Injury Prevention, World Health Organization, Geneva, pp. 48-49.

Sladek, R.M. (2000), “Evidence-based medicine: An opportunity not to be missed”, Australian Library Journal, Vol. 49 No. 3, pp. 271-277.

Stevens, M., Liabo, K., Frost, S. and Roberts, H. (2005), “Using research in practice: A research information service for social care practitioners”, Child and Family Social Work, Vol. 10 No. 1, pp. 67-75.

Tilley, N. and Laycock, G. (2002), “Working out what to do: Evidence-based crime reduction”, Crime Reduction Research Series Paper 11, Research Development and Statistics Directorate, Home Office, London.

Wathen, N. and Leckie, G.L. (2005), “Educating information professionals to support evidence-based health care: Development, delivery and evaluation of a course in evidence-based health librarianship”, poster presentation at the Conference of the Canadian Health Libraries Association, held in Toronto. http://www.chla-absc.ca/2005/posters.html
