Electronic Journal of Academic and Special Librarianship

v.10 no.1 (Spring 2009)


Gresham’s Law in the 21st Century

Joshua Finnell, Reference Librarian
Frazar Library, McNeese State University, LA, USA

Abstract

Research indicates that most people today satisfy their information needs through the Internet. As we move deeper into the information age, librarians must embrace the role of inculcating information literacy skills lest Gresham’s Law of economics become a reality in our information economy. This article discusses the probabilistic nature of the Internet against the backdrop of Gresham’s Law.

Article

Sir Thomas Gresham was an English merchant and financier who served as financial liaison to both King Edward VI and Queen Elizabeth I of England during the 16th century. Referring largely to the common practice of “clipping” or “shaving” silver coins and passing them off at full value, Gresham articulated the basic economic law that bad money drives out good money. Succinctly stated: if you find an unshaved coin, you hoard it and pay with a shaved coin instead. However, bad money, though necessary for this law to hold true, is not sufficient. In 1891, British economist Robert Giffen reflected on the monetary system in England by stating, “Good and bad coins will circulate together in a given country as if they were all good when the circulation itself is not in excess of the demand for it” (Giffen, 1891, p. 304). Echoing this sentiment in 1962, political philosopher F.A. Hayek spoke of the puzzled historian discovering that although good and bad money have been circulating together for years, good coins have become scarce. Hayek wrote, “He [the historian] will have to understand that neither the wear and tear nor clipping have caused this relative depreciation. He will have to look for a cause which either increased the relative supply or decreased the relative demand for coins” (Hayek, 1962, p. 102).
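Gresham’s dynamic is simple enough to sketch in code. The short Python simulation below is purely illustrative and is not part of the original argument; all of its names and parameters (NUM_AGENTS, COINS_PER_AGENT, the fifty-fifty starting mix of good and bad coins) are invented assumptions. It models traders who pass off clipped coins whenever they hold one and hoard full-weight coins, and it tracks how quickly the circulating supply fills up with bad money.

    import random

    random.seed(42)

    NUM_AGENTS = 100      # hypothetical population of traders
    COINS_PER_AGENT = 10  # hypothetical starting purse size
    ROUNDS = 50           # trading rounds to simulate

    # Each purse is a list of coins: True = full-weight ("good"), False = clipped ("bad").
    purses = [[random.random() < 0.5 for _ in range(COINS_PER_AGENT)]
              for _ in range(NUM_AGENTS)]
    hoards = [0] * NUM_AGENTS  # good coins each agent has withdrawn from circulation

    def pay(payer, payee):
        """Transfer one coin, passing off a clipped coin whenever one is on hand."""
        purse = purses[payer]
        if not purse:
            return
        coin = False if False in purse else True  # Gresham's rule: spend bad first
        purse.remove(coin)
        purses[payee].append(coin)

    def hoard(agent):
        """Keep at most one good coin in the purse; stash the surplus."""
        purse = purses[agent]
        while purse.count(True) > 1:
            purse.remove(True)
            hoards[agent] += 1

    for round_no in range(ROUNDS):
        for agent in range(NUM_AGENTS):
            hoard(agent)
            pay(agent, random.randrange(NUM_AGENTS))
        if round_no % 10 == 0:
            circulating = [coin for purse in purses for coin in purse]
            share = circulating.count(True) / len(circulating)
            print(f"round {round_no:2d}: {share:.0%} of circulating coins are full-weight")

Run under these assumptions, the share of full-weight coins in circulation falls from roughly half toward a small residue within a few rounds: the scarcity that Hayek’s puzzled historian observes, produced by nothing more than each trader’s preference for paying with the cheaper coin.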

In our current society, information is the currency we deal with on a daily basis, and the Internet is the current marketplace of exchange. Gresham’s law poses a fundamental question: is bad information driving out good information? Speaking at the White House Conference on Library and Information Services in 1979, Daniel Boorstin considered the implications of 20th-century technology (largely radio and television) producing information instantly and in abundance. Relating this emerging technology to economics, Boorstin remarked, “In our ironic twentieth-century version of Gresham’s law, information tends to drive knowledge out of circulation” (Boorstin, 1989, p. 46).

However, unlike a coin, whose worth can be objectively assessed by weight or measurement, information quality is a complex issue. The value of information stems from its application, which implies a relativistic nature: data considered appropriate for one use may not possess sufficient attributes for another. For example, citing an encyclopedia entry on George Washington may be good information for a 5th-grade term paper, but bad information for a doctoral dissertation. This suggests that information quality cannot be assessed independently of the people who seek and use information, which makes simple quality dimensions such as good and bad extremely difficult to gauge (Knight & Burn, 2005, pp. 160-163).

At the same time, information is applied and used under constrained resources, most notably time. Both the 5th grader and the graduate student are working under a deadline. The information a person finds may therefore satisfy the need of the moment yet fail to meet the criteria the task actually demands. The graduate student who cites the encyclopedia entry on George Washington in his or her dissertation because a deadline is drawing ever closer may not be rewarded with a degree. This suggests that user application is a necessary but not sufficient criterion for assessing good and bad information.

Information, outside of its application, is multi-faceted in nature. It is characterized by several attributes: it can be timely, lengthy, wordy, pictorial, credible, vague, controversial, or contradictory. This meta-information, or information about information, is what the user draws on to rank applicability and quality. Someone tracking a hurricane, for example, may be more interested in timely information than in pictorial information. However, one characteristic in particular should be considered the basis upon which we distinguish good information from bad: credibility. No matter how up-to-date information is, without credibility it is bad information. For example, under the threat of an approaching hurricane you check www.weather.com to see whether you should evacuate, and you learn that the storm is heading directly for your town. As you finish packing your car, a UPS driver arrives at your front door to deliver a package and informs you that he or she just heard your town is not under threat and that you would be fine to stay. Unless this particular UPS driver has a background in meteorology, most people would set aside the driver’s more up-to-date report in favor of the credibility of weather.com.

Source credibility is the key to distinguishing between information we should use and information we should dismiss. However, both credible and non-credible sources circulate in our current information economy. Expanding well beyond any technology Boorstin could have conceived, the profusion of information in the 21st century is bewildering. The Internet has democratized the production of information, allowing everyone to participate in its creation and dissemination. The factors that make the Internet a powerful means of disseminating good, credible information are the same characteristics that allow bad and misleading information to become widespread (Bates, Romina, Ahmed, & Hopson, 2006, p. 46).

Before the advent of the information age, the task of assessing credibility was much easier. Traditionally, information in print was vetted by a whole host of professionals (journalists, professors, editors, peer reviewers) who ensured the quality of the information distributed. Since producing print was time-consuming and expensive, the dissemination of good information was much slower, and that information was hoarded by those who could afford it (universities, cultural institutions, journal subscribers). As a corollary, bad information (non-credible information) was less ubiquitous. The trade-off was that fewer people were afforded a voice in a system that restricted publication to those with the means of production.

The inverse is true today: more people are afforded the opportunity to voice their opinion by publishing a Webpage online, but bad information is in heavier circulation. In his book The Long Tail, Chris Anderson colorfully illustrates the distribution of information on the Internet when he writes, “with probabilistic systems there is only a statistical level of quality, which is to say: some things will be great, some things will be mediocre, and some things will be absolutely crappy” (Anderson, 2006, p. 70). The Internet is not completely full of non-credible sources, just as it is not always authoritative or trustworthy. With a nod toward the power of mass collaboration and creation, Anderson states, “Give enough people the capacity to create, and invariably gems will emerge” (Anderson, 2006, p. 126).

The unspoken issue, however, is that such gems may be buried under several tons of rock and gravel. For every Webpage constructed by experts with extensively cited research, there are many more with little care for citation, grammatical correctness, or facts. Currently, both circulate in the marketplace of the Web at seemingly equal value. With Web publishing making instant distribution available to anyone, bad information obtained from a non-credible source is circulated back into the system almost as quickly as it is created. The Internet magnifies Giffen’s point that good and bad information will circulate together as if both were good so long as circulation does not exceed demand, and we are far from that saturation point, as the number of blogs, Webpages, and social networking sites increases daily. Again, whether this is a problem depends on the information one is seeking, since information quality must take the user’s needs into account. At the same time, credibility is the crucial element of meta-information that all seekers of information should use to assess whether what they find is good or bad. With anyone able to publish their ideas online, the odds of encountering bad information rather than good are fairly high. Yet, from accurate movie times to authoritative health information, people are seeking good information from qualified sources.

Is Gresham’s law correct? Is bad information driving out good information? In education, the preliminary answer is yes. Educators across the country are discovering that their students are relying too heavily on the Internet and extracting more bad information than good. Tara Brabazon, Professor of Media at the University of Brighton, believes the Internet is “flattening expertise” because users grant every piece of information the same credibility (Frean, 2008, para. 7). The first result in a search engine is by no means the most authoritative, but it is the most convenient. The puzzled professor who finds well-researched papers scarce need look no further than the Internet to see that good information is at a premium and that bad information is pushing it to the fringes.

The same is true in the health care industry. A joint study conducted by researchers at the School of Communication and The American Cancer Society Partnership at Ohio University concluded that source credibility had little to no impact on information seekers’ evaluations of the quality of the information they were receiving. When presented with six messages discussing lung cancer (three from credible sources and three from non-credible sources), participants attributed equal levels of trustworthiness and truthfulness to both. The researchers concluded, “presenting high-credibility sources of health information on the Internet has little to no effect on consumers’ perception of quality when these sources are compared to a no-credibility source” (Bates et al., 2006, p. 49).

Perhaps most importantly, Gresham’s law is taking effect in politics. As with any issue in the American political system, individuals with strong partisan affiliations or prior beliefs on a topic tend to practice self-selection when seeking information. Younger voters in the 21st century seeking information on a particular presidential candidate in order to make an informed decision will more than likely find themselves reading political blogs. Informing the citizenry at very little cost from a wide array of sources, blogs should exemplify what John Stuart Mill envisioned as a marketplace of ideas. However, a study conducted by Gregory Sheagley at the University of Minnesota, Morris found evidence that young people may be inclined to choose partisan blogs over credible ones. Sheagley recorded general information about political affiliation and computer use from a group of students before having them choose among several sources of information ranging from credible to non-credible. Surprisingly, students who reported using political blogs at least once every two weeks were less likely to select the credible blog from the array of sources (Sheagley, 2007, p. 15).

Clearly, Gresham’s law is alive and well in our information economy. As Robert Mundell, winner of the 1999 Nobel Prize in Economics, points out, “The motivating force underlying Gresham’s Law is economy: we settle a debt or transaction with the cheapest means of payment” (Mundell, 1998, p. 61). In terms of information, the cheapest means of payment is whatever information is most easily accessible, regardless of credibility. In education, students routinely hand in assignments with bibliographies full of questionable sources. When it comes to personal health, we are willing to give as much credence to a random Website as to the World Health Organization. In our participation in the political system, our beliefs and ideas are formulated not from expertise and fact, but from the views of those who share our understanding of the world. With an informed citizenry as the bedrock of democracy, the profusion of bad information in our information economy is a serious issue to consider.

With bad information pushing out good information, our ability to make informed decisions about everything from historical facts to political leaders is in jeopardy. Before the Internet took hold in our society, however, the library was a well-utilized marketplace of ideas. The purpose of a public library is to challenge the individual to continual self-education. Historically, public libraries have collected widely, striving to provide a collection open to all ages and skill levels and to supplement the programs of formal educational institutions. Today, however, libraries are no longer the first place people turn for information: a recent study by the Pew Internet & American Life Project reported that more people turn to the Internet than to any other source of information (Estabrook, Witt, & Rainie, 2007, p. 5). With good and bad information circulating together on the Web, this raises serious concerns about the quality of the information users are finding: great, mediocre, or otherwise.

With the rise of telecommunications in the late 1970s, Boorstin suggested that libraries become “places of refuge from the tidal waves of information – and misinformation” (Boorstin, 1989, p. 48). In the 21st century this call to action rings even truer. Unlike the Internet, library collections are developed by a whole host of professionals who ensure the quality of the information added to the collection. Libraries can thus become places of refuge from the tidal waves of bad information circulating throughout our information economy, and it behooves them to begin marketing themselves as repositories of good information in light of Gresham’s Law.

However, simply having good information is not enough. As the research on the quality of health information on the Internet found, the mere existence of good information has little to no effect on perceptions of quality. The Internet has good information; the difficulty is discerning the credible from the non-credible. Thus, librarians must take on the role of teaching the populace information literacy skills. The American Library Association's Presidential Committee on Information Literacy defines an information literate person as one who is able to recognize when information is needed and to locate, evaluate, and use the needed information effectively (American Library Association, 1989, p. 4). As gatekeepers of good information, librarians are uniquely qualified to teach these skills.

The next generation of librarians will not only have to ensure that libraries collect and preserve good information, but also recirculate knowledge of that information through our information economy. In the 21st century, libraries play an integral role in ensuring that Gresham’s Law does not become a reality in our information society.

References

American Library Association. (1989). Presidential Committee on Information Literacy: Final Report. Washington, DC: American Library Association. Retrieved from http://www.ala.org/ala/mgrps/divs/acrl/publications/whitepapers/presidential.cfm.

Anderson, C. (2006). The Long Tail: Why the Future of Business is Selling Less of More. New York: Hyperion.

Bates, B. R., Romina, S., Ahmed, R., & Hopson, D. (2006). The effect of source credibility on consumers' perceptions of the quality of health information on the Internet. Medical Informatics & the Internet in Medicine, 31(1), 45-52. doi: 10.1080/14639230600552601.

Boorstin, D. J. (1989). The Republic of Letters: Librarian of Congress Daniel J. Boorstin on Books, Reading, and Libraries, 1975-1987 (J. Y. Cole, Ed.). Washington, DC: Library of Congress.

Estabrook, L., Witt, E., & Rainie, L. (2007). Information Searches that Solve Problems. Pew Internet & American Life Project. Retrieved from http://www.pewinternet.org/~/media//Files/Reports/2007/Pew_UI_LibrariesReport.pdf.pdf.

Frean, A. (2008, January 14). White bread for young minds, says University of Brighton professor. The Times Online. Retrieved from http://technology.timesonline.co.uk/tol/news/tech_and_web/the_web/article3182091.ece.

Giffen, R. (1891). The Gresham Law. The Economic Journal, 1(2), 304-306.

Hayek, F. A. (1962). The Uses of 'Gresham's Law' as an Illustration in Historical Theory. History and Theory, 2(1), 101-102.

Knight, S., & Burn, J. (2005). Developing a Framework for Assessing Information Quality on the World Wide Web. Informing Science, 8, 159-172. Retrieved from http://inform.nu/Articles/Vol8/v8p159-172Knig.pdf.

Mundell, R. (1998). Uses and Abuses of Gresham's Law in the History of Money. Zagreb Journal of Economics, 2(2), 57-72.

Sheagley, G. (2007). Blogs as Information Sources: The Impact of Source Credibility and Partisan Affiliation. Conference Papers -- Midwestern Political Science Association, 1-24.
