I teach workshops on library databases to a range of users at the New York Public Library. Some of the students are academics, others are unaffiliated scholars, and many more are undergraduate or graduate students from nearby schools. Their familiarity with platforms, searching, Boolean logic, peer review, and formats varies. But one thing nearly all students share is confusion about which database they should use for the kind of research they're conducting.
The database names don't help: Ulrich's? Never heard of it. Project Muse? Sounds vaguely familiar. Of course, titles like The New York Times or Worldwide Political Science Abstracts can steer the user in a general direction, as can strong database descriptions on library websites. But without a greater understanding of the kind of content that can be found in each resource, the user is often lost.
Take online reference databases. In the past, a library's reference collection consisted of titles lined up on a shelf, organized by Dewey or Library of Congress classifications. If a patron needed to consult the Encyclopedia of Associations, for example, the physical set was relatively easy to locate. Today, much of that reference shelf has moved online to platforms like Credo or Gale Directory Library. These sources may provide 24/7 access to information from any computer, but finding the relevant titles within each database can be difficult unless you know what to look for. That's the paradox of academic databases: there is far more to explore, and it's harder to do.
Theoretically, that’s where discovery platforms like Summon and EBSCO Discovery Service (EDS) come in. Discovery platforms search the metadata of a library’s subscription resources simultaneously so users don’t need to visit each database individually. But each discovery platform uses a proprietary search algorithm that weighs content differently. According to a 2013 study comparing discovery platform effectiveness in College & Research Libraries, ProQuest’s Summon platform seemed to rank newspaper articles higher than academic journal articles. EBSCO’s EDS, on the other hand, seemed to weigh relevance by length of the article, so short newspaper pieces tended to rank lower than peer-reviewed articles. “Discovery,” therefore, is a relative term depending on which company is doing the searching. Regardless of rankings, nearly all platforms tested in the study returned a large results list, so students were “routinely faced with a set of search results that far exceeded what could reasonably be evaluated on an item-by-item basis.”
Perhaps it's no surprise, then, that 97% of academic library directors surveyed in the recent Ithaka S+R survey cite teaching information literacy to undergraduates as an important function of the library. With the limited transparency of online sources, undergraduates clearly need all the help they can get when starting their research.
The Beyond Citation team hopes that researchers, both seasoned and amateur, will shine a light on the databases they use regularly by examining their strengths, weaknesses, and overall range of material. In other words, the content. We intend to gather and present this kind of information for a number of databases so that researchers can better understand the tools they rely on. Because without a better understanding of the troves of rich information discoverable in each database, they're all just links on a page.