Twelve professors, 15 graduate students, and 54 undergraduates read the first pages of eight research articles from four social science areas and then listed up to five keywords or phrases that they believed best described the content of each. These descriptors were compared with those tagging each article in a popular computer-assisted bibliographic search program, PsychLIT™. Professors and graduate students generated about 50% more matches to PsychLIT descriptors than undergraduates did. Professors gave more synoptic descriptors (abstract, category-level words and phrases not appearing in the text of the articles) than graduate students or undergraduates. In general, however, all groups did very poorly at matching the PsychLIT terms; on average, fewer than 10% of their descriptors matched, even when the readers were experts in the topic areas of the articles. Some possibilities for, and limitations of, improving this “hit rate” are discussed.
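
The “hit rate” reported above is simply the proportion of a reader’s listed descriptors that match the article’s PsychLIT descriptors. A minimal sketch of that calculation is shown below; the descriptor lists and the case-insensitive exact-match rule are illustrative assumptions, since the paper does not specify its matching criterion.

    # Illustrative only: descriptor lists and the exact-match rule are assumptions,
    # not taken from Thorngate & Hotta (1990).

    def hit_rate(reader_descriptors, psychlit_descriptors):
        """Proportion of a reader's descriptors that appear among the PsychLIT
        descriptors, comparing whole phrases case-insensitively."""
        targets = {d.strip().lower() for d in psychlit_descriptors}
        hits = sum(1 for d in reader_descriptors if d.strip().lower() in targets)
        return hits / len(reader_descriptors) if reader_descriptors else 0.0

    # Hypothetical example: one reader lists five descriptors for one article.
    reader = ["memory", "expertise", "information retrieval", "keywords", "students"]
    psychlit = ["Information Retrieval", "Expertise", "Databases", "Indexing"]

    print(f"hit rate: {hit_rate(reader, psychlit):.0%}")  # 40% in this made-up case

Averaging such per-article proportions across readers within a group would yield the group-level figures (for example, the sub-10% average) discussed in the abstract.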

Additional Metadata
Persistent URL: dx.doi.org/10.1177/107554709001100302
Journal: Science Communication
Citation: Thorngate, W., & Hotta, M. (1990). Expertise and Information Retrieval. Science Communication, 11(3), 237–247. doi:10.1177/107554709001100302