Search for People


A few days after Easter, I went up into the attic of our house, picked up an owl, put it in my suitcase, and flew with it to Finland. This, as you might already have figured out, was no ordinary owl: It was a Strix owl, or rather a likeness of one in the form of a bronze statuette.

The Strix Award might best be described as the Oscar of the information retrieval research world. It’s presented annually to a person who has made a significant impact on research in information retrieval. The award was set up in memory of Tony Kent, Ph.D. (1933–1997), who made a major contribution to the development of information science and to information services, particularly in the field of chemistry.

After Tony’s death in October 1997, a group of his friends met for lunch, and each spoke of what he knew of Tony’s life. Each speaker’s recollections brought new revelations of the breadth of his work and the influence he had exerted in the information field. A new appreciation of the achievement of this modest man led to the idea of an award to commemorate him and his work.

The award is administered by the U.K. Electronic Information Group (UKeiG; www.ukeig.org.uk). This year it was sponsored by the Chemical Information Group of the Royal Society of Chemistry. The 2008 award was presented to Professor Kalervo Järvelin of the University of Tampere, Finland, who, as well as being an outstanding researcher and teacher, has played a major role in the ACM Special Interest Group on Information Retrieval (www.sigir.org). My trip was undertaken in my role as chairman of UKeiG.

Research into information retrieval is not a recent development. Rather, it has its origins in the late 1950s and early 1960s. Probably the most important piece of research from that period was undertaken during the 1960s by Cyril Cleverdon, the librarian at the Cranfield Institute of Technology (now a university) in the U.K. This research involved the development of test collections that could be used to measure the effectiveness of information retrieval software, an approach that in time led to the TREC conferences, which continue to this day.
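To illustrate the test-collection idea in miniature, here is a short Python sketch; the document IDs and relevance judgments are invented for illustration. The Cranfield approach scores a system by comparing what it retrieves for a query against documents that human assessors have already judged relevant.

```python
def precision_recall(retrieved, relevant):
    # Cranfield-style evaluation for one query: compare the system's
    # output against the pre-judged relevant documents.
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Invented judgments for one test query:
relevant = {"d3", "d7", "d9", "d12"}    # judged relevant by assessors
retrieved = ["d3", "d5", "d9", "d20"]   # what the system returned
print(precision_recall(retrieved, relevant))  # (0.5, 0.5)
```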

What is especially interesting about the work in which Järvelin and his colleagues are engaged is that they take a user perspective on information retrieval. This aspect has been largely overlooked in the research of the last few decades, where the focus has been on experiments with carefully managed test collections of published content (such as Medline). Enterprise search presents all sorts of difficult problems, and these have received only fairly cursory attention from the academic community. Information retrieval research sits, to a large extent, at the intersection of applied mathematics, linguistics, and psychology. Thus, it requires a multidisciplinary research team.

This is true not only of academic research but also of search vendors, all of which have to maintain a substantial research and development operation to ensure that their products continue to meet user requirements. It is especially true in the worlds of chemical information retrieval (where Kent was a major innovator), patent information retrieval, and e-discovery. As search becomes more business-critical, search software has to evolve to address these new requirements.

Much has been made of survey research (notably that published in Jane McConnell’s “Global Intranet Trends” report) documenting user dissatisfaction with search. The problem here is that we need to distinguish between dissatisfaction with the way the search application works (the user interface) and dissatisfaction with the search results, which may have nothing to do with the search application itself and everything to do with the governance of content quality. The assessment of “relevance” in enterprise search is usually not a yes-or-no judgment, especially on the basis of the initial results returned, but a far more complex one with many variables, another area in which Järvelin is working.
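To make the point concrete, here is a minimal Python sketch of graded-relevance scoring in the spirit of the cumulated gain measures that Järvelin and Jaana Kekäläinen introduced; the relevance grades, the log2 discount, and the example ranking below are illustrative assumptions, not a description of any particular product.

```python
import math

def dcg(grades):
    # Discounted cumulated gain: relevance is graded, not yes/no,
    # and results further down the ranking count for less.
    return sum(g / math.log2(rank + 2) for rank, g in enumerate(grades))

def ndcg(grades):
    # Normalize against the ideal ordering of the same judgments,
    # so scores are comparable across queries.
    ideal = dcg(sorted(grades, reverse=True))
    return dcg(grades) / ideal if ideal > 0 else 0.0

# Hypothetical graded judgments for the top five results of one query:
# 3 = highly relevant, 2 = relevant, 1 = marginal, 0 = not relevant.
print(round(ndcg([3, 0, 2, 1, 0]), 2))  # ~0.93: a good but imperfect ranking
```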

There is much to do, and there are relatively few research groups working on the problems. The highly competitive search market also means that few search vendors are willing either to publish the outcomes of their research or to take part in the TREC and SIGIR programs. When the Strix Award committee sits down in September to review the nominations, the number will be small. I look forward to seeing where I will be flying next year with an owl in my bag.