I am a frequent conference speaker, and preparation is always a challenge because I know that most of the attendees at my sessions are experienced information professionals; I want to tell them something they do not already know. For a change of pace, I recently gave a series of workshops to groups of people within a variety of organizations who use the Web as part of their work, but who are not information professionals or Web researchers. From their questions and comments, and from watching them do hands-on searching afterward, I was reminded once again of some of the knowledge that we info pros take for granted.
One common mistake is typing a URL into a search engine's search box. This has been going on for years, of course, but now that many people have installed a search engine's toolbar on their browser, it is even easier for them to mistake that box for the browser's address field.

Another frequent error is assuming that the best or most authoritative sources sit at the top of a search results page. Consumer Web Watch, maintained by the folks who write Consumer Reports, published a 2003 report for which they interviewed Internet users and discovered that "a majority of participants never clicked beyond the first page of search results as they had trust in the search engine to present only the best or most accurate results on the first page, making it unnecessary to review later results pages" [emphasis mine]. Now, "best" is a relative term, and the search engines that survive are the ones with the best relevance-ranking algorithms. But "most accurate"? A Google search for the phrase "Martin Luther King" shows the abhorrent www.martinlutherking.org site within the first ten results. And, of course, Google-bombing examples such as the phrases "best president" and "worst president" (both of which at one point returned the official White House biography of President Bush as the first result in Google) show that search engines can be gamed.
A third common misperception is that most search engines return similar results. Whenever you teach non-info pros and want to elicit a gasp, show them a side-by-side comparison of the (lack of) overlap between two search engines. Ranking.thumbshots.com and Jux2.com are two tools that dramatically show how much a searcher misses by relying on a single search engine.
Although every info pro who has prepared a budget is acutely aware of how much information is not available for free on the Web, some clients still have not grasped this fully. Some info pros view the introduction of Google Scholar as a threat, but I believe that, in fact, it provides us with a powerful tool for teaching our clients about fee-based information services and the true cost of obtaining published material. Since most of the records that appear in Google Scholar are only bibliographic citations, not links to the full text, our clients will either purchase the articles directly from the publisher or come to us to request a copy. At that point we can talk about the true cost of interlibrary loans, discuss the royalty fees that run $30 or more per article, and explain why we pay for e-content that isn't available on the Web. (For a free white paper I wrote on this topic for Factiva, "Free, Fee-based and Value-Added Information Services," see www.snurl.com/factiva.)
While these insights are useful for a brown-bag training session, a workshop, or one-on-one client training, there are other reasons to be aware of our clients' blind spots. Keep in mind that clients ask us to do only what they think we can do. On the one hand, that means that if a client asks for research that is more in-depth than you usually provide, or for additional analysis that you do not normally offer, you must resist the instinct to reply, "Oh, we can't do that." The fact that your client asked means she is confident the request is within the scope of your capabilities. Treat the question as free market research; you have just identified an unmet information need that you can probably fill.
On the other hand, the idea that clients ask us to do only what they think we can do also means that, if they don't have a broad enough understanding of our skills and resources, they won't think to bring a particular type of need to us. During the course of your reference interview—er, information needs interview—consider "up-selling." No, you aren't offering to super-size an order of fries; rather, you are raising your clients' expectations of the value you provide. Marketing information services is hard enough. During a needs interview, at least, your client is already thinking about what kind of information she needs, so she is more likely to really hear you when you describe an electronic clipping service, an internal blog you maintain, delivery of results with an accompanying PowerPoint presentation of the key findings, and so on. You are the info pro; communicating the value of your knowledge is one of the most challenging—and essential—parts of your job.