There is a fascinating story and collection of photos at Baylor University’s website about the “Monster Crash” of two locomotives. Back in 1896, an employee of the Missouri, Kansas, and Texas Railway – with perhaps more creativity than common sense – noticed that everyone is drawn to a train wreck. So, as a publicity stunt, the railroad staged a head-on collision between two locomotives, each traveling at 45 mph. An estimated 40,000 people gathered nearby to watch; when the locomotives collided, the boilers exploded, killing several people and injuring many more. Incredibly, in one photograph taken an instant before impact, a man is seen standing within 10 feet of the tracks. The crowd had been assured that engineers had been consulted and that an explosion wasn’t possible.

As I write this, the world is watching Japan struggle to recover from a 9.0-magnitude earthquake and tsunami that resulted in the destruction of several nuclear reactors. Commentators are asking why the reactors were built on an active fault line without enough safeguards to protect against an earthquake of this magnitude, and the answer appears to be that no one thought anything this bad would happen.

In both cases, the experts’ inability to imagine something outside the realm of their direct experience contributed to catastrophe. No one questioned the basic assumptions or thought to ask, “But what if the unthinkable does happen?”

While not all the research we info pros are called upon to conduct has as many ramifications as the design of a nuclear facility, we can play a significant role in bringing a larger, outside perspective to our work. One way to avoid informational myopia is to identify outliers when scanning the information environment. How do you find a study that predicts the probability of another earthquake the size of the one that struck Indonesia in 2004, if that’s not what you are looking for? How do you find studies of the effect of an airplane crashing into a building such as the World Trade Center towers, if the parameters of your research don’t take that possibility into account? The information often exists, but it is not findable because no one is asking the question that would unearth it: “How sure are you about all your assumptions?”

Eventually, I expect that we will have research bots to identify and flag all the underlying assumptions of a project. However, having watched IBM’s Watson, the computer that beat two Jeopardy! champions, I can see that these algorithms have a long way to go. In the meantime, info pros need both the tools and the insight to question the assumptions we encounter every time we start a research project.

What would that look like? For starters, we need both data visualization tools and the skills to use them. In addition to “merely” providing access to premium content, econtent providers need to roll out tools enabling us to discover information about information. I want to graph, for example, how often two phrases appeared, over time, in the top 50 news sources. I want to know what words appear most frequently near my search terms. I want to identify terms that have suddenly risen in frequency in the context of my topic. In other words, I want to learn about what I wasn’t anticipating.
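What might such tools look like under the hood? Here is a minimal sketch, using only the Python standard library, of the two capabilities described above: graphing phrase frequency over time and flagging terms that have suddenly risen in frequency. The corpus, the `monthly_phrase_counts` and `spiking_terms` helpers, and the thresholds are all hypothetical stand-ins for what a content provider’s archive and API might expose.

```python
from collections import Counter
from datetime import date
import re

# Hypothetical stand-in for articles pulled from a news archive:
# each record is (publication_date, text).
corpus = [
    (date(2011, 1, 10), "Regulators debate reactor safety margins after a routine review."),
    (date(2011, 2, 14), "Seismic risk models questioned as reactor safety is revisited."),
    (date(2011, 3, 12), "Earthquake and tsunami force emergency shutdown at coastal plant."),
    (date(2011, 3, 13), "Tsunami damage raises doubts about seismic risk assumptions."),
]

def monthly_phrase_counts(corpus, phrase):
    """Graph-ready data: how often a phrase appears in each month's documents."""
    counts = Counter()
    for pub_date, text in corpus:
        counts[(pub_date.year, pub_date.month)] += text.lower().count(phrase.lower())
    return counts

def spiking_terms(corpus, cutoff, min_ratio=3.0):
    """Flag words markedly more frequent on or after `cutoff` than before it."""
    before, after = Counter(), Counter()
    for pub_date, text in corpus:
        bucket = after if pub_date >= cutoff else before
        bucket.update(re.findall(r"[a-z]+", text.lower()))
    n_before = sum(before.values()) or 1
    n_after = sum(after.values()) or 1
    # Add-one smoothing so terms absent before the cutoff still get a ratio.
    return sorted(
        word for word, count in after.items()
        if (count / n_after) / ((before[word] + 1) / n_before) >= min_ratio
    )

print(monthly_phrase_counts(corpus, "seismic risk"))   # phrase frequency over time
print(spiking_terms(corpus, cutoff=date(2011, 3, 1)))  # what we weren't anticipating
```

With a real archive behind it, the same pattern scales: swap the toy list for results from a provider’s search API, and the counts become the time-series graphs and spike alerts described above.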

On top of these relatively simple data-mining tools, we info pros need to develop a higher awareness of those “Huh? That’s weird” moments we encounter during research. You know the drill: You start out looking for information on one issue and suddenly realize that an unanticipated game changer is looming on the horizon. It can be tempting to ignore these unexpected findings, particularly as we spend more of our time summarizing and distilling the information we have gathered. One way to highlight unexpected results without derailing the focus of the research is to include a separate section in your report for anomalies and unexpected findings. By pulling this information out, you enable your client to decide whether it is irrelevant noise or a critical new perspective on the problem.

One of the crucial roles of info pros is to bring a distilled view of the external world into their organizations. Our responsibility is to ensure that we bring as rich and nuanced a view as we can, and that can involve bringing attention to the things that just don’t look quite right. Who knows the impact that can have?