Wikipedia Weighs Information Against Indecency

One of Wikipedia’s defining characteristics is the space it creates for healthy debate from around the world. But one thing is not up for discussion, at least not anymore, according to the wiki discussion page for the entry on the album Virgin Killer by the German band Scorpions: "Prior discussion has determined by broad consensus that the Virgin Killer cover will not be removed." Normally, that would be that. But some groups say the decision of Wikipedia’s volunteer editorial army to keep a controversial album cover up constitutes harboring child pornography, and they are calling for its removal.

While it’s just one entry among millions, the Virgin Killer cover presents a dilemma for the user-generated encyclopedia project: What’s information, and what’s just indecent? The picture in question—swapped out for the album’s U.S. release but still sold in Europe—depicts a very young, nearly nude girl in a suggestive pose. While no prosecutions resulted when the picture was released in 1976, a WorldNetDaily investigation in 2008 alerted FBI agents to its presence on the Virgin Killer Wikipedia page. A few weeks later, federal agents announced they were reviewing the photo to decide whether it qualified as child pornography.

Whether or not the Scorpions album goes too far is only part of the issue. With its unique publishing and editorial structure (devotees refer to it not as a "site" or an "encyclopedia" but as "the project") and limitless topical scope, Wikipedia traffics in a huge amount of information. Of its more than 2.3 million English-language entries, the mere presence of some (such as the scandalous Virgin Killer cover, sexually explicit topics, or the hotly debated pictures of Mohammad, which many Muslims consider a violation of Islamic law) is enough to offend certain groups. Now that Wikipedia has gone from an ambitious community project to a well-known worldwide information resource, it is beginning to gain the attention of groups, such as Morality in Media, that are calling for general moral accountability to temper the exhaustive informational accessibility.

According to Wikimedia’s head of communications, Jay Walsh, media censorship groups don’t understand the unique way the Wikipedia community defines appropriate content. "It’s not a matter of views or taste," says Walsh. "If it’s notable, the community will try to direct attention to that as a group and ask themselves if it’s an appropriate topic." If such groups object to some content, the only recourse lies in reopening the discussion on an entry’s "Talk" page and swaying administrators. But the 1,000 administrators and 75,000 editors who volunteer hours every week to building the project favor inclusion in all but the most extreme cases.

Wikipedia’s content, its editors, and its American viewers are still subject to their respective national laws. The dozen or so full-time Wikimedia Foundation employees who oversee Wikipedia and other wikis "work with the community to minimize or remove content that would threaten the community," says Walsh. In the past, this kind of action has included removing threats posted to some pages. If the FBI determines the photo is child pornography, anyone who possesses the image could be held legally liable.

The Wikimedia Foundation is protected from prosecution by Section 230 of the Communications Decency Act, which grants immunity to interactive platforms that publish user-generated content. It’s the same law that got craigslist off the hook for user-posted housing ads that allegedly violated the Fair Housing Act, as the 7th Circuit decided last March (Chicago Lawyers’ Committee for Civil Rights Under Law, Inc. v. Craigslist). Because sites such as Wikipedia and craigslist are only information conduits, they dodge responsibility for what their users choose to include.

Walsh doesn’t discount the notion that someday there could be a process to keep offensive material away from people who don’t want it popping up during their Scorpions research. "There are technologies that could help us stay alert to potentially negative or bad info on Wikipedia," says Walsh. "And there’s the people side of it—the volunteers, organized ways of thinking about controversial topics, and bringing other organizations into the dialogue." Wikipedia currently instructs users desiring to avoid offensive images to register specific image-blocking site settings, create a proxy filter, or configure their browsers to block some or all images from Wikimedia pages.
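For readers curious what the bluntest of those do-it-yourself options looks like, here is a rough sketch (an assumption about one possible setup, not a method Wikipedia prescribes): adding a hosts-file entry that points Wikimedia’s image server, upload.wikimedia.org, at a dead address, which prevents any browser on that machine from loading images hosted there while leaving article text untouched.

```
# /etc/hosts  (on Windows: C:\Windows\System32\drivers\etc\hosts)
# Redirect Wikimedia's image host to an unroutable address so that
# no images served from it will load; article text is unaffected.
0.0.0.0  upload.wikimedia.org
```

This blocks all Wikimedia-hosted images indiscriminately; the per-image and proxy-filter settings the project documents allow far finer control.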

The bottom line is that if the FBI decides the decades-old album cover is child pornography, Wikipedia will be in legal hot water if the community keeps it up—but so will any of the dozens of other individual sites that contain the Virgin Killer image. If not, the decision is in Wikipedia administrators’ hands. As far as they’re concerned, the red notification at the top of the "Talk" page is a sign that the community has spoken. Notability has outweighed offensiveness, so, like the groups that are irate about the Mohammad illustrations or the photos accompanying the entry on strippers, anyone offended by the photo will just have to steer clear.