Building Engagement and Community in the Age of Trolls


Building reader engagement is at the top of every web publisher's to-do list. Getting your users to comment and take part in online discussions is one of the best ways to keep them coming back to your site. But when you open your content to commenters, you also open it to trolls, and many very successful websites with engaged users are still struggling to find ways to foster discussion and drown out the troublemakers.

A slang term that caught on like wildfire, "troll" describes an individual who uses anonymity to stir up drama, often within a comment thread on a blog or news website. These individuals purposely sow the seeds of racism, hatred, bigotry, or anything else that might provoke online bickering among fellow commenters. Public forums provide an audience few trolls can resist: a single inflammatory post is likely to draw at least one emotional response and, the troll hopes, set off a form of cyber war.

The motivation behind each troll's actions varies. Some do it just to get a reaction out of other readers, while others have ulterior motives. Tara Rawlins, president of RAW Marketing in Missouri, was the saving grace for a client whose online reputation was completely shattered by trolling. "Some unfortunate circumstances occurred that were out of the business' control, and about ten people who were displeased would follow all of my client's social media sites and be nasty all over the page," says Rawlins. "They were blocked over and over again. The company even had to shut down and restart all social media and it still didn't work. We came on then and worked for about four months to clean up the company's online reputation because it was totally smashed. Fortunately, we were successful, and now even some of the ‘haters' are playing nice online."

Trolling is so mainstream that many popular websites are looking for ways to prevent, or at least curb, trolling behavior. In January, The Huffington Post announced the release of its Conversations feature, which pulls full, complex discussions out of story comment threads and puts them on separate pages so that readers can continue the conversation without sifting through unrelated comments. Each conversation has its own URL, giving it optimal shareability throughout the online community. HuffPost CTO John Pavley explained that algorithms do most of the work of identifying viable conversation leaders, but human moderators are also part of the equation.
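
HuffPost has not published the details of its ranking algorithm, so the following sketch is purely illustrative: it invents an engagement heuristic and promotes any sub-thread that clears a threshold to its own shareable URL. All names, fields, and numbers here are hypothetical.

```python
import re
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    replies: list = field(default_factory=list)

def engagement_score(comment):
    # Hypothetical heuristic: reward direct replies, lightly reward length.
    return 2 * len(comment.replies) + min(len(comment.text) // 100, 5)

def promote_conversations(thread, article_slug, threshold=6):
    """Give high-engagement sub-threads their own shareable URLs.
    In a HuffPost-style system, a human moderator would still review each pick."""
    promoted = []
    for comment in thread:
        if engagement_score(comment) >= threshold:
            slug = re.sub(r"[^a-z0-9]+", "-", comment.text.lower())[:40].strip("-")
            promoted.append((comment, f"/conversations/{article_slug}/{slug}"))
    return promoted
```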

Michele Linn, content development director for Content Marketing Institute, advocates a system that combines filtering algorithms with manual comment moderation. "At CMI, we use Disqus as our commenting system. It removes the obvious spam, and we also manually review every comment we receive and remove comments that are completely inappropriate if something gets by the system," she says.
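
The division of labor Linn describes, an automated first pass followed by human review of everything that gets through, maps onto a simple two-stage pipeline. The sketch below uses a crude keyword check as a stand-in for Disqus's proprietary spam detection; it is an illustration of the workflow, not Disqus's actual logic.

```python
SPAM_PATTERNS = ("buy now", "work from home", "free followers")

def looks_like_spam(text):
    # Crude keyword check standing in for Disqus's proprietary spam filter.
    lowered = text.lower()
    return any(pattern in lowered for pattern in SPAM_PATTERNS)

def moderate(incoming_comments):
    """Stage 1: drop obvious spam automatically.
    Stage 2: publish everything else, but queue it for human review."""
    review_queue = []
    for comment in incoming_comments:
        if looks_like_spam(comment):
            continue  # removed by the system, never published
        review_queue.append(comment)  # published; a human still reads it
    return review_queue
```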

More than a year ago, The New York Times implemented a system that encourages frequent commenters to behave and earn "trusted" status, exempting their comments from moderation. To earn that moderation-free status, however, readers must connect their New York Times and Facebook accounts.
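
In outline, the Times' approach is a reputation threshold. The sketch below illustrates the idea with invented field names and an invented threshold; it is not the Times' actual implementation.

```python
TRUSTED_THRESHOLD = 25  # invented value; the Times has not published one

def needs_moderation(user):
    """Trusted commenters skip the queue; everyone else waits for review."""
    is_trusted = user["facebook_linked"] and user["approved_count"] >= TRUSTED_THRESHOLD
    return not is_trusted

def record_decision(user, approved):
    # Assumption: approvals build trust, and a rejection resets it.
    user["approved_count"] = user["approved_count"] + 1 if approved else 0
```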

In fact, many sites have adopted a similar strategy in the hope that requiring commenters to attach their real names and Facebook profiles to their comments will deter malicious activity. The Manchester Evening News in the United Kingdom relaunched its site at the beginning of the year with Facebook tied directly to its commenting system. Many NBC news affiliates use Facebook for commenting, as do sites such as TechCrunch, SportingNews.com, Redbookmag.com, and Examiner.com.

While Rawlins understands the reasoning behind using Facebook comment thread plug-ins to verify a reader's legitimacy, she doesn't believe the method is effective at preventing trolling. "The people that get blocked will just create a fake Facebook account to continue with their message," she explains.

However, Jordan Kretchmer, founder and CEO of Livefyre, Inc., questions the limitations Facebook comment plug-ins put on readers. "We do not believe in having just one ID for all online conversations," he says. "In real life, people act very differently depending on the particular social or professional situation they're in, and our identities online should be no different." Kretchmer also points out that he has seen comment volume drop on sites that implement the Facebook comment application; having their real names permanently tied to their comments, he says, often intimidates readers.

Livefyre offers products that integrate social media with company websites in an effort to drive real-time conversations and community engagement. Its products give customers a number of automated and manual controls for fighting trolls. "My favorite is what we call the ‘bozo' filter that automatically hides troll-ish comments and commenters so no one else can see what they're posting," Kretchmer says. "Of course, the trolls can continue to see their own posts, so they're none the wiser."
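
This technique is better known as shadow banning, and its core logic is short enough to sketch: filter flagged comments out of every view except the author's own. The field names below are assumptions for illustration, not Livefyre's API.

```python
def visible_comments(comments, viewer):
    """Return the comments a given viewer should see. A 'bozo'-flagged
    comment is shown only to its own author, so the troll posts into
    a void without realizing it."""
    return [c for c in comments if not c["bozo"] or c["author"] == viewer]

comments = [
    {"author": "alice",   "text": "Thoughtful point.", "bozo": False},
    {"author": "troll99", "text": "flame bait",        "bozo": True},
]
print(len(visible_comments(comments, "bob")))      # 1: the troll is hidden
print(len(visible_comments(comments, "troll99")))  # 2: the troll sees itself
```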

Livefyre also provides a service, managed by the content control company Impermium, that analyzes the semantics of a new comment against a large database of existing comments to determine whether it is intentionally malicious. If a comment is judged malicious, Impermium filters it out. Kretchmer also mentioned, without specifying a date, that Livefyre will launch a new moderation panel that surfaces bad content and takes action automatically, without human involvement.
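
Impermium's analysis is proprietary, but the general pattern, scoring a new comment against a corpus of known-bad ones, can be illustrated with something as simple as token overlap. The Jaccard-similarity sketch below is a toy stand-in, not Impermium's method.

```python
def tokens(text):
    return set(text.lower().split())

def max_similarity(comment, known_bad):
    """Highest Jaccard overlap between the new comment and any comment
    already labeled malicious. A toy stand-in for semantic analysis."""
    t = tokens(comment)
    best = 0.0
    for bad in known_bad:
        b = tokens(bad)
        if t | b:
            best = max(best, len(t & b) / len(t | b))
    return best

def is_malicious(comment, known_bad, threshold=0.6):
    return max_similarity(comment, known_bad) >= threshold
```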

The battle with trolls isn't over, and the options for curbing malicious comments are only beginning to surface. In an April 2012 interview for GigaOM, Nick Denton, founder of Gawker Media, expressed his distaste for how the online commenting world works, including on his own websites. He doesn't believe that comments adequately capture the intelligence of a site's readership and is working to change the system completely.

His first attempt was much like the reward system The New York Times adopted, but Denton told GigaOM it was a mistake: "all it did was encourage social-media gurus and professional commenters to game the system in order to get rewards." While Denton did not reveal many details about the new system he's working on, he suggested a sort of comment curation platform. In the end, allowing users to establish their own comment communities within a site, where they can deny access to troublemakers, may be the way of the future. After all, if a troll falls in the forest and no one is around to hear it, does it make a sound?