For years, it’s been a standard piece of advice to anyone reading the news online: “Don’t read the comments.”
It’s no secret that user-submitted comments on news websites are often angry, racist, misogynistic, or simply ill-informed. That’s contributed to media organizations from NPR and Reuters to Popular Science and legal news site Above the Law deciding in recent years to eliminate comment sections altogether. Other sites, such as the New York Times, have taken to heavily moderating what readers post and limiting when they can do so.
But many media companies remain committed to providing forums where their readers can pose questions, contribute their wisdom, and even civilly debate with one another. In part, many in the industry see it as important to offer a place where people can discuss current events outside “the friends and family echo chamber” of social media, says Aja Bogdanoff, cofounder and CEO of Portland comment tech startup Civil.
Denying readers the opportunity to comment might drive them to other venues where “the conversation” is already taking place—social media sites like Facebook and Twitter, local networks like Nextdoor, or forums like Reddit. While some media groups have reported success with closing their comment sections and connecting with readers through social media, that trend also risks reducing reader engagement with the news outlets themselves, and disperses potentially valuable discussion about an article across the web, where it’s harder to find.
More pragmatically, many in the industry have found frequent commenters are often also among sites’ most dedicated readers, spending more time reading articles and looking at ads, and among those more likely to purchase paid subscriptions, says Greg Barber, director of digital news projects at the Washington Post.
“By othering this group—by saying these people who come and speak in the comment spaces, they’re something else and they’re undesirable—that’s actually really dangerous for news organizations,” says Barber, who is also head of strategy and partnerships at The Coral Project, a collaborative effort of the Post, the New York Times, and Mozilla to improve community engagement on news sites. “They are, in fact, your most loyal readers—these are the people who are paying the bills.”
Civil and The Coral Project are among a group of startups, established media companies, and academic projects working to develop ways for news sites to engage active readers. Many are looking to move beyond the troll-friendly environs of a traditional comment box, which can often incentivize attention-seeking commenters to take extreme positions and harass other users.
“The comment’s just a tool—it’s like walking around your house and trying to take on all your household chores with just a broom,” Barber says. “What we’ve been trying to do as an industry is, we’ve been tacking this one feature onto every article, and then we’re shocked when it doesn’t work every time.”
Organizations are experimenting with a range of approaches: streamlined human moderation and automated filtering to improve the traditional comment experience, as well as alternatives such as targeted questionnaires and quizzes that let readers interact and express themselves without opening an avenue for harassment.
“I think this is a bit of a million-dollar question right now: How can we create a space where members of the public have the opportunity to talk to each other, and it doesn’t turn into a terrible cesspool?” asks Talia Stroud, associate professor of communication studies at the University of Texas at Austin and director of the Engaging News Project at the university’s Annette Strauss Institute for Civic Life.
That project offers an embeddable quiz widget that it has found encourages readers to spend more time on articles without the partisan ballot box stuffing and scientifically dubious results of traditional online opinion polls. The quizzes—asking readers to do things like recall numbers presented in articles—can also help readers remember concrete figures better than simply reading a written piece, the project has found. (You can try one at the bottom of this article.)
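As a toy illustration of that recall-quiz format, here is a minimal sketch in Python; the question structure and grading are hypothetical and not the Engaging News Project’s actual widget, though the figure it asks about appears later in this article.

```python
# Toy recall quiz in the spirit of the Engaging News Project's widget.
# The structure and grading are hypothetical; only the quizzed figure
# comes from later in this article.
QUIZ = {
    "question": "Roughly how many comments does Disqus say it serves each day?",
    "choices": ["180,000", "1.8 million", "18 million"],
    "answer": "1.8 million",
}

def grade(choice: str) -> str:
    """Return immediate feedback, the way an embedded recall quiz might."""
    if choice == QUIZ["answer"]:
        return "Correct!"
    return f"Not quite: the article says {QUIZ['answer']}."

print(grade("1.8 million"))  # Correct!
```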
The Coral Project also offers an embeddable polling tool called Ask that enables publishers to ask readers specific questions. It gathers information they can use for additional reporting and to produce galleries out of interesting responses that can be displayed onsite.
“Using a form where the responses aren’t posted immediately creates a different tone,” says Barber.
Readers tend to respond a bit more deliberately, he says, than when their posts are immediately dropped onto an existing thread. The project is also working on a sophisticated, highly configurable tool for traditional comments called Talk, slated for a beta release early next year, Barber says.
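To make the Ask workflow a bit more concrete, here is a minimal sketch of that hold-then-curate pattern, assuming hypothetical class and method names; it is not The Coral Project’s actual Ask code, just the general idea of holding submissions for an editor instead of posting them straight to a thread.

```python
# Sketch of a hold-then-curate response form; names and structure are
# assumed for illustration, not The Coral Project's Ask implementation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Response:
    reader: str
    text: str
    approved: bool = False

@dataclass
class AskForm:
    question: str
    responses: List[Response] = field(default_factory=list)

    def submit(self, reader: str, text: str) -> None:
        """Submissions are held for review rather than posted to a live thread."""
        self.responses.append(Response(reader, text))

    def approve(self, index: int) -> None:
        """An editor marks responses worth publishing or following up on."""
        self.responses[index].approved = True

    def gallery(self) -> List[str]:
        """Only curated responses are displayed on the site."""
        return [r.text for r in self.responses if r.approved]

form = AskForm("What did this policy change mean for your neighborhood?")
form.submit("reader_a", "Our street finally got a crosswalk after years of requests.")
form.submit("reader_b", "first!!!")  # stays unpublished unless approved
form.approve(0)
print(form.gallery())
```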
About nine months ago, Civil released a next-generation comment tool called Civil Comments that is now used by about 50 sites, including the Globe and Mail, one of Canada’s most widely read newspapers. Civil Comments requires readers to review others’ comments before they can submit their own, specifying whether a particular comment is subjectively good and whether it’s “civil.”
On the back end of the system, the company’s algorithms adjust for biased moderation. Civil Comments has reduced flagging of published comments as abusive by 90% to 95% on sites where it’s deployed, substantially reducing the requirements for moderators employed by the sites, even as the number of comments submitted has often doubled or tripled, Bogdanoff says.
“In general, what we’re seeing is, you get a wider range of voices,” she says. “The old approach to commenting on news sites really prevented anybody [from participating] who didn’t just want to fight with people all day.”
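Here is a minimal sketch of how that kind of peer-review gate and bias adjustment could be wired together; the names, thresholds, and weighting scheme are assumptions for illustration, not Civil’s actual algorithms.

```python
# Hypothetical peer-review gate in the spirit of Civil Comments; the names,
# thresholds, and weighting scheme are assumptions, not Civil's algorithms.
from collections import defaultdict

REVIEWS_REQUIRED = 3     # ratings a reader must give before their own comment queues
PUBLISH_THRESHOLD = 0.6  # weighted share of "civil" votes needed to publish

class PeerReviewQueue:
    def __init__(self):
        self.votes = defaultdict(list)               # comment_id -> [(rater_id, is_civil)]
        self.reviews_given = defaultdict(int)        # rater_id -> ratings contributed
        self.reliability = defaultdict(lambda: 1.0)  # down-weights consistently biased raters

    def rate(self, rater_id, comment_id, is_civil):
        """Record one reader's civility rating of someone else's comment."""
        self.votes[comment_id].append((rater_id, is_civil))
        self.reviews_given[rater_id] += 1

    def can_submit(self, user_id):
        """Readers must review others' comments before submitting their own."""
        return self.reviews_given[user_id] >= REVIEWS_REQUIRED

    def should_publish(self, comment_id):
        """Publish when the reliability-weighted civility vote clears the threshold."""
        votes = self.votes[comment_id]
        if not votes:
            return False
        weighted_yes = sum(self.reliability[r] for r, ok in votes if ok)
        total_weight = sum(self.reliability[r] for r, _ in votes)
        return weighted_yes / total_weight >= PUBLISH_THRESHOLD

queue = PeerReviewQueue()
for cid in ("c1", "c2", "c3"):
    queue.rate("new_reader", cid, is_civil=True)
print(queue.can_submit("new_reader"))  # True: the reader has earned the right to post
```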
Other organizations are taking a more automated approach to boosting civility. Toronto startup Viafoura provides a platform that lets customers monitor user email addresses, display names, avatars, and IP addresses to weed out aggressive spammers and trolls, says Allison Munro, the company’s head of marketing and business development.
The company offers a separate tool where individual comments are filtered by smart algorithms that use a mix of configurable rules and machine learning to avoid publishing unwanted material.
“We’ll start from the beginning where we have a basic set of rules, and we can tune those to be either very intense, as in, don’t allow anything in that looks like a swear word or looks like spam, or we can adjust the levels,” says Munro. “We had one customer in Texas that had ‘redneck’ as a positive word. In New York, we had another customer that saw that as a personal attack.”
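To illustrate that kind of per-publisher tuning, here is a hedged sketch of a configurable rule layer with an adjustable strictness level and site-specific word lists; the config format, names, and scoring are hypothetical, and the machine-learning half of the pipeline is reduced to a placeholder score rather than Viafoura’s actual models.

```python
# Hypothetical config-driven comment filter; the rule format, scores, and
# site names are illustrative only, not Viafoura's actual product.
import re

DEFAULT_BLOCKLIST = {"swearword", "spamword"}  # placeholder terms

SITE_CONFIG = {
    "texas-site": {"strictness": 0.4, "allow": {"redneck"}, "block": set()},
    "ny-site":    {"strictness": 0.8, "allow": set(), "block": {"redneck"}},
}

def rule_score(text: str, site: str) -> float:
    """Score 0..1 for how 'unwanted' a comment looks under simple configurable rules."""
    cfg = SITE_CONFIG[site]
    words = set(re.findall(r"[a-z']+", text.lower()))
    blocked = (DEFAULT_BLOCKLIST | cfg["block"]) - cfg["allow"]
    keyword_hits = len(words & blocked)
    link_spam = 0.3 if text.lower().count("http") > 2 else 0.0
    return min(1.0, 0.5 * keyword_hits + link_spam)

def should_hold(text: str, site: str, ml_score: float = 0.0) -> bool:
    """Hold for moderation when rules or the (stubbed) ML score exceed the site's level."""
    threshold = 1.0 - SITE_CONFIG[site]["strictness"]
    return max(rule_score(text, site), ml_score) >= threshold

# The same word passes on one site and is held on another, per the example above.
print(should_hold("proud redneck here", "texas-site"))  # False
print(should_hold("proud redneck here", "ny-site"))     # True
```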
Customers include the Los Angeles Times, TMZ, and the Canadian Broadcasting Corp (CBC).
For the CBC, Viafoura appeared to lead to users commenting more and spending more time engaging with the site, along with a decrease in the number of comments flagged as inappropriate, Munro said in a November blog post.
Disqus, the nine-year-old San Francisco company that says it serves more than 1.8 million comments every day across 750,000 websites, this year also introduced improved features to fight spam, facilitate moderation, and block trolls.
“We’ve been hearing from Disqus users that they want better blocking controls, and I’m happy that the feature is finally here,” Disqus CEO Daniel Ha said in a June statement. “We hope that this makes it easier for people to dive into good discussions, plus we also know that it’ll really help out publishers by reducing how much moderating they have to do.”
Ultimately, quality comments will likely come as a combination of technology and human effort, says Nicholas Diakopoulos, an assistant professor of journalism at the University of Maryland, College Park.
“Commenting platforms are a classic example of a sociotechnical system: They rely not only on technology, but also people like commenters, editors, and moderators operating under a set of norms and policies that influence behavior,” he wrote in an email.
Better-quality comments can also benefit media organizations when news breaks: In those moments, thoughtful reader discussion can itself go viral, encouraging yet more discussion (and more traffic), as happened for some British publications after this year’s Brexit vote, says the Washington Post’s Barber. And they can reduce situations where user-submitted sections of a publication’s site don’t meet the typical standards of the organization, he says.
“They’re pixels on a page, just like all the other pixels on a page that we manage very carefully,” Barber says. “By creating these spaces and then ignoring them, we’re not mowing the yard and then being angry when the grass grows in a way we didn’t expect.”
Take a quick quiz we built using the Engaging News Project’s quiz tool: