Online slander can be detected early on

April 22, 2015 - 19:45 By Korea Herald
There may be some hope for reducing online trolling and flaming.

Researchers have developed an algorithm that can identify antisocial users -- those who post malicious comments online -- early in their online activity, after observing only five to 10 posts.

In a recent paper titled “Antisocial Behavior in Online Discussion Communities,” researchers at Cornell and Stanford universities examined over 40 million posts from a total of 1.7 million members over an 18-month period. They looked at three online communities: CNN’s comment section, the political site Breitbart and the gaming site IGN.

“We find that such users tend to concentrate their efforts in a small number of threads, are more likely to post irrelevantly, and are more successful at garnering responses from other users,” the researchers said.
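
The article does not describe the researchers’ actual model, but the three signals quoted above hint at how an early-warning heuristic might be built. The Python sketch below is purely illustrative: the Post fields, the equal weights and the 10-post window are assumptions, not the paper’s method.

```python
# Illustrative sketch only -- not the researchers' actual classifier.
# Combines the three signals the article mentions: thread concentration,
# off-topic posting, and responses drawn from other users.

from dataclasses import dataclass

@dataclass
class Post:
    thread_id: str
    is_off_topic: bool   # assumed to come from an upstream relevance check
    reply_count: int

def early_warning_score(posts: list[Post]) -> float:
    """Score a user's earliest posts; higher means closer to the
    banned-user profile described in the article. Weights are
    arbitrary placeholders, not values from the paper."""
    window = posts[:10]  # the study reportedly needs only 5-10 posts
    if len(window) < 5:
        return 0.0  # not enough evidence yet

    # Signal 1: activity concentrated in few distinct threads.
    concentration = 1.0 - len({p.thread_id for p in window}) / len(window)

    # Signal 2: fraction of posts judged irrelevant / off-topic.
    off_topic_rate = sum(p.is_off_topic for p in window) / len(window)

    # Signal 3: replies attracted per post, squashed into [0, 1).
    avg_replies = sum(p.reply_count for p in window) / len(window)
    engagement = avg_replies / (avg_replies + 1.0)

    return (concentration + off_topic_rate + engagement) / 3.0

# Example: five off-topic posts in a single thread, each drawing replies.
posts = [Post("t1", True, 6) for _ in range(5)]
print(f"{early_warning_score(posts):.2f}")  # high score -> flag for review
```

In practice a community would tune a threshold on such a score and route flagged accounts to human moderators rather than ban them automatically.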

To reach this conclusion, the researchers compared posts from permanently banned users with those of users who had never been banned. Temporarily banned users were excluded from the analysis.

The researchers traced the evolution of the banned users from the moment they entered an online community until they were kicked off the site, and concluded that “not only do they write worse than other users over time, but they also become increasingly less tolerated by the community.”

If users have their posts arbitrarily deleted, they are likely to write even worse in the future, the research concluded.

However, banning these users may not be the most effective way to discourage them, the researchers added. “While we present effective mechanisms for identifying and potentially weeding antisocial users out of a community, taking extreme action against small infractions can exacerbate antisocial behavior,” they said.

A better way is to give these antisocial users a chance to redeem themselves, the researchers recommended.

By Ahn Sung-mi (sahn@heraldcorp.com)