Periscope Cracks Down On Abusive Comments, Lets Random Juries Decide The Verdict
Periscope is launching a new community-based feature to get rid of abusive comments and spam from live streams in real time.
With the tool in hand, viewers can now report any comment they deem inappropriate as soon as it appears on screen during a broadcast. The app then randomly selects a group — or rather, a jury — of other viewers and tasks them with voting on whether the comment is indeed improper.
As for the penalty, a user found to be spamming or sending abusive comments will have their ability to post temporarily revoked; repeat the offense, and the mute lasts for the rest of the broadcast.
It should also be noted that whatever the verdict, the reporter will no longer see any activity from the user they flagged.
Before this development, broadcasters already had access to more or less the same function, but staying on top of every comment during a live stream isn't exactly easy.
"Broadcasters have always been able to moderate commenters in their broadcast, but we've now created a transparent tool that allows the collective actions of viewers to help moderate bad actors as well," Kayvon Beykpour, CEO and co-founder of Periscope, says.
What's more, the new voting system could cut down on unjustified reports. It has become something of a trend, for instance, to flag users simply for voicing an opinion that runs against what the majority of a given stream thinks. With the final decision left to randomly selected viewers, the number of frivolous reports should dwindle over time.
Of course, broadcasters can opt not to use the tool, and viewers can choose not to participate in comment moderation via Settings.
Availability-wise, the update is already rolling out across Android and iOS devices.