For public online communities, the time and effort it takes to keep the community free of offensive content can be daunting for community managers. Spammers employ creative tactics to gain the attention of your community members while steering clear of automated spam filters (and diligent moderators). One approach to mitigating spam is to require that every comment be screened by a moderator before it is made public; however, this practice can negatively impact legitimate contributions from community members.
Telligent provides a number of automated moderation tools to help protect against spam. But what if you could also use crowdsourcing for moderation, leveraging the wisdom of the crowd to guard against spam? With Telligent's abuse management system, you can save time and deliver a better customer experience by empowering community members to moderate content. And you can do this while still giving moderators the final decision on what content stays and what content gets removed.
The abuse management system taps into our dynamic reputation engine to determine when to automatically hide content, and it empowers content authors to challenge those decisions through a comprehensive appeals process. The following four components of the abuse management system, along with existing automatic moderation tools for spam filtering, provide a complete solution for keeping your online community clean and safe for all to benefit from.
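To make the mechanics concrete, here is a minimal sketch of how reputation-weighted abuse reporting could work. All names, the weighting scheme, and the threshold are hypothetical illustrations, not Telligent's actual implementation: the assumption is that each abuse report is weighted by the reporter's reputation, content is hidden automatically once the weighted total crosses a threshold, and a moderator makes the final call on any appeal.

```python
from dataclasses import dataclass

# Hypothetical threshold: total reputation-weighted reports needed to auto-hide.
HIDE_THRESHOLD = 3.0


@dataclass
class Content:
    author: str
    hidden: bool = False
    abuse_score: float = 0.0
    appealed: bool = False


def report_abuse(content: Content, reporter_reputation: float) -> None:
    """Weight each report by the reporter's reputation (0.0 to 1.0).

    Content is hidden automatically once the weighted score
    crosses the threshold, so trusted members carry more weight
    than brand-new (or suspicious) accounts.
    """
    content.abuse_score += reporter_reputation
    if content.abuse_score >= HIDE_THRESHOLD:
        content.hidden = True


def appeal(content: Content) -> None:
    """Author appeals a hidden item, flagging it for moderator review."""
    content.appealed = True


def moderator_decision(content: Content, keep: bool) -> None:
    """Moderators have the final say: keeping restores the content,
    rejecting the appeal leaves it hidden."""
    content.hidden = not keep
    content.appealed = False
```

In this sketch, a handful of reports from high-reputation members hides content quickly, while reports from low-reputation accounts have little effect on their own, and the appeals path keeps moderators in control of the final outcome.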
While moderating abuse may be the least enjoyable aspect of managing an online community, it is essential to community health and vitality. You can learn more about Telligent's moderation capabilities in our product documentation on Telligent.com.
Telligent Systems, Inc. ©2013, All Rights Reserved