Content moderation keeps horrible internet content away from the public -- but at what cost? What happens to the moderators tasked with sorting through the disturbing material?
Vice reports that these moderators see plenty of ugly things -- murders, animal abuse, and sexual violence. A former Facebook moderator said the work can be overwhelming at times, and there isn't enough psychological support.
A paper from the Conference on Human Factors in Computing Systems backs that up: the lack of support is a real problem, and content moderation raises *tons* of psychological health concerns.
Harvard Law agrees: these moderators go through hell to keep us safe.
But New Media Services says there's really no other way; moderation is too crucial for business -- without it, users wouldn't be safe from harmful content.
Business Insider agrees -- extremists tend to benefit from lighter content moderation, and that's bad for everybody. We need moderation.
Innodata agrees, but suggests we may not need human moderators -- AI could be a solution, though it isn't a reality yet.
TTEC is optimistic -- all we need is a solid psychological health plan, corporate accountability, and good AI tools to support workers.