Read the full commentary at EFF.org.

In all of the debate about the Stop Enabling Sex Traffickers Act (SESTA, S. 1693), one question has received surprisingly little airtime: under SESTA, what would online platforms do to protect themselves from the increased liability for their users' speech?

With the threat of overwhelming criminal and civil liability hanging over their heads, Internet platforms would likely turn to automated filtering of users’ speech in a big way. That’s bad news because when platforms rely too heavily on automated filtering, it almost always results in some voices being silenced. And the most marginalized voices in society can be the first to disappear.
