
So, what does SESTA do? Rather than make it easier for law enforcement to go after those illegal activities, it creates a new illegal activity: that of running a website that is used by sex traffickers. So, as we’ve discussed before, this creates a serious “moderator’s dilemma” for websites, leading to one of two likely outcomes. Many websites may stop moderating content, because if they’re not looking at the content, they can more credibly claim a lack of knowledge. That means less moderation, less oversight, and likely more use of those platforms for sex-trafficking ads. So, suddenly, sex traffickers will gravitate to those platforms, and those platforms will be less likely to cooperate with law enforcement because (again) they want to avoid “knowledge” of how their platform is being used, and working with law enforcement risks more knowledge.

On the flip-side of the moderator’s dilemma, you will get sites that much more vigorously moderate content. This seems to be the solution that SESTA’s supporters think all platforms will embrace — which is almost certainly incorrect. Indeed, it’s incorrect on multiple levels, because not only will just some platforms embrace this heavier moderation setup, those that do will almost certainly over-moderate to a drastic degree, in order to avoid liability. That will mean fairly aggressive levels of keyword blocking, filters, and automated removals. And, as anyone who has studied how such systems work in the real world knows, all of those will fail. And they’ll fail with both false negatives and false positives. That is, lots of perfectly legitimate content will get taken down (perhaps, as we’ve discussed before, it could be material to help victims of sex-trafficking), and lots of sex-trafficking content will still get through.
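To make the failure mode concrete, here is a minimal sketch of naive keyword-based moderation of the kind described above. The blocklist, the posts, and the matching logic are all hypothetical, chosen only to illustrate why simple keyword filters produce both kinds of errors at once:

```python
# A hypothetical sketch of naive keyword moderation, illustrating the
# false-positive / false-negative problem. Not any platform's real filter.

BLOCKLIST = {"trafficking", "escort"}

def is_blocked(post: str) -> bool:
    """Flag a post if any blocklisted keyword appears as a word in it."""
    words = post.lower().split()
    return any(term in words for term in BLOCKLIST)

# False positive: legitimate help content gets removed simply because it
# mentions a blocked keyword.
victim_resource = "resources for survivors of trafficking seeking legal aid"
assert is_blocked(victim_resource)  # taken down, despite being legitimate

# False negative: a bad actor trivially evades the filter with an
# obfuscated spelling the exact-match check never sees.
evasive_ad = "young e.s.c.o.r.t available, discreet"
assert not is_blocked(evasive_ad)  # gets through, despite being the target
```

Smarter filters (stemming, fuzzy matching, machine-learned classifiers) shift the trade-off but don’t eliminate it: loosening the match catches more evasions while sweeping in more legitimate speech, which is exactly the over-moderation dynamic described above.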

