I Was a Victim of Deepfake, And I Can't Get It Removed

Section 230 of the Communications Decency Act protects trolls who post fake videos and photos on social media. A.I. is about to make that a much bigger problem.

Carlyn Beccia

--

Photo by Darya Sannikova | Pexels

When I paid a professional photographer to create a headshot for my last book launch, I never imagined my copyrighted photo could be doctored and posted on social media without my permission. But this is the Wellsian world we created.

The doctored photo was posted anonymously on several dating sites and a misogynistic Facebook Group. The troll scraped my likeness from my publisher's author page and created a disgusting fake screenshot.

Oh, and their Facebook post accused me of a felony.

Your knee-jerk reaction might be similar to mine: doctoring someone's image and then accusing them of a felony sounds like a slam-dunk defamation case. That might have been true in the days of print libel, but social media has an ironically named legal loophole — Section 230 of the Communications Decency Act.

This act shields social media companies from liability for the content their users post. I learned this the hard way when I asked Facebook to remove the fake…

--

Carlyn Beccia

Award-winning author of 13 books. My latest: 10 AT 10: The Surprising Childhoods of 10 Remarkable People, MONSTROUS: The Lore, Gore, & Science. CarlynBeccia.com