I Was a Victim of Deepfake, And I Can't Get It Removed

Section 230 of the Communications Decency Act protects trolls who post fake videos and photos on social media. Here's why A.I. is about to make that a much bigger problem.

Carlyn Beccia
6 min read · May 19


Photo by Darya Sannikova | Pexels

When I paid a professional photographer to create a headshot for my last book launch, I never imagined my copyrighted photo could be doctored and posted on social media without my permission. But this is the Wellsian world we created.

The doctored photo was posted anonymously on several dating sites and a misogynistic Facebook Group. The troll scraped my likeness from my publisher's author page and created a disgusting fake screenshot.

Oh, and their Facebook post accused me of a felony.

Your knee-jerk reaction might be similar to mine: doctoring someone's image and then accusing them of a felony sounds like a slam-dunk defamation case. That might have been true in the days of print libel, but social media has an ironically named legal shield — Section 230 of the Communications Decency Act.

The act shields social media companies from liability for the content their users post. I learned this the hard way when I asked Facebook to remove the fake image. They refused.

When I couldn't get the photo removed for defamation, I tried copyright infringement instead by filing what's known as a "DMCA takedown" notice. I am still waiting for a response from Facebook.

Sadly, I won't be the last victim of deepfakes. Deepfakes use artificial intelligence to create image, audio, and video hoaxes. The problem has exploded with the advent of text-to-image A.I. tools. Now, anyone can steal someone's likeness and create a fake video or photo with a few simple sentences in less than five seconds.

Advertisers have even created commercials using deepfakes of celebrities — Elon Musk, Tom Cruise, and Leo DiCaprio. And let's not forget the deepfake image of a pimped-out Pope Francis that fooled everyone.

Your voice can be scraped too. Recently, a Wall Street Journal reporter created an A.I. clone to mimic her voice. Her A.I. self tricked…

Carlyn Beccia

Author & illustrator. My latest books — 10 AT 10, MONSTROUS: THE LORE, GORE, & SCIENCE, and THEY LOST THEIR HEADS. Contact: CarlynBeccia.com