If you Google my name, the phrase “revenge porn” is bound to pop up. Go ahead and do it, I’ll wait. There’s a good reason for this association: I led a protest march across the Brooklyn Bridge called “March Against Revenge Porn,” and just a few years ago, I spoke at a TEDx event about my journey from victim to activist.
But today, I’d never use those words to describe my experience. I’m not a victim of “revenge porn” — I’m the victim of child sexual abuse material, or CSAM, and image-based sexual violence, or IBSV.
And these distinctions matter. Using the correct terms is crucial to raising awareness of a problem that is still traumatizing thousands, and to getting lawmakers and tech companies to take action. Pornography is produced by consenting adults and has nothing in common with CSAM, which depicts the sexualization, rape, and assault of children and teenagers.
As I’ve come to understand how to accurately categorize my abuse, I’ve also learned more about the broader landscape of CSAM. It’s not an exaggeration to describe this as an epidemic of harm that is shattering childhoods worldwide. In 2022, the CyberTipline run by the National Center for Missing and Exploited Children, or NCMEC, received 32 million reports of CSAM. The vast majority of those reports were submitted by Google, WhatsApp, Facebook, Instagram and Omegle. (Omegle shut down in 2023.) In 2023, NCMEC says it received a record 36 million reports.
Missing from this list is Apple, and its almost entirely unmonitored iCloud.
In 2022, while other large tech companies reported and worked to remove millions of pieces of CSAM, Apple reported just 234. That’s because, unlike many of its competitors, the company refuses to voluntarily monitor for such content and flag it when it is uploaded. Reporting indicates that Apple’s disclosures are merely the tip of an iceberg of abuse happening under the surface.
When I was a teenager, nude photos of me were shared on the internet, many of which were shared on iPhones and, given their proliferation, likely stored on iCloud.
To this day, many of these photos remain online.
Like many survivors, I live with complex post-traumatic stress disorder from compounded childhood trauma — trauma that was exacerbated by the public shaming that I endured after identifying as a survivor.
In 2021, I felt hopeful when Apple announced that it was taking action to flag and remove child sexual abuse material from iCloud. Maybe this would mean my images would no longer be circulated, I remember thinking. Maybe this would be the end of the abuse that I suffered at the hands of the boys who first exploited me and the adults who found — and continue to find — sexual gratification in images of my childhood body.
But the company was criticized by privacy advocates and others. Thousands of people signed a petition against the moves. Unfortunately, rather than finding a way to address these concerns — while still monitoring for child abuse — the company simply abandoned its proposal.