Thorn

Non-profit Organizations

Manhattan Beach, CA 30,868 followers

About us

We are Thorn. Our mission of defending children from sexual exploitation and abuse is deeply embedded within our core—a shared code that drives us to do challenging work with resilience and determination. Here, you’ll work among the best hearts and minds in tech, data, and business, creating powerful products that protect children’s futures. Unleash your own formidable talents while learning among peers and growing every day. All in a supportive environment of wellness, care, and compassion. Build your career as we help build a world where every child can be safe, curious, and happy.

Website: http://www.thorn.org
Industry: Non-profit Organizations
Company size: 51-200 employees
Headquarters: Manhattan Beach, CA
Type: Nonprofit
Founded: 2012
Specialties: technology innovation and child sexual exploitation

Updates

  • There has been an alarming rise in financial #sextortion involving youth. Brand-new Thorn research, to be released on June 24, gives insights into just how pervasive this devastating form of exploitation has become. Join us Wednesday, June 26 at 3 p.m. ET for a critical conversation covering:
    • The alarming rise in financial sextortion cases
    • Insights into exploitation tactics
    • Support systems and where to find help for victims
    • How to make a difference
    Register now: https://lnkd.in/ggySADR5

  • What an incredible few days it was at the Social Innovation Summit, connecting with so many social impact leaders across sectors. Our CEO, Julie Cordua, had the honor of sharing a roadmap for innovating with technology and within our communities to combat the crisis of online child sexual abuse and exploitation. What does that look like in action?
    1️⃣ Every company with an upload button proactively designs safe spaces for children and detects child sexual abuse at scale.
    2️⃣ Parents have access to resources that help them engage in frequent, shame-free conversations with their children about staying safe online.
    3️⃣ Consumer brands leverage their platforms to raise awareness and change the conversation around this issue, which affects nearly every child in some way.
    4️⃣ Policymakers create and implement legislation that balances the need for both privacy and safety for our children online.
    5️⃣ Philanthropists recognize that tackling this threat is crucial for achieving positive long-term outcomes for children across the board.
    We can’t wait to see what ideas come to fruition for those who joined her session and how they might bring innovation to this urgent issue. Were you there? Comment below with your takeaways or questions; if you weren’t in attendance, we hope to hear your thoughts on the roadmap above. #SIS24 #SocialImpact

  • Check out some highlights from our webinar on generative AI risks to child safety for online platforms, and catch the full session on demand: https://lnkd.in/eM4dEDZZ Thorn Senior Research Manager Amanda Goharian and Vice President of Customers & Partnerships Amanda H. Volz discussed #GenerativeAI and its implications. They also explored the observed and potential risks of child sexual exploitation, aiming to prevent the exponential growth of this threat on your platform. #ChildSafety #GenerativeAI

  • In 2023, Thorn technologies detected more child sexual abuse material files online than ever before, helping to stop the viral spread of this horrific content and reduce revictimization. As images and videos of a child’s sexual abuse circulate online, sometimes for years, they continue the cycle of trauma even after that child has been rescued. By detecting CSAM, online platforms can report and work to remove those files from the internet, bringing an end to that revictimization. Last year, our comprehensive CSAM detection solution, Safer, empowered digital platforms to detect a staggering 3.8 million files of known and previously unreported CSAM online. This remarkable impact is only possible because of our caring supporters, like you. See all that you helped us accomplish last year in our 2023 Impact Report: https://lnkd.in/gDBvUu6k

  • In 2023, Thorn’s CSAM Classifier allowed more law enforcement investigators to detect CSAM faster, accelerating their ability to identify child victims and remove them from harm. Every day, these agents sift through mountains of digital evidence in child sexual abuse cases. Our CSAM Classifier significantly speeds up this painstaking process, saving investigators critical time and allowing them to identify child victims faster. Last year, we proudly launched a beta partnership with Magnet Forensics, the world leader in digital media forensics for child sexual abuse investigations. Our CSAM Classifier is now available in Magnet Griffeye, a platform used by law enforcement worldwide. Discover the full impact you helped make possible in 2023 by viewing our inspiring Impact Report: https://lnkd.in/gDBvUu6k

  • The National Center for Missing & Exploited Children recently released its 2023 CyberTipline report, capturing key trends from reports submitted by the public and digital platforms on suspected child sexual abuse material and child sexual exploitation. Two clear themes emerged from this year’s report: child sexual abuse remains a fundamentally human issue, one that technology is making significantly worse; and leading-edge technologies must be part of the solution if we’re to tackle the astonishing scale of this crisis. Learn about the dual role technology plays in the fight against child sexual abuse and exploitation, as well as more insights from the NCMEC report: https://lnkd.in/grrzzewj

  • In 2023, with your support, we empowered youth and parents to have vital, judgment-free conversations around digital safety. Our youth-centered program, NoFiltr, received over a million engagements on its social media — reaching youth with critical prevention and support messaging in a fun and informative way. Also, last year, over 10,000 parents visited our Thorn for Parents resource hub for tips on conversation starters and other tools to help them have open, honest conversations with their children to prevent abuse before it starts. See how your support helped us create these important resources, plus the rest of our accomplishments last year, in our 2023 Impact Report: https://lnkd.in/gDBvUu6k

  • In 2023, thanks to the help of our supporters, Thorn made a truly remarkable impact:
    • 1.4 million NoFiltr youth social engagements
    • 2,839 parents signed up for conversation tips
    • 32% average time savings for users of our CSAM Classifier
    • 3,833,792 child sexual abuse material files detected
    • 5 groundbreaking research reports published
    • Influenced EU child safety policy and regulation discussions
    See all that we achieved together in the full impact report at the link in our bio, and discover how Thorn’s programs and technologies are defending children from sexual abuse: https://lnkd.in/gg-Uzgpp

  • Our latest brief is here for trust & safety professionals! Empower your team with the insights and strategies to protect both youth and your platform.


Funding

Thorn: 2 total rounds
Last round: Grant, US$345.0K
See more info on Crunchbase