Tech Coalition

Civic and Social Organizations

Washington, DC 3,367 followers

Where the global tech industry comes together to fight child sexual abuse online.

About us

The Tech Coalition facilitates the global tech industry’s fight against the online sexual abuse and exploitation of children. The Coalition is an alliance of technology companies of varying sizes and sectors that work together to drive critical advances in technology and adoption of best practices for keeping children safe online. We convene and align the global tech industry, pooling our members’ knowledge and expertise, to help all our members better prevent, detect, report, and remove online child sexual abuse content. The Coalition represents a powerful core of expertise that is moving the tech industry towards a digital world where children are free to play, learn, and explore without fear of harm.

Website
https://www.technologycoalition.org
Industry
Civic and Social Organizations
Company size
2-10 employees
Headquarters
Washington, DC
Type
Nonprofit
Founded
2006


Updates

  • Tomorrow at 12:00pm ET, we are pleased to host a Tech Coalition webinar with Modulate, a provider of AI-powered voice technology solutions. They will discuss how Modulate's ToxMod system leverages artificial intelligence in a privacy-aware manner to detect and prevent instances of child exploitation and grooming in voice chat environments. Learn about the technical approaches to moderating in voice chat and other live stream environments by registering here: https://lnkd.in/e3iewaBX

  • Interested in learning more about technical solutions for combating CSAM? We invite you to attend the Tech Coalition's upcoming webinar with Videntifier on June 4 at 1:00pm ET. Videntifier will present their newest tool, Videntifier Nexus, which provides access to comprehensive hashing databases for various types of harmful content, including CSAM. Videntifier has already signed agreements with NCMEC and IWF, and is in the process of partnering with C3P and Interpol to expand its database coverage. In addition to supporting the Videntifier proprietary hash format, Videntifier Nexus supports other hash types such as MD5, PDNA, and PDQ. Register here to learn more about this solution and hear about other innovations in hashing and matching technologies: https://lnkd.in/eazscM_8 (A minimal hash-matching sketch follows after these updates.)

  • The Tech Coalition will fund new research on generative AI and online child sexual exploitation and abuse (OCSEA) through its Tech Coalition Safe Online Research Fund. The first project to be funded will be research from the University of Kent on the impact of generative AI child sexual abuse material (CSAM) proliferation, focusing on how generative AI CSAM may reshape attitudes, norms, and behaviors among people who engage with CSAM and on how the perpetration and prevention ecosystems may respond. Additional projects will be chosen by the end of the year for funding in 2025. Application details will be announced in the coming months, so stay tuned!

    The new funding was announced today at an industry briefing on generative AI with key stakeholders hosted by the Tech Coalition. This was the second briefing of its kind and took place in London with select UK child safety experts, advocates, and members of law enforcement and government. Among them were representatives from the Home Office, Internet Watch Foundation (IWF), Lucy Faithfull Foundation, the National Center for Missing & Exploited Children (NCMEC), and Safe Online, as well as 14 Tech Coalition member companies, including Adobe, Amazon, Bumble Inc., Google, Meta, Microsoft, OpenAI, Roblox, Snap Inc., and TikTok.

    These briefings are designed to develop a shared understanding of the potential risks predatory actors pose to children through the misuse of generative AI and the ways companies are currently addressing those threats, as well as to identify and initiate new opportunities for stakeholder collaboration on this issue. Read more here: https://lnkd.in/epR9RYhW

  • New research out today from one of our #SafeOnline grantees examines the "cottage industry" of online child sexual exploitation and abuse in the Philippines. It's clear that more collaboration is needed on the ground and among the financial services and tech industries. Thank you to Dublin City University, De La Salle University, and Justice and Care for this work.

    Reposted from Justice and Care:

    A groundbreaking new report on the Facilitation of Online Sexual Abuse and Exploitation of Children (OSAEC) is released today, produced by Justice and Care together with researchers from Dublin City University and De La Salle University. The issue has reached epidemic levels, is sustained by global demand, and is run like a ‘cottage industry’, our team found. Other key findings include:
    • Traffickers mentor one another, passing on advice on how to set up, enable money transfers, and attract foreign customers, who primarily come from Western countries and parts of South Asia.
    • Social media platforms, dating sites, and adult cybersex sites are being used to initially engage with foreign customers.
    • There is a huge disparity between the lengthy sentences received by traffickers in the Philippines and the treatment of foreign customers, who often go unpunished.
    Find the full report, including our recommendations to help tackle the crime, on our website: https://lnkd.in/ezNjt2VE

  • Check out our first Lantern Transparency Report and our 2023 Annual Report. The signals being shared in Lantern are producing tangible outcomes and helping to protect children from cross-platform abuses. As a result of signals shared in Lantern through December 2023, participating companies identified, confirmed, and took action on 30,989 accounts for violations of policies prohibiting child sexual exploitation and abuse. In addition, 1,293 individual uploads of child sexual exploitation or abuse material were removed, and 389 URLs/bulk uploads of child sexual exploitation and abuse material were removed. These outcomes are in addition to the enforcement actions taken by individual companies against violations on their own platforms in accordance with their established terms of service. The report also provides data about signals uploaded to and removed from Lantern. While there is more work to be done, we are encouraged by the progress to date and look forward to sharing the results of subsequent annual Lantern Transparency Reports.

    The 2023 Annual Report showcases how the Tech Coalition is helping to advance industry’s collective efforts to protect children from OCSEA. It also includes our transparency report with metrics and insights from member companies. Some highlights include:
    - We welcomed seven new member companies, bringing the total number of member companies to 37.
    - 35 of our 37 members tangibly enhanced their capacity to combat online child sexual exploitation and abuse, based on objective milestones we've established for them in five key areas.
    - For the second straight year, member companies increased their adoption of image and video-based hashing technologies.
    - 57 percent of our members now use machine learning classifiers to help detect previously unknown images of CSAM.
    We are proud of this work and look forward to making even more of an impact in 2024.

    Lantern Transparency Report: https://lnkd.in/ewFmi6gZ
    2023 Annual Report: https://lnkd.in/ebYvj37j

  • ICYMI: Meta announced that they have begun sharing more sextortion-specific signals through Lantern, our cross-platform signal-sharing program for companies to strengthen how they enforce their child safety policies. Financial sextortion has been deemed a “growing crisis” by the National Center for Missing & Exploited Children, which saw an alarming increase in CyberTipline reports related to this crime in 2023. Collaboration is necessary to protect children online, particularly from cross-platform threats like financial sextortion.
