• Maeve@kbin.earth · 7 hours ago

    I was able to pull transcripts of sale negotiations for teen girls that traffickers were engaging in on Facebook Messenger, the private messaging function. In exhibit documents, there were pictures of trafficking victims being advertised for sale in Instagram’s Stories function. Money and logistics had been discussed. In the cases we found, none of these crimes had been detected or flagged by Meta. McNamara and I contacted former contract workers who had been employed to moderate Facebook and Instagram, tasked with reporting and removing harmful content. Many were traumatised by the content they had had to review each day. All said their efforts to flag and escalate possible child trafficking on Meta platforms often went nowhere, and harmful content was rarely taken down by the company. They felt helpless, and believed Meta’s criteria for escalating possible crimes to law enforcement were too narrow.

    • VieuxQueb@lemmy.ca · 1 hour ago

      They make money from trafficking, scams, and the like: all the interactions sell ads, and the scammers even buy ads. Why would Facebook remove any of it? It would cost them money to make it stop. So fine them and make them accountable for the horrors they allow.

    • P00ptart@lemmy.world · 4 hours ago

      “Hey, we did the work, now we expect something to be done about it” is far from “look what we found! Give us a Pulitzer!”