• Swordgeek@lemmy.ca
    11 months ago

    The database started out empty. They added all of the content. The filtering should have been part of the intake process, not after the fact. Image recognition has been used to detect CP for many years now.

    They could have and should have stopped these images from getting into the dataset at all, but they didn’t. Consequently, people who were victimized as children are having the exploitative images of them used to generate new (synthetic) child porn.
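    The intake filtering described above typically works by checking each incoming image against a database of hashes of known abuse material before it is admitted. A minimal sketch in Python (the blocklist digest and function name are illustrative; real pipelines use perceptual hashes such as PhotoDNA that survive resizing and re-encoding, while a plain cryptographic hash only catches byte-identical copies):

    ```python
    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known-bad images.
    # In practice such lists are maintained by child-safety organizations.
    KNOWN_BAD_HASHES = {
        # SHA-256 of an empty byte string, used here as a stand-in entry
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def admit_to_dataset(image_bytes: bytes) -> bool:
        """Return True only if the image's hash is not on the blocklist."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest not in KNOWN_BAD_HASHES

    # The empty byte string hashes to the blocklisted digest, so it is
    # rejected; other content passes the check.
    print(admit_to_dataset(b""))       # False: on the blocklist
    print(admit_to_dataset(b"photo"))  # True: not on the blocklist
    ```

    The key point is that this check runs at ingestion time, so flagged material never enters the dataset in the first place, rather than being scrubbed after the dataset has already been distributed.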

    • Grimy@lemmy.world
      11 months ago

      They did run filters. The group that found the new images built a completely new, stronger filter that is better at detecting this material. You can’t blame them for not using technology that simply wasn’t available at the time. They also pulled the whole dataset the moment the group alerted them to the problem and removed the images.