Pictures of Canadian victims are among the thousands of images depicting child sexual abuse that an internet watchdog group found in databases used to train popular artificial image generators…
This watchdog group was able to find this content. You don’t think the producer of the database should have any responsibility for the content within it? If it’s not feasible for you to guarantee that the contents of your product are legal/ethical, maybe that’s a problem?
I’m not sure about a guarantee. That implies perfection, which is never attainable in anything. But requiring transparent evidence of due diligence is certainly doable, as are penalties for failing to meet some kind of standard.
It’s past time to institute “grading standards” on large datasets. I have in mind the same kind of statistical standards that are applied in various kinds of defect and contamination analysis. For example, nobody ever guarantees that your food is free of animal feces, only that a fair and representative sample didn’t find any.
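To make the sampling analogy concrete, here is a minimal sketch (assuming simple random sampling and a dataset large enough that sampling with replacement is a fair approximation; the function name is just illustrative) of how such a "grading standard" could set the inspection burden. It computes how many items an auditor would need to check to detect contamination at a given prevalence with a given confidence:

```python
import math

def min_sample_size(prevalence: float, confidence: float) -> int:
    """Smallest random sample size n such that, if contaminated items occur
    at the given prevalence, at least one shows up in the sample with the
    given probability. Derived from P(miss every one) = (1 - prevalence)^n."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# e.g. to be 99% confident of catching contamination affecting 0.01% of items,
# roughly 46,000 randomly drawn items would need to be inspected.
print(min_sample_size(prevalence=1e-4, confidence=0.99))
```

The point of a standard like this isn't that the sample proves the dataset is clean, only that the producer can show a defined, auditable level of diligence was applied.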
The watchdog group had to build a completely new filter to detect it.
Yes, the producers should be running all available filters, and they did. This one simply wasn’t available.