I’m not surprised unlisted content would show up. A single public or leaked link means unlisted content is discoverable outside the primary listings. YouTube can’t solve that. The private setting already exists as the alternative.
The problem with legal solutions is that they only work as far as law and prosecution reach. Maybe Western nations will agree on common policies, like they do on copyright. But will China follow? Russia? Smaller countries? And will prosecution actually happen, or even be realistically possible?
Laws are important as agreed-upon baselines, but they’re not technical guarantees. They’re quite limited on a public, accessible Internet.
While I don’t think training on hidden data, or without the author’s permission, is particularly great… won’t the next article be "AI discriminates against races/cultures/ages" once this data gets removed from the training set without being replaced by equivalent authorized photos?