- cross-posted to:
- [email protected]
- [email protected]
Really good article I read today. I was already impressed during the initial hype around image generators, but I haven't kept up with them since. Apparently they've gotten really good lately. I can't decide if I'm concerned or impressed. Do you think this will actually be used for something other than memes and misinformation? I thought I'd share it and hear your opinions.
Yeah, we're finally starting to get accountability from public officials via bodycams, and now here comes technology that will make it trivial to skew the narrative.
It's still rather easy to identify AI-generated pictures, especially of people. There's still a way to go until we get video that's good enough to be difficult to tell apart from the real thing. It's absolutely going to be a problem sooner or later, but I doubt we're anywhere near that yet.
Also, the one benefit that comes with all this is plausible deniability when you're accused of something, even if it really was you. Say you have nudes leak online, for example. You can just say they're not real, and it would be really difficult to prove otherwise.