Most people are retarded and won't be able to spot the obvious mistakes that AI still makes. Just look at how many people get fooled by terrible AI videos on Instagram.
I don't believe AI can detect AI-generated images with any reliability. If Trump really did this, the AI would say the same thing unless it was some huge news story it had become aware of.
I see two blatant "AI" (LLM) image generation errors, plus three lesser tells.
I could fix all of them in Photoshop, but then you'd be able to spot the Photoshop edits instead. Sorry, the tech's just not there yet.
Now, shit like this will definitely fool normies who can't be bothered to read a two-minute primer on spotting tells like the phone or the incorrect grip. But they already don't run basic photo edit checks, and they don't believe people who do if the content is something they want to believe.
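For anyone wondering what a "basic photo edit check" even looks like: the cheapest one is just seeing whether the file still carries camera EXIF data. Real phone photos almost always have an Exif segment; AI generators and most re-encode pipelines strip it, so its absence is a weak tell (not proof). A rough pure-stdlib Python sketch — the function name and the simplified JPEG marker parsing are mine, not from this thread:

```python
# Rough sketch: check whether a JPEG byte stream still carries an
# APP1/Exif segment. Absence is consistent with AI output or a
# re-encode, but is NOT proof of fakery on its own.
def has_exif(jpeg_bytes: bytes) -> bool:
    if jpeg_bytes[:2] != b"\xff\xd8":        # missing SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:            # lost marker sync, bail out
            return False
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:                   # start-of-scan: no Exif before pixel data
            return False
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if length < 2:                       # malformed segment length
            return False
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True                      # found the APP1 Exif header
        i += 2 + length                      # skip to the next segment
    return False
```

Run it on the raw bytes of the saved file; a picture that's supposedly "straight off someone's phone" but has zero EXIF deserves a harder look.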
Listen images haven't been able to be trusted for many many decades. Although right now it's easier for the average person to make an image that looks very real people have been making fake images and editing images manually with analog light and darkroom processing for decades. The Soviet Union was rather famous for cutting former officials out of old photographs when they had fallen out of favor. Of course one of the Jews favorite pictures of the Holocaust with the very skinny man standing next to all the people in the bumps was completely faked and the image of the man was double exposed or however they did it in the dark room into that image and you can see the first image where he doesn't appear
So you really haven't been able to trust photos for many decades now, at least not when a big enough organization was willing to spend the skilled resources to fake or alter one.
The only difference now is that it's extremely easy for the average Joe to do it in five seconds.
The neckline at higher resolution has a very obvious clip-and-paste artifact. It's not terrible; it looks like the AI did a cutout of a reference image, cleaner than a human would manage.
drstrangergov 7 points Mar 26, 2025 01:11:37 (+7/-0)
Niggly_Puff [op] 3 points Mar 25, 2025 22:07:30 (+3/-0)
boekanier 1 point Mar 26, 2025 06:34:50 (+1/-0)
AngryWhiteKeyboardWarrior 1 point Mar 26, 2025 06:00:31 (+1/-0)
VitaminSieg 1 point Mar 26, 2025 15:04:47 (+1/-0)
GeorgeBailey 1 point Mar 26, 2025 04:37:16 (+1/-0)
https://files.catbox.moe/6dre9g.png
Niggly_Puff [op] 1 point Mar 26, 2025 05:01:30 (+1/-0)
BMN003 0 points Mar 26, 2025 10:21:54 (+0/-0)
Crackinjokes 0 points Mar 26, 2025 03:27:45 (+0/-0)
dassar 0 points Mar 26, 2025 01:02:20 (+0/-0)
puremadness 0 points Mar 26, 2025 00:19:31 (+0/-0)