I thought by now we’d have seen a fuckton of celebrity deepfake nudes and rule 34 porn of every variety, plus apps that let incels create it from pics of their high school crushes/enemies, but it seems like that tidal wave hasn’t hit yet.
Or perhaps the legal protections arrived just in time to discourage those with the know-how
Great, I’m sure nothing bad is going to come out of this, like everything else deepfake-related.
ugh I guess we will just learn to operate under total uncertainty. online media literacy classes for kids have become an urgent need.
It’s been urgent since the beginning of the rise of electronic mass media a hundred years ago. I’m sure we’ll get to it some day.
I had a propaganda-detection workshop when I was a kid at a school in the US. It’s not mandatory everywhere, but we do teach these things.
And I don’t mean this in any mean way, but how much propaganda did you learn to detect? Like understanding that the American Dream is propaganda?
Me? I thought the class was dumb because it was super obvious. But I’m inherently skeptical, and I do think it’s important to have for most people who don’t think critically.
I can’t remember the details, but I suspect it was things like who wrote it? Are the claims cited? Who are they citing? Is it peer reviewed? What is the author trying to convey? What type of language is being used? Who is the target audience? Etc
Still better than nothing. Although most of those seem to be training people that “who said it” is how the truthfulness of a statement should be judged, which is exactly backwards.
The value of “who said it” is to help you recognize their motivation. Anyone trying to convince you of something has a vested interest in their position. Understanding the speaker is critical in understanding their position.
The point is to understand the concept of credibility. Who said it matters. Some people have a demonstrated history of credibility. Some people have a demonstrated history of incredibility.
Should be mandatory everywhere, and repeated every couple of years throughout education, because it evolves very fast.
Presenter: claims there is no 20-minute process to make a video
Presenter: makes a 20-minute presentation on how to get it set up
On a more serious note, this is fucking terrifying.
Ugh. God…dammit. Now we all have to suffer through months or years of this dumb shit.
We can always do the “nothing beyond my immediate observation can be trusted” method like the flat earthers do
“Were you there?” shudders with rage
No.
Fuck you for bringing that bullshit up though.
I’m starting to live my life like that now, and it’s awesome!
This is going to make politics so much more insufferable.
Not really, since in this specific context it only works with live feeds. I was more talking about the technoweenies finding low-hanging fruit in the deepfake world to make life more miserable or annoying for people. Scammers, YouTubers, teenage family members…all of them will be annoying everyone with this specific tool any day now.
Neat, I wonder who I’ll be in tomorrow’s morning stand up meeting
Shouldn’t have shared your face online publicly I guess.
that let’s you deepfake
that lets* you deepfake
That’s my job
Thank you for your service, ociffer.
Another reason to not post pictures of yourself online unless absolutely necessary. Also upload them in really low quality.
Always Photoshop on a sixth finger.
And a third penis
No, let him cook.
It’s one thing to show it off; it’s a whole other thing to actually show people how to set it up.
It’s literally on GitHub; it’s way out of the bag.
But Muta, they’ve been able to do this for a couple of years already thanks to AI.
Besides the joke above, should the government ban deepfake porn AI or the undress AIs? I’m surprised it hasn’t been banned in many places yet, since you can take stock photos of many models and put them up there. It should 100% be illegal, as there’s no consent.
Is this not how people have been making deep fakes already? What’s different here?
It’s the fact that it’s near real time now; older techniques on older hardware needed much longer.
This is real time and based on a single image.
Deepfakes up to this point have generally not been real time; they’re generally trained on the source and then, with different methods, applied to the video. Say the final video is Kermit the Frog doing a dance, but it’s been deepfaked to look like Miss Piggy.
There are tons of examples of AI that post-processes deepfakes. This is one of the few real-time ones: you can link it to a webcam, give it a single photo, and you are the deepfake.
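For the curious, here’s a rough sketch of what that pipeline looks like under the hood, assuming something like insightface’s face analyzer and the inswapper model plus an OpenCV webcam loop (the model path, filenames, and details are my assumptions, not the actual repo’s code): analyze the source photo once, then paste that identity onto every webcam frame.

```python
# Minimal sketch of a "one photo -> live webcam deepfake" loop.
# Assumes insightface + OpenCV are installed and the inswapper_128.onnx
# model file is already downloaded locally (the path is an assumption).
import cv2
import insightface
from insightface.app import FaceAnalysis

# Face detector/embedder: finds faces and extracts identity features.
analyzer = FaceAnalysis(name="buffalo_l")
analyzer.prepare(ctx_id=0, det_size=(640, 640))

# Face-swap model.
swapper = insightface.model_zoo.get_model("inswapper_128.onnx")

# The "single photo" part: analyze the source face exactly once.
source_img = cv2.imread("source_face.jpg")
source_face = analyzer.get(source_img)[0]

cap = cv2.VideoCapture(0)  # webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Swap the source identity onto every face found in the live frame.
    for target_face in analyzer.get(frame):
        frame = swapper.get(frame, target_face, source_face, paste_back=True)
    cv2.imshow("live swap (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The heavy lifting is all in the pretrained models; with a decent GPU the per-frame swap is fast enough to feel live, which is the whole scary part.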
From my understanding, that hasn’t been done yet, at least not in the AI spaces I’ve been part of.
I guess I’m going to show up to my next meeting with my boss as my boss.
Omg that’s a great idea for my next 1 on 1 lol