Do you think it's ethical to look at deepfake porn?
#1
I for one enjoy deepfake porn, but it always bothers me knowing people didn't consent to having their face used in porn.



[+] 1 user Likes firehi159's post
#2
I don't see why not. It's just fantasy anyway.



#3
I guess in a way it's not all that different from a lot of the R34 stuff. Before and after deepfakes, it's safe to assume that if it exists, there's porn of it.



#4
I agree with you



#5
What if, in the future, the program gets more mature and you don't have to do it with just celebrities anymore? Wouldn't it be creepy if you could do it to your hot friend or crush? I think it's the same idea.



#6
(21-10-2020, 03:20 AM)firehi159 Wrote: What if, in the future, the program gets more mature and you don't have to do it with just celebrities anymore? Wouldn't it be creepy if you could do it to your hot friend or crush? I think it's the same idea.

You already can do it to your hot friend or crush, assuming you can get enough footage of them for the deepfake AI to learn their face.



#7
Do I think it's ethical? Not in the slightest. Will that stop me from secretly watching and enjoying it anyway? Also not in the slightest.



[+] 2 users Like tetoffensive's post
#8
I agree with tetoffensive. I can't get into the deepfake videos; they look too fake to me. I did grab a copy of DeepNude before the site was killed, and use it on friends from time to time. Usually they come out as complete garbage, but sometimes there's a winner.



[+] 1 user Likes Fwanktastic's post
#9
From what I've seen, the footage has to be of good quality and quantity to really make it work. Too many people seem to be pumping out deepfakes without enough of one, the other, or both. The end result is a crappy, pixelated face that doesn't match anything, and it kills the mood more than anything else.

As far as those little edit programs like DeepNude go, I agree that they tend to be absolute garbage. I fiddled around with that on actual people I've seen naked, whether celebs or more personal, and yeah, every now and then you might get one that looks somewhat decent, but the rest... can't do it.



[+] 1 user Likes tetoffensive's post
#10
Proportions are important too. You could have quality content and a good amount of it, but putting DDs on Ariana Grande just doesn't work and is an immediate turn-off.



#11
Extremely borderline. I'm not 100% sure where I fall on the issue.



#12
I've seen a few recently where it's difficult to tell if it's the real person or not.



#13
Hmm, never really seen any good ones. Any suggestions?



#14
It's so easy to tell it's fake, although they are getting better.

I think there's a good Natalie Portman one out there, better than the others I've seen.



#15
Does it really matter, though? People have been creeping on instaslut and Facebook for years at this point. And before any of that, they just creeped on whatever photos they managed to acquire. Is this really any different at all? Now there's less imagination and more number crunching before the imagining starts. It is creepy, yes. But does it really hurt anybody? If not, the ethics question isn't legitimate.



#16
If it gets to a level where it's genuinely hard to tell them apart, then we'd need to revisit this topic, I reckon.



#17
Most deepfakes are easy to discern just by looking at the rest of the proportions and the countenance. It's still grossly unethical, but as long as it's labeled as 'Deepfake', it shouldn't be that big of an issue, for now at least.



#18
There was a fairly convincing one on Pornhub recently of Emilia Clarke, but it was actually Naomi Woods. The illusion was great visually, but not so much when "Emilia" started talking with an American accent.

Unethical? I'd agree that when these get so good that actual people have to deny it's them, we should revisit this topic, but right now it's not so much 'deep fake' as deeply sad that some people out there spend so much time doing weird stuff like this.