AI is getting better at generating porn (techcrunch.com)
104 points by mmastrac on Sept 3, 2022 | 125 comments


Creator of pornpen.ai here. Looks like this discussion is ramping up again! I wanted to respond to some of the points made here and in the article.

The grotesque images: Simply put, the model is not good enough yet. This will improve over time as the models get better.

Diversity/male gaze/heteronormative: Generating men and other genders did not yield very high-quality results, unfortunately, so I didn't include it in the initial launch. I am also manually fine-tuning the prompts, so it was easier to pick one category (women) and go with it. I hope to expand this in the future.

Sex workers: I do agree with the quote in the article that this will not replace sex workers (at least for now) in the same way that I don't think dalle/midjourney/stable diffusion will replace illustrators. In the future, I believe this technology could create a new category of sex worker that can generate content without using their own body (similar to VTubers).

It's harder to say what will happen in the future though. If these models keep improving at this pace, I think these types of discussions will be more important. Let me know if you have any questions!


I'm just happy for it to take 5 minutes to retrieve a single picture again.


Don't fret, AIDungeon already had its run, with plenty of slash fiction in its dataset.


What are your thoughts on also releasing the prompts and maybe even the seed for each image? I assume it wouldn't be much work and would allow people to fine-tune them.
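
For context: with these models, the prompt plus the seed (and sampler settings) fully determine the output, so publishing them would let anyone reproduce or tweak an image. A minimal sketch of what I mean, using the diffusers library; the checkpoint name, prompt, and settings are just illustrative:

    import torch
    from diffusers import StableDiffusionPipeline

    # Load a text-to-image pipeline (illustrative checkpoint name).
    pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")
    pipe = pipe.to("cuda")

    prompt = "a tasteful nude portrait, studio lighting"  # hypothetical prompt
    seed = 1234

    # Fixing the RNG seed makes the same prompt reproduce the same image,
    # which is what publishing (prompt, seed) pairs would enable.
    generator = torch.Generator("cuda").manual_seed(seed)
    image = pipe(prompt, generator=generator).images[0]
    image.save("out.png")

Anyone rerunning this with the same prompt, seed, and settings gets an identical image, so users could iterate on published prompts instead of starting from scratch.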


Why are the default tags babe, busty, perfect boobs, skinny, and perfect body?


Good question. I figured that many people who come to the site will just press generate right away without realizing you can choose tags. When 0 tags are selected, you get weird results. So I wanted to make a good default selection. But maybe a better option is to force users to select some tags first, and start empty?


FWIW, I think your initial idea is the better one. Like you said, people will just want to hit Generate to see what happens. Modals in the way will just annoy. Maybe just randomizing the start tags, for variety in the stream?
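
Something like this, as a rough sketch (the tag pool is made up):

    import random

    # Hypothetical tag pool; the real site's tag list would go here.
    ALL_TAGS = ["babe", "blonde", "outdoors", "lingerie", "smiling"]

    def default_tags(k=3):
        # Pick a few random starter tags so users who immediately hit
        # Generate see variety instead of the same fixed defaults.
        return random.sample(ALL_TAGS, k)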


I think the term “perfect” is a bit abstract/subjective as a qualitative assessment. Perhaps describe more objectively what type of body it is?


Especially given the number of limbs --- and other appendages --- that sometimes appear!


It's the Tyranid kind of perfection ;-)


But why specifically these tags? You avoided answering that. :)


I’ll guess because those are popular porn categories. What would you prefer they be?


Three ass cheeks are indeed perfect, right up there with the three-boobed bar hooker from Mars.


> On Y Combinator’s Hacker News forum, a user purporting to be the creator describes Porn Pen as an “experiment” using cutting-edge text-to-image models. “I explicitly removed the ability to specify custom text to avoid harmful imagery from being generated,” they wrote. “New tags will be added once the prompt-engineering algorithm is fine-tuned further.” The creator did not respond to TechCrunch’s request for comment.

Scraping an HN comment to post on an article just to get submitted back onto HN for discussion, classic.


I’m more surprised by the apparent editing error where the only sex worker quoted is characterized as female throughout the piece except in one instance. I think the spot the editors missed was meant to provide greater anonymity by default. I doubt it harmed the particular interviewee in this instance, but that kind of slippery protection of sources certainly won’t encourage more sex workers to talk to them or trust their process. Never mind that their single source is already unrepresentative of a very diverse range of attitudes among sex workers.


Nearly all sex workers (97% in stats I found) are cisgender or transgender women, so that in itself doesn’t seem to significantly de-anonymize them.


The editing mistake wasn’t deciding whether to identify the interviewee’s gender, but revealing part of their internal privacy process by missing it in edit. It doesn’t matter, for being interviewed, what proportion of sex workers are any gender if the publication reveals flaws in how they protect anonymity of the people they interview. I spotted this with a very cursory read, and distrust the publication on instinct. Again not because I care about the gender of the single sex worker interviewed, but because I don’t think their editorial process was thorough enough to protect one source.


Ah, thanks for the explanation. From an editorial POV, what makes you think it was a mistake?


Just seeing several “she”, with direct references to the interviewee, and one “they” that felt really out of place and didn’t feel like a fully formed section of the article.


> But it’s still depressing how both the options and defaults replicate a very heteronormative and male gaze

I don’t understand this particular aspect of fretting. Do the people making such comments think this won’t eventually be applied to all sexual preferences? Given that the heteronormative male gaze drives the vast majority of porn consumption, it doesn't seem surprising that that’s where this kind of thing starts.

The bigger concern to me by far is revenge porn and use by children, and that has nothing to do with gender.


Making comments like this is just obligatory throat-clearing now, especially if you're in academia, which the quote source is. At this point I just fnord right over them and move on.


Learned a new word today

https://en.wikipedia.org/wiki/Fnord


That's either a healthy attitude that I lack or a form of defeatism that I lack, I am not sure which. I love the turn of phrase though.


This is what happens when virtue signaling takes precedence over reality.

Did the person who is so depressed over this seriously think it would be male nudes specifically targeted at a female audience that would first make it into production?


Moreover, it's intellectually lazy. Male genitals being external means they're legitimately harder to draw, and we know how much these AIs struggle with intricate anatomy like hands.

Super disingenuous woke virtue signalling.


Especially since the author of pornpen states he just hasn’t gotten around to it. It’s virtue signaling without giving the author the benefit of the doubt; picking an angle for your clickbait needs like that is truly lazy.


I predict that we're not going to hear any complaints from the woke types when all the AI-generated images of men are 6'3 with 10% body fat and 9 inch cocks.


Exactly. Woke culture never seems to mention women's preference for tall men, which is actually a lot worse than men's preference for skinnier women, as the former is not something a person can change.


> Woke culture never seems to mention women's preference for tall men,

“Woke culture” isn’t an actual thing; it's a propaganda term that applies to a fiction deliberately created for partisan political purposes.

But the real-world social justice and DEI communities do in fact talk a fair amount about height bias as an aspect of physical-appearance discrimination, and it is by no means limited to women, though that may be where it shows up most in mate selection (it's very prominent, outside of mate selection, as a bias in assessments of competency across the board, and shows up a lot in the workplace).


That's true. A lot of people conflate the aggressive, misandrist, self-described (but fake) feminists who adopt that label for clout and status and who are prominent in high culture with the actual, lower-profile social justice folks who often criticize things like the cultural preference for height. Amber Heard is a good example of the former: a clear narcissist who uses "short" as a pejorative against men and adopts the feminist label for clout.


If woke culture is not a thing, then what is the word to describe what you know that I’m talking about?


> If woke culture is not a thing, then what is the word to describe what you know that I’m talking about?

I don't know what you are talking about. I assume, from both the term and the claims you make about it, that you are using “woke culture” in its standard use, as a term for a thing which exists only in right-wing propaganda and the minds of those influenced by it, in which case it is not a terminology problem, but a problem of understanding reality.


> Do the people making such comments think this won’t be applied eventually to all sexual preferences?

I’d see two aspects of this:

- it cements the status quo. This technology isn’t bound by the availability of actors, real-life societal issues, etc., so I understand the disappointment when it follows exactly the same lines as the current industry.

And a lot of people aren’t happy with the current industry, to say the least.

- there’s a decent chance that, yes, this technology won’t be applied to other genres for a long time. Satisfying the majority first means the needs of minorities get pushed way down the priority list. We’ll probably see incremental technical improvements before inclusiveness improvements, unless enough members of those minorities become core participants in the development.


> This technology isn’t bound by the availability of actors, real-life societal issues, etc., so I understand the disappointment when it follows exactly the same lines as the current industry.

It will follow the same lines as the current industry because it is the almost literal (if very complex and sophisticated) average of the current industry's output.

An unfortunate quality of AI simulation output is that it "freezes" the structure of its training set, maintaining societal preferences exactly as-is.


I guess capitalism (or really: resource optimization) is often incompatible with social equality?

You’re going to optimize for a proven market. Indeed, it’s absolutely plausible that the existence of that market prevents other possible markets from emerging. But how do you get an organization with limited resources to make the less optimal choice?

I guess we need government AI porn.


On porn, I'd argue it's probably the other way round as well, creating a cycle. Traditional norms dictated where the money went, and people wanting to chase that money now repeat the process, even as norms have shifted and they could address a bigger market.

There is of course a case to make for places with religious laws, but it doesn't look like the AI devs are thinking about that at this point.


Hey man, I get arguments about stuff like access to equal education and police brutality as real problems. But are you really trying to make the case that unequal access to porn is a social woe or oppressive?


I wasn't going to these lengths in my comment, but yes I actually think porn is pretty important to our society in many many ways.

It depends on your country, but most western societies have piss-poor sex education. I think it has gotten a lot better, but we still mostly expect parents to have "the talk" (as if it were just a single conversation...). There are unlimited amounts of media touching on relationship dynamics, but comparatively way fewer touching on sex itself, as we commercially punish that depiction.

What we have left to cover that is porn.

So yes, I think what porn touches on, what's depicted, and having a broad landscape matching as many orientations as we can accept actually matters. Porn restrictions are bound to a society's moral compass. I don't know if improving porn can bias things in a better direction as well, but at least we could try.


I downloaded the latest Stable Diffusion model yesterday evening, and I've been playing with it all day.

The current model isn't very good at portraying any specific sexual act - but I imagine that will change soon enough.

It's already great at producing "tasteful nudes", which fortunately aligns well with my personal tastes.

Here are some of the highlights from my tinkering (NSFW, obviously)

https://imgur.com/a/S81Vom6

(some images have received minor touch-ups, such as cropping or hiding mangled hands - it's not very good at those)


Using this emerging generation of text-to-image AI generators to make 'tasteful nudes' seems to me a bit like flying a rocket to the moon and then playing golf. We have unlocked whole new frontiers of gratification.

How about 'Tasty nudes' https://imgur.com/a/Q6QSfAA


> Other times, as alluded to earlier, it shows physically improbable models, typically with extra limbs, nipples in unusual places and contorted flesh.

> By definition [these systems are] going to represent those whose bodies are accepted and valued in mainstream society

What an ironically contradictory juxtaposition.


I mean if you want a picture of Unreal Engine Medusa with dicks for hair instead of snakes and toes instead of thumbs, you are in luck.

But no, really. Go join the Unstable Diffusion discord if you are brave enough for the body horror. This is not your grandpappy’s porn stash.


There's some funny shit that comes out of it too, though, like someone sent me a pic of this stripper-looking chick with a whole-ass head and neck growing out of her.

edit: found it https://cdn.discordapp.com/attachments/1010982959525929010/1...


...along with a hand which probably belongs to that head, next to her right leg.

Almost looks like what happens in 3D games when collision detection doesn't work.


The more I see AI generated porn, the more I crave a genre of porn that has badly rendered humans or anatomical structures. It just always looks so new and interesting compared to the usual crap you always see. A true game changer.


Imagine uncensoring JAV with AI, what a time to be alive!


This has existed for a few years, see https://site-1717195-6840-9526.mystrikingly.com/.

Not with great results though I’d say.


I wonder if this could help pedophiles, since generated porn wouldn't imply abuse of children.


I think this is a very interesting question that is being systematically sidestepped in these debates.

Sexual preferences are a given; some people are attracted to the opposite sex, some people are attracted to the same sex, some people are attracted to children. There is nothing they (or anyone) can do about it, unless one decides to indulge in so-called "conversion therapy".

Of course, children can't consent, and abusing children is horrible. And child porn is horrible because it implies children have been abused to produce it.

But AI-generated child porn, however nauseating it might sound, doesn't hurt anybody.


>doesn't hurt anybody

You're looking at it too narrowly. AI-generated CSAM may facilitate a sexual interest in children among certain individuals. Development of such an interest makes it significantly more likely they will abuse children. So it ought to be illegal.


Does watching gay porn make you gay too?

Why on Earth would any normal person specifically look for CSAM material, be it AI-generated or not? I doubt that many people who have seen something sickening on the internet immediately became attracted to it, and a huge percentage of people who have used the internet for 10+ years have seen all kinds of terrible things. Unfortunately, anyone who has hosted or moderated online communities has had to deal with a lot of terrible things.

Also, we have a whole army of YouTube / Facebook / Twitter moderators who actually see CSAM daily, and we don't have stories about every one of them turning into a child abuser. Though it's proven that this kind of job can make a person mentally ill, that's for completely different reasons.

PS: Again, be it AI-generated or not, I'm totally sure that CSAM should stay banned from sharing. But I don't give a damn about what you paint, generate, and watch on your own PC.


The idea is that viewing AI-generated CSAM by someone who is already a pedophile will increase the probability that they'll decide to go out and abuse a child. Or, it'll foster a cultural acceptance of child abuse, leading people to be more confident to go out and do it. There's no actual data that demonstrates this, given it's hard to study, but it's not implausible.

One could make the opposite argument, too. If you give pedophiles AI-generated CSAM, it reduces the demand for real CSAM, thereby reducing production of real-world CSAM, thereby reducing child abuse. Moreover it provides pedophiles their fix without them having to actually abuse a child.

In the end these are all armchair speculations made from our intuitions about individual psychology and group psychology.


We can speculate all we want. The fact is: it's impossible to apply any real restrictions to an ML model. Just like any DRM, this one is going to be easy to bypass; it will only degrade the experience of legit users and slow down the technology's advancement.

We don't require image editors to have CSAM filters, and requiring a filter for an ML model is madness.


You're talking about filters inside the ML model, but I think the more interesting question is morality and the law.


"The law" is kind of irrelevant unless it's enforced, so it really boils down to morality.

There are two separate questions to be answered here:

1. Is it immoral?

2. Is it sufficiently immoral to warrant imposing my own moral standard upon others, by force if necessary? i.e. is responding to or preventing it with violence itself moral?

Morality is subjective and relative, which is why it's interesting to ask both questions: given the subjectivity and relativity of morality, at what point does your own moral code take precedence over those of others?


If there is no technical means to enforce the law, there is no law. When it comes to the war on drugs, at least there's some physical thing being traded. These models would generate such imagery and then immediately destroy it.

This will be a bigger disaster than the war on drugs, because it's almost impossible to provide proof. So what's left is convicting people based on oral testimony alone ...


I am personally on the conservative side here, so I'm not questioning the morality and the law we have now. I'm very much against the spreading of such material regardless of how it was created. If someone decides to publicly share something like that, there should be consequences.

At the same time, I think everyone should be free to write / paint / generate whatever material they want on their own computer, within the comfort of their own house. As long as no one gets hurt, I don't give a damn.


> The fact is: it's impossible to apply any real restrictions to an ML model

What? Of course you can curate what you feed it and apply filters when you train your ML model.

I am sure it is currently easier to train a model with legal sex content and without any children

Also, the law is still the same: possessing and distributing pedophilic content is illegal.


This is not how ML works. You don't need real child abuse material in your training set to make a model capable of creating CSAM. All you need is depictions of any children and any sexual content.


That is in line with what I said

Most publicly available sources of sexual content are adult, legal, and don't include children.

So if you feed Pornhub to your model (and most porn is actually well tagged, as is much ML training data), you really have to go out of your way to get any children in there.

I'm not saying it's not possible; I'm saying it's easy to avoid. It's not the impossibility of separating the two that you suggested above.
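
To make "easy to avoid" concrete, here's a naive sketch of the tag-based curation I mean, assuming a JSON-lines manifest with a `tags` field per sample (the file name and blocklist are invented for illustration; real curation would also need classifiers, since tags can be wrong or missing):

    import json

    # Illustrative blocklist; real pipelines would also run age/content classifiers.
    BLOCKLIST = {"teen", "child", "young-looking"}

    def keep(record):
        # Drop any training sample whose tags intersect the blocklist.
        return not (set(record["tags"]) & BLOCKLIST)

    with open("dataset.jsonl") as f:  # hypothetical dataset manifest
        records = [json.loads(line) for line in f]

    filtered = [r for r in records if keep(r)]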


Nope. You missed the point. I said the opposite.

If your training data has any absolutely innocent pictures of children at all, and also NSFW content with adults, then the model will be able to generate NSFW content with children. It's that simple.

Also, even if you remove children completely from the training data, the model will still be able to generate CSAM-like content, because a lot of legit porn includes jailbait-looking models.


> There's no actual data that demonstrates this, given it's hard to study, but it's not implausible.

Actually, there are studies suggesting that giving known child molesters access to child porn cartoons reduces recidivism.

Decent review of some of the evidence:

https://en.wikipedia.org/wiki/Relationship_between_child_por...


Is this backed empirically? The argument you made is almost identical to the ones used for "grand theft auto is making kids violent!".


I think that does not apply. The analogous question would be: if criminals in prison were given GTA, would they have a better or worse chance of rehabilitation?


There's no reason to make such material lawful unless the converse is proved by empirical evidence. That is, can it be proved that people who view AI-based material are less likely to act on a sexual interest in real children? Unless that's proven, there's no other reason to make such material lawful. You're welcome to provide one.

The analogy is not apt. You cannot compare image-based abuse of children with a video game. GTA is a legitimate form of entertainment, child abuse material is not. They are not remotely similar. This shows why analogies are almost never helpful and it's best just to focus on the topic under discussion.


We live in a society where things are allowed by default and banned if they cause harm, not a society where things are banned by default and allowed if they are proven to not cause harm.


Sadly, these days one can only hope that we do.


I doubt anyone on HN would argue for making AI-generated CSAM lawful. At the same time, attempts to add unenforceable limitations into ML technology, like ClosedAI is trying to do, are worse than useless.

You can generate whatever you want on your own PC, but if you try to use it for fraud or to share illegal material then you are a criminal.


Legality of fictional child pornography is surprisingly contentious, with positions ranging from "protected as artwork under freedom of expression", to "illegal if its provenance involves child abuse within the jurisdiction", to "illegal, but non-realistic depictions don't count", to "depictions of children are illegal, regardless of provenance", to "depictions of adults who could be mistaken for children are also illegal", and that's just among first-world countries. "AI-generated CSAM" ranges from obviously unlawful to an oxymoron depending on the standard applied!


You are making a bunch of assumptions that, if true, are a strong argument for banning it.

In reality, if it is the case that this will satisfy some people's desires, then it may be the case that fewer children are abused this way, in which case it would be a bad idea to ban it.


This sounds a lot like the rather-thoroughly-debunked "violent video games make people violent" argument.


I don’t think the idea that all sexual preferences are innate, really holds much water.

There’s no genetic or development-while-in-womb explanation for why some people are specifically sexually into the idea of someone taking on the properties of a giant blueberry.

What is a fetish? It is the scar that guides the knife.


One particular sexual preference has children as a result ... and of course we all start as children. The first sexual preference, the first sexual relation anyone is confronted with in their life is heterosexual, simply because that's how childbirth works.

People are deeply uncomfortable with it, especially these days, but of course child-rearing families are the place where nature actually intends there to be sex, where there's almost guaranteed to be sexual activity. Probably the younger the child, the better the odds of sex. And, of course, this is not bad at all: that's how human society has worked for hundreds of thousands of years, and how every animal society has worked for a billion or so years.

This probably "gives an advantage" to that particular sexual preference. You learn sexual behaviours, and the first ones you'll learn are heterosexual ones. There's nothing wrong with that ... or at least I don't think there is.


> But AI-generated child porn, however nauseating it might sound, doesn't hurt anybody.

I suppose it must have been trained on something? Are you 100% sure it's cool to use the sum of all the images of all the children to generate child porn? Also there might be accidental resemblances to particular living children in the produced images, is that ok?


I think most people actually doing something with this technology will be making the maker argument:

"Unless you have real-world means to make your opinions reality ... I don't care about your opinions"

Which goes both in the positive and the negative. Both "show it working or you can't do it" and the negative "prevent me from making it working or shut up". You have zero means to prevent such use of the technology ...

Add to that trillions of dollars in reasons not to outlaw the technology itself.

Attempts to enforce this will be worse, much worse, than the war on drugs. Plus, frankly, drugs do a lot more harm than just allowing this fully would do, so hopefully that will mean less attention going into this.


There's a real concern that "erotic plasticity" (especially if you combine this with a recommendation algo) can push people towards weird-ass shit. Literally the same thing happens on YouTube, TikTok, etc. with regular videos, so why would this be an exception?


In my country it is a crime to possess imagery depicting underage sex acts. It does not matter if the actors are legal adults or not -- so, it probably does not matter if they are real either.


But is it a crime to possess an algorithm that can generate CP? You don't possess the images, it just generates them.

If it is a crime then it means any model which can generate CP under any circumstances will be illegal.


Once it generates them, at your request, you possess them.

Don't count on the legal system to go "haha you got me touché" on this.


If that's how it works, these AI models _really_ should refuse queries with "child" in them. Imagine you have one of these AIs installed: all you need is to press six keys to face a 20+ year criminal charge (of the type your ass is not spared). A bad actor with a few seconds in front of your screen could bring this hell upon you. Having something like this installed on a laptop is equivalent to leaving your unlocked gun on a table pointed at you in public.
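
A naive sketch of the kind of query filter I'm suggesting (the keyword list is illustrative; real systems use trained classifiers precisely because keyword matching like this is trivially bypassed):

    import re

    # Illustrative blocklist, not a complete or adequate one.
    BLOCKED = ["child", "kid", "minor", "underage"]

    def is_allowed(prompt: str) -> bool:
        # Reject prompts containing any blocked word as a whole word.
        # Synonyms and misspellings sail right through, hence "naive".
        p = prompt.lower()
        return not any(re.search(rf"\b{re.escape(w)}\b", p) for w in BLOCKED)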


I have naked pictures of my wife on my iPhone (consensually taken, over 18, etc).

Apple has AI software that searches photos given a text query (a Siri feature, I guess).

Searching for “baby” produces breast-centric photos of my naked wife posing in a sexy manner.

I mean, I get the “breast/baby” connection; it's not rocket science. But I’d appreciate it if SFW queries didn't produce NSFW results.


> A bad actor with a few seconds in front of your screen could take this hell upon you.

This has already been the case for quite some time. AI doesn't really add anything new to that part of the equation.


> Once it generates them, at your request, you possess them.

No you don't; it just generates it and displays it on the screen. It doesn't save it anywhere; at no point do you possess it. The only solution is that everyone who possesses a model capable of generating CP is seen as having CP.


> No you don't; it just generates it and displays it on the screen. It doesn't save it anywhere; at no point do you possess it.

"It's not my pot, officer! I was just holding it for someone!"

If you accidentally generate something inappropriate once with a poorly chosen prompt or an algorithm glitch, and it lives for a few seconds in your RAM before you close it, sure.

If a jury sees your generation history with a bunch of clearly "gimme that CSAM" queries, I wouldn't anticipate an acquittal.


Some countries forbid child porn even if no child was involved; it does not need to be real to be illegal. I assume this is so one can't claim material was fake when it wasn't. The lines get blurry when you generate porn with keywords like "18". Can you guarantee a legal training set of images? What if some actors look younger than they actually are? How will this affect the results? How do you prove the age of a person who doesn't exist, based on one image?


> I wonder if this could help pedophile people since generated porn wouldn't imply abuse of children.

In some jurisdictions even "virtual" child porn is still treated as criminal:

> A sufficient reason, on basis on which Canadian Parliament was warranted to prohibit not only production and distribution (which were not direct question in that case) but also purely private possession of child pornography - even, if it has nothing to do with use of actual children in its production - was, in the Canadian Supreme Court opinion "reasoned apprehension of harm" which may flow from exposing some people to such a materials.

* http://bartlomiejkozlowski.pl/virtualchildporn.htm


Help? If someone admits they think they might do something wrong, like sexual abuse, arson, or shooting someone, would it help to give them virtual content of rape, shooting games, or horror/torture?

Would GTA help criminals in their rehabilitation?

I am not an expert on what help they need, but it would not be content that fuels their obsession

For most criminals we give them prison. For people with disorders we give them mental treatment.

But possibly they can be treated with psychotherapy https://en.m.wikipedia.org/wiki/Sexual_fluidity


No, because "child porn" includes cartoons, which already don't imply abuse of children. As long as such content can decrease recidivism/prevent child abuse, which preliminary research suggests it does, it seems logical to permit it. Unfortunately, I'm not sure it will ever fly with the wider public; the subject matter is just too distasteful regardless of the evidence.


This is still child abuse material and probably still illegal where you live. The reason is obvious, it's a gateway to the abuse of real children.


Let's discuss the morality first.

There are a few reasons that consuming CSAM can be immoral:

- encouraging further production by adding view counts, ad revenue, and social encouragement for creators.

- compounding victimization by improving circulation.

- apparent gateway effect, either in the individual consuming it or by allowing a culture of tolerance.

It's the same logic for snuff films.

AI created CSAM basically removes the first two issues and leaves the third. But I'd add that the third is somewhat up for debate unless there's some specific evidence from psychology (although my default assumption is that you're right on this). And I'd add that there's some argument that it reduces the demand for real CSAM.


It's also the same logic as the moral panic against violence in video games, which seems like a poor look for the credibility of the "moral" argument.


> The reason is obvious, it's a gateway to the abuse of real children.

Please provide some evidence for this.


I may be mistaken, but weren't there some cases of people convicted for owning drawings of CP?


Really? That would be conceptually similar to AI-generated images. What would have been the rationale?


> That would be conceptually similar to AI-generated images.

I would expect those to be similarly illegal in the same jurisdiction.

> What would have been the rationale?

If I see a photo of ice cream, I tend to crave real ice cream.


> If I see a photo of ice cream, I tend to crave real ice cream

Would you say this works for other things? Esp. violence? Should all depictions of violence be banned? Or any undesirable behavior? The list may be long.

But more importantly, what you're describing is advertising. An image of an ice cream isn't an ice cream.

Porn works differently. It doesn't advertise for real sex. Porn conflates the representation and the thing itself into one; it delivers gratification directly, something an image of an ice cream doesn't do.


> Would you say this works for other things? Esp. violence?

If I find myself sexually aroused by violence.

> But more importantly, what you're describing is advertising.

Sure. We restrict or prohibit advertisements of certain things. Including child sexual abuse on that list is probably OK.

> Porn works differently. It doesn't advertise for real sex.

I'm not sure I agree. There are, for example, indications consumers of porn are more likely to request / engage in certain sex acts common in porn, like facials, anal sex, choking, etc. https://pubmed.ncbi.nlm.nih.gov/32081698/


> There are, for example, indications consumers of porn are more likely to request / engage in certain sex acts common in porn, like facials, anal sex, choking, etc. https://pubmed.ncbi.nlm.nih.gov/32081698/

The full paper is behind a paywall, but the abstract at least doesn't appear to make any conclusion w.r.t. causality. It's just as likely (if not more so) that people who are more likely to request / engage in certain sex acts will tend to prefer porn featuring them.


Photoshops of adult bodies with non-adult heads.


[flagged]


Do y'all really see no irony in being the first one to bring in corrective language while simultaneously assigning it to, and blaming, a pejorative SJW, I mean, woke, ideology? Not to mention the absolutely stupid premise in general that progressivism seeks to downplay pedophilia? Again, do you have any idea how it's perceived?

Also, since HN threads about this topic devolve into pedantry, I'm pretty sure the term to be used in these types of threads is CSAM. It's explicit, and long enough that it can be acronym-ized in a way that feels less icky than typing out "child sexual abuse material" each time.


This is sort of offtopic, but I wonder how the term CSAM appeared. It used to be just "child porn", and then fairly recently all of a sudden, over the course of at most a few months, everybody started to call it "CSAM" instead. Funny how this sort of thing happens, it makes me suspect some sort of agenda that isn't really stated out loud.


What does the 'M' stand for in CSAM?

I don't mind what they're called as long as it's palatable to wider society and they don't feel offended.


I know slandering "the woke" is de rigueur on HN, which is why you made this non-sequitur comment, but pedophiles renaming themselves MAPs has nothing whatsoever to do with "wokeness" or its terminology. Enjoy the endorphin hit, though, I guess.


I renamed it, don't worry



FTA:

> that is, until you reach her torso, where three arms spit out of her shoulders.

I haven't tried generating porn, but as a pianist I tried to get both DALL-E and stable diffusion to generate pictures of pianists with 3 arms and I've never had it produce a picture of a person with 3 arms. I assumed that's because there's no training material for it to base it on.


(NSFW images below)

I think what happens is that in the early stages of generation, the model picks up on arm-like structures and just keeps building it out. It doesn't realize that the overall structure will be flawed later on.

(3 arms)

https://pornpen.ai/view/B8DguHKatzDvRk7esEX9

(bonus, 3 breasts)

https://pornpen.ai/view/vZ2XbVHRNTEX5kWqOLJA


It's amazing how "normal" 3 breasts look! (Disclaimer: I'm a 59-year-old gay man and have little experience with human female breasts.)


4 arms, not 3. There’s a hint of one behind her left hip. (Her left, not image left.)


I’ve not tried to generate pictures of people with three arms but when I try to generate normal people it often gives them extra limbs.


4chan has been big on generating Toy Story porn lately and the results are just hilarious when not occasionally horrifying.


Pornpen is not porn, it merely generates nudity. Only prudes would call that porn.


Why do Americans think of a naked woman as “porn”?


Because that’s the normal context where naked women appear in mass media.


So you are saying that Americans’ puritanical view of the body is mainly due to seeing the world mainly through the media?


Daily life has no non-sexual nudity either.

What little non-sexual nudity is left (sharing showers at health clubs, for example) is becoming rarer and rarer.

And co-ed non-sexual nudity is virtually non-existent.

When I was in secondary school in the 90s, we were expected to shower together, but refused to without consequence.

We all signed up for gym class at the last hour so we could shower at home.


So after killing the work of digital illustrators, the next target is sex workers.

Next are actors and screenwriters (GPT-4); all creative industries will fall.

How much time before neural nets take over all UX/UI design, frontend development, and other programming-related tasks?

5 years? 10 years?


Could it be that the world will be a better place if everyone can express themselves fully in the media of digital art, acting, screenwriting, and eventually all creative industries, cheaply and easily? Artistic expression won't be gatekept by art school educations, Hollywood connections, and access to the tools of the trade. Every person will be able to bring their inner ideas to life with a flair of expertise. This is not a given, but it is certainly a possible optimistic outcome.


You still need training material.


I think we need new tools of media studies and critical analysis to deal meaningfully with these images. It's clear submitting them to existing theories is missing the mark, and leaving some meaningful insights on the table.


Really? Not a single example? OK VICE, time to one-up them.


That's awful. Where are these sites? I assume there are sites.


It's linked in the article, but here you go (NSFW): https://pornpen.ai/


Servers overloaded! Gotta be kidding me!


I'm working on it! In the meantime try the search feature, there are tons of pre-generated images to search through.

https://pornpen.ai/search



