  • Man, they could have made the letter something that would persuade people about the importance of ideas and how no nation is a monolith, but they just couldn’t help making it a blatant “Israel is right” letter.

    “We continue to be shocked and disappointed to see members of the literary community harass and ostracise their colleagues because they don’t share a one-sided narrative in response to the greatest massacre of Jews since the Holocaust.

    “Israel is fighting existential wars against Hamas and Hezbollah…”

    Someone here is obfuscating reality, and it’s not the boycotters. These people are insane.



  • Absolutely agree. My comment above focused on whether some minimal amount of CSEM would, by itself, make similar images show up from ordinary porn prompts. But there are a few mechanics that likely bias a model toward creating young-looking faces in porn, and with intentional prompt crafting I have no doubt you could at least get an approximation of it.

    I’m glad to hear about the models that intentionally separate adult content from images of children. That’s a good idea. There’s not much reason an adult-focused model needs to be mixed with other data; there’s already so much porn out there. Maybe you’d want to tune something unrelated to the nudity (like the background), or some mundane activity done naked, but neither of those needs kids in the training data.


  • I haven’t personally explored AI porn, but as someone with experience in machine learning and its accidental biases, that’s not very surprising to me.

    On top of the general societal bias towards youth for “beauty”-related roles, smoother, less-featured faces (which generally look younger) are closer to an average face, so defaulting to them gets a bit of a training boost (when in doubt, target the mean). It’s probably also not helped by youth-related porn keywords (teen, daughter, young) that further associate other porn prompts (even ones not about youth) with non-porn images of underage girls that also carry those keywords.
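
    A toy sketch of the “when in doubt, target the mean” point. This is entirely my own illustration: the “apparent age” numbers are made up, and real image generators don’t literally optimize a constant prediction; the sketch just shows why a squared-error-style objective pulls uncertain predictions toward the dataset average.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Pretend these are "apparent ages" of faces in a training set (made-up numbers).
    targets = rng.normal(loc=30.0, scale=8.0, size=10_000)

    # Sweep constant predictions and measure the mean-squared error of each.
    candidates = np.linspace(targets.min(), targets.max(), 500)
    mse = [np.mean((targets - c) ** 2) for c in candidates]
    best = candidates[int(np.argmin(mse))]

    print(f"dataset mean:            {targets.mean():.2f}")
    print(f"MSE-minimizing constant: {best:.2f}")  # matches the mean, up to grid resolution
    ```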


  • I assume any CSEM ingested into these models is absolutely swamped by the massive amount of adult porn that’s much more easily available. A handful of images isn’t going to drive model output at the scale of the image-generation datasets (rough numbers sketched below). Maybe there are keywords that could drill down to content more strongly associated with child porn, but a lot of “young”-type keywords are already plentifully applied to adults, and I imagine accidental CSEM ingests are much less likely to be conveniently labeled.

    So maybe you can figure out how to get it to produce child porn, but it probably won’t just randomly produce it for an innocent porn prompt.
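
    Rough numbers behind the swamping claim, as a minimal sketch. The 5.85 billion figure is LAION-5B’s published image-text pair count; the hundred-image “handful” is an arbitrary assumption for illustration.

    ```python
    # Back-of-envelope dilution of a handful of images in a web-scale dataset.
    dataset_size = 5.85e9  # LAION-5B's published image-text pair count
    bad_images = 100       # arbitrary "handful", assumed for illustration

    fraction = bad_images / dataset_size
    print(f"fraction of training data: {fraction:.1e}")  # ~1.7e-08

    # Even over a full pass of the data, that is roughly 100 gradient
    # contributions out of billions: far too sparse to steer output
    # for generic prompts without targeted prompting.
    ```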

  • He’s literally not doing what’s most popular. If he were, we’d be reducing and/or conditioning military aid, the position supported by a majority of the population. What are you even talking about?

    Five in 10 Americans (53%) support placing restrictions on US military aid to Israel so that it cannot use that aid toward military operations against Palestinians.

    Wild that people can act condescending while saying something patently wrong, right after being shown the exact data that proves them wrong.