Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

    • Flying Squid (mod) · 25 points · 8 days ago

      I agree, but if there were some way to create CSAM without using real children (I’m not sure how you would train such an AI model), it would probably be worth seeing if that did anything to make pedophiles less likely to act out on their desires.

      Because my god, we need to figure out something.

      • @[email protected] · 20 points · 8 days ago

        I mean trying to help them get treatment instead of going all pod-people on anyone showing even the possibility of being attracted to kids would be helpful.

        • Flying Squid (mod) · 22 points · 8 days ago

          I’ve been saying that for ages. Obviously we don’t want to enable any pedophiles to do anything horrific to children, but we’re at a point right now where, if you have those urges to begin with, you’re basically already told to accept that you’re an incurable monster. So why not act on the urges?

          Somehow we need to get through to such people that they need to get help before they do anything terrible. I’m not sure how to do that in the current climate though.

          • @[email protected] · -1 point · 8 days ago

            “it would probably be worth seeing if that did anything to make pedophiles less likely to act out on their desires.”

            What’s the implication here? You’re saying we should look into placating child predators by creating AI CP for them to consume?

            • Flying Squid (mod) · 4 points · 8 days ago

              That would be worth a scientific study, don’t you think? Isn’t it worth trying to find ways to stop child predators before they become predators?

              You seem to think I’m suggesting that the UK government create childporn.gov.uk or something.

      • @[email protected] · 5 points · 8 days ago

        Train it to depict humans that look like anime characters: “definitely 18 or older” immortal dragons taking on the bodies of young human beings.

        Disclaimer

        I am not condoning, endorsing, or suggesting this

      • JohnEdwa · 7 points · edited · 8 days ago

        The way AI models work, you don’t have to train one on the thing you want it to do; you can ask it to combine the things it already knows about. Take any of the meme LoRAs, for example, like pepe punch or patcha.

        So literally any model that can generate pictures of naked adults and clothed children - which is to say almost all of them - is going to be at least somewhat competent at creating CP unless those prompts are being actively censored and blocked (a rough sketch of that kind of prompt blocking follows this thread).

        • @[email protected] · 2 points · 7 days ago

          Wouldn’t that generate images of children with small-sized adult bodies?

          If it doesn’t know what a child’s body looks like, it can’t just figure it out.

          • JohnEdwa · 1 point · 7 days ago

            The datasets will have enough images of kids in bikinis and underwear from stock photos, clothes shop listings, etc. to figure that part out rather easily.
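
      (A minimal sketch, in Python, of the prompt-level “censored and blocked” filtering JohnEdwa mentions above. The term lists and function names here are assumptions made for illustration, not any real service’s filter; production systems typically pair keyword rules like these with learned text and image classifiers.)

      import re

      # Illustrative term lists, assumed for this sketch; real blocklists
      # are far larger and multilingual.
      NSFW_TERMS = {"naked", "nude", "explicit"}
      MINOR_TERMS = {"child", "kid", "minor", "underage"}

      def words_in(prompt: str) -> set[str]:
          # Lowercase the prompt and split it into alphabetic words.
          return set(re.findall(r"[a-z]+", prompt.lower()))

      def is_blocked(prompt: str) -> bool:
          # Reject any prompt that combines a sexual term with a
          # minor-related term - the combination discussed above.
          words = words_in(prompt)
          return bool(words & NSFW_TERMS) and bool(words & MINOR_TERMS)

      # The combination is blocked; each half alone is not.
      assert is_blocked("a naked child on a beach")
      assert not is_blocked("a naked adult portrait")
      assert not is_blocked("a child on a beach")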

      • Jake Farm · 7 points · 8 days ago

        It’s a form of stalking, and it probably makes it more likely for them to rape that child. Even if they don’t wind up doing that, it would still qualify as a form of revenge porn.

          • @[email protected] · 4 points · edited · 8 days ago

            It is when they are commissioning these “works”.

            Edit: To be clear, that’s what happened here.

            • @[email protected] · 3 points · 8 days ago

              Commissioning as in buying? I’m not sure how that changes it to stalking.

              IMO, the worst part about it is that there’s someone else out there who thinks less of me because there’s some naked imagery of me.

              • @[email protected] · 3 points · 8 days ago

                People will always find ways to think less of you.

                For example, I think less of you because your comments support pedophilia.

                  • @[email protected] · 2 points · edited · 7 days ago

                    THEY AREN’T KEEPING IT TO THEMSELVES.

                    Holy shit, how are you defending this behavior still?

                    They find children they want, take pictures of them, and send them to this “CSAM AI artist”, for lack of a better term, in order to have CSAM of the specific child they are interested in.

                    If you don’t see that as dangerous, especially as the CSAM creator is encouraging these people to act on those specific children, well… Let me know so I can just block you and be done.

                    What the actual fuck.

              • @[email protected] · 2 points · 8 days ago

                Commissioning as in a buyer has an interest in a particular child. They ask the guy using AI to make a custom bit of CSAM, so the buyer can have CSAM of that specific child.

                That kind of commissioning.

                • @[email protected] · 0 points · 8 days ago

                  Okay, but if I ask someone to draw me a picture of Nicolas Cage naked, is that stalking him? What if I have Nick Cage pictures all over my walls and even ceiling and my phone wallpaper? Is that stalking? Does it help if I’m really horny for him? And I touch myself?

                  • @[email protected] · 2 points · edited · 8 days ago

                    We aren’t talking about a famous person.

                    We are talking about someone taking pictures of kids they know to have someone else turn it into CSAM.

                    The comparison you are trying to make is completely irrelevant. The fact that you think it’s an apt comparison makes it even worse.