• stevedidwhat_infosec@infosec.pub
    10 months ago

    You’re purposefully downplaying and oversimplifying what AI models do. I’m not going to continue arguing with someone who can’t debate fairly.

    Learning models don’t fucking collage shit. That’s not how the tech works.

    I’m not going to debate this shit with someone whose bad-faith argumentation is as blatant as yours, goodbye.

    Anyone else who wants to actually discuss or learn more about the tech in a civil way, lmk.

    • AVincentInSpace
      10 months ago

      I know perfectly well how the tech works. It’s given a bunch of images, starts from randomly generated weights, and then uses a hill-climbing algorithm to adjust those weights until its outputs approximate the training data as closely as possible. Every output of a generative neural network is a combination of random noise and a pattern of pixels that appeared in its training data (possibly spread across several input images, but that appeared nonetheless). You cannot get something out that did not, at some point, go in. Legally speaking, that makes them a collage tool.
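
      (To make the hill-climbing picture concrete, here’s a toy sketch of that idea, not an actual image model: randomly initialized weights get nudged at random, and a nudge is kept only if it brings the output closer to a tiny made-up piece of “training data.” The 4-pixel image and all names here are invented for illustration.)

      ```python
      import random

      # Toy "training data": a tiny 4-pixel grayscale image to approximate.
      training_image = [0.1, 0.8, 0.5, 0.3]

      # Randomly initialized "weights" -- here they directly parameterize the output pixels.
      weights = [random.random() for _ in training_image]

      def loss(w):
          # Mean squared error between the current output and the training data.
          return sum((wi - ti) ** 2 for wi, ti in zip(w, training_image)) / len(w)

      # Hill climbing: perturb the weights at random, keep the change only if loss drops.
      for step in range(10_000):
          candidate = [wi + random.gauss(0, 0.05) for wi in weights]
          if loss(candidate) < loss(weights):
              weights = candidate

      print("target: ", training_image)
      print("learned:", [round(w, 2) for w in weights])
      ```

      Run it and the “learned” values converge toward the target pixels; real generative models use gradient descent over far more parameters, but the climb-toward-the-training-data loop is the part being argued about here.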

      I ask again: do you have an argument or are you going to continue to make appeals to ignorance against mine?