• lime!@feddit.nu · 2 days ago

    that’s more of a comment on the usage than on the technology itself.

    remember that google deepdream thing that would hallucinate dogs everywhere? it’s the same tech.

      • Glitterbomb@lemmy.world · 23 hours ago

        If that’s the case, then we anthropomorphize technology all the time. Like, constantly. How many times has your phone died when it’s not even alive? How does a phone drop a connection without hands? We feed a computer input and it regurgitates or spits out output, all without a mouth. The examples are endless but hard to immediately pick out, because the usage has become completely commonplace.

        Even “byte” was originally coined as a play on words with ‘bite-sized’, to refer to a small collection of bits. I don’t necessarily defend these ‘AI’ tools, but policing the language people use ain’t it. Extending the word ‘hallucinate’ to cover a behaviour of technology is exactly how language has always worked.

      • Swedneck@discuss.tchncs.de · 1 day ago

        that’s literally how it works though: the software is trained to remove noise from images, and then you feed it pure noise and tell it there’s an image behind it. if that’s not hallucination, idk what would be.
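
        (a rough sketch of that sampling loop, for illustration only; the denoiser and the update rule here are simplified stand-ins, not any real library’s api:)

            import torch

            def sample(denoiser, steps=50, shape=(1, 3, 64, 64)):
                x = torch.randn(shape)           # start from pure gaussian noise
                for t in reversed(range(steps)):
                    eps = denoiser(x, t)         # the model only ever predicts the "noise" in x
                    x = x - (1.0 / steps) * eps  # strip a little of it away each step
                return x                         # an image conjured entirely out of noise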

      • lime!@feddit.nu · 2 days ago

        so is calling it fabrication. something incapable of knowing what is true cannot lie.

        also, gpts and image generators are fundamentally different technologies sharing very little code beyond the basic matrix manipulation stuff, so the definition of truth needs to be very different.
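
        (for contrast with the denoising sketch above: a gpt-style model is an autoregressive next-token sampler. again a simplified, made-up sketch, not real library code:)

            import torch

            def generate(lm, prompt_ids, max_new=50):
                ids = list(prompt_ids)
                for _ in range(max_new):
                    logits = lm(torch.tensor([ids]))[0, -1]         # score every candidate next token
                    probs = torch.softmax(logits, dim=-1)
                    ids.append(torch.multinomial(probs, 1).item())  # sample one and append it
                return ids  # “truth” never enters the loop, only statistical plausibility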