For me, a truck with a cab longer than its bed is not a truck, but an SUV with an overgrown bumper.

  • tool
    1 year ago

    They are AI though. They’re just not Artificial General Intelligence.

    • @Kaldo@beehaw.org
      1 year ago

      My definition of AI comes from books and media: unless it exhibits actual intelligence, it is not AI. Building sensible sentences from large amounts of data, while not understanding what it is actually saying or whether it is correct or consistent, does not make an intelligence.

        • @Kaldo@beehaw.org
          1 year ago

          Nope, it’s only matching the prompt with the most likely answer from its training set. Do you remember, in the early days, when it was asked slightly tweaked riddles and got them wrong? It would just spew out something that sounded like the original answer but was completely wrong in the current context. Or how it made up nonexistent court cases for that lawyer who tried to use it without actually checking whether it was correct?

          LLMs are just guessing the answer based on millions of similar answers they have been trained on. It’s a language syntax generator; it has no clue what it is actually saying. They are extremely advanced and getting better at hiding their flaws, but at their core they are not actual intelligence.
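
          To make the “most likely continuation” idea concrete, here’s a toy sketch in Python. It’s just a bigram word counter over a made-up corpus, nothing remotely like a real transformer, but it shows how text can be generated purely from statistics with no understanding of meaning:

          ```python
          # Toy sketch of "pick the most likely next word" generation.
          # The corpus and everything else here are made up for illustration;
          # real LLMs use neural networks over tokens, not bigram counts.
          from collections import Counter, defaultdict

          corpus = "the cat sat on the mat the cat ate the fish".split()

          # Count which word tends to follow which in the "training data".
          next_word_counts = defaultdict(Counter)
          for current, nxt in zip(corpus, corpus[1:]):
              next_word_counts[current][nxt] += 1

          def continue_text(prompt_word, length=5):
              """Repeatedly pick the statistically most likely next word."""
              output = [prompt_word]
              for _ in range(length):
                  followers = next_word_counts.get(output[-1])
                  if not followers:
                      break  # nothing in the training data follows this word
                  output.append(followers.most_common(1)[0][0])
              return output

          print(" ".join(continue_text("the")))
          # e.g. "the cat sat on the cat" - fluent-looking, but no comprehension
          ```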