• jwmgregory@lemmy.dbzer0.com
    11 hours ago

    why do i have a feeling if i asked you to tell me what hallucinations are in a technical sense i would get a regurgitated answer from google?

    being blind to the obvious doesn’t help anyone, man. anyone who has genuinely worked on or even just with these tools knows that they are capable of producing quality outputs. sometimes they mess up, sure, but they can also work 1000000x faster than you can. the energy problem in turn is a valid discussion, but this is just being oblivious to the obvious.

    why do you guys all mistake the climate of early tech adoption as an indicator of the technology itself being bad? were you not alive for the rise of the internet or something? i think you guys all just hate corporatism, not AI, but for some reason can’t take the logical step to that conclusion.

    • pyre@lemmy.world
      10 hours ago

      I don’t know why you have that feeling, because you definitely wouldn’t get a regurgitated answer from google, since I don’t give a shit what it is in a technical sense. guess what: if I buy a phone that might catch fire every once in a while, I don’t need to know how or why it does that in a technical sense to confidently say that it is shit and not worth my money or time.

      “sometimes they mess up” is not good enough, and no, the output is not “quality”.

      i was alive for the rise of the internet, and the analogy doesn’t work. llms are fundamentally useless for 90% of what they’re currently being used for, which is mostly generic assistance. assistance needs knowledge and actual skills, not a glorified autocomplete for everything.