• MotoAsh@lemmy.world
    8 months ago

    You are imagining a supercomputer’s LLM running an NPC.

    It literally cannot be that fancy. Maybe they can fake it and fool a few rubes, but no, there will be no deep characters run by this.

    • PonyOfWar
      8 months ago

      The way it works right now is usually over the cloud. I’ve already tried out a bit of “Convai” as a developer, a platform where you can create LLM NPCs and put them in Unreal Engine. It’s pretty neat, not perfect, but you can definitely give characters thousands of lines of backstory if you want and they will act in character. They will also remember any conversations a player had with them previously and can refer to them in later convos. It can still be fairly obvious that you’re talking to an LLM, though, if you know what to ask and what to look for. Due to its cloud-based nature, there is also some delay between the player input and the response. But it has a lot of potential for dialog systems where you can do way more than just choose between 4 predefined sentences, especially once running these things locally is no longer a performance issue.
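      The pattern described above — a fixed backstory plus remembered conversations, sent to a cloud model on every turn — can be sketched roughly like this. This is a generic illustration, not Convai’s actual SDK; the `NPC` class, the message format, and the `llm` callable are all assumptions:

```python
# Minimal sketch of an LLM-driven NPC with persistent conversation memory.
# Assumes a generic chat-completion-style API; Convai's real interface differs.
from dataclasses import dataclass, field

@dataclass
class NPC:
    name: str
    backstory: str                               # the "thousands of lines of backstory"
    memory: list = field(default_factory=list)   # past exchanges, kept between talks

    def build_prompt(self, player_input: str) -> list:
        """Compose the full message list sent to the cloud model each turn."""
        messages = [{"role": "system",
                     "content": f"You are {self.name}. Stay in character.\n"
                                f"Backstory:\n{self.backstory}"}]
        messages += self.memory                  # earlier convos the NPC can refer to
        messages.append({"role": "user", "content": player_input})
        return messages

    def respond(self, player_input: str, llm) -> str:
        """llm is any callable taking a message list and returning a reply string."""
        reply = llm(self.build_prompt(player_input))   # network round-trip = the delay
        # Persist the exchange so the NPC remembers it in later conversations.
        self.memory.append({"role": "user", "content": player_input})
        self.memory.append({"role": "assistant", "content": reply})
        return reply
```

      Because the whole memory is resent every turn, latency and cost grow with conversation length — one reason local inference would help.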

    • owen@lemmy.ca
      8 months ago

      I think you could make it work by giving each NPC a limited word pool, plus pre-set phrases to fall back on for panic/confusion.
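      A rough sketch of that idea: check the model’s raw output against the NPC’s allowed word pool, and if anything falls outside it, cover with a canned panic/confusion line. The function name, phrase list, and tokenization are all made up for illustration:

```python
# Sketch: constrain an NPC to a limited word pool, with pre-set fallback phrases.
import re

# Canned "panic/confusion" lines used when the model drifts off-vocabulary.
PANIC_PHRASES = [
    "I... I don't know what you mean.",
    "Sorry, I must get back to work.",
]

def constrain_reply(raw_reply: str, word_pool: set, fallback=PANIC_PHRASES) -> str:
    """Return raw_reply only if every word is in the NPC's pool; else a preset line."""
    words = re.findall(r"[a-z']+", raw_reply.lower())
    if words and all(w in word_pool for w in words):
        return raw_reply
    # Out-of-pool output: pick a fallback deterministically by reply length.
    return fallback[len(raw_reply) % len(fallback)]
```

      This keeps characters from suddenly discussing quantum physics while still letting the model phrase things freely within their vocabulary.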