• JackGreenEarth
    7 points · 3 months ago

    Yes, but what LLM has a large enough context length for a whole book?

    • @ninjan@lemmy.mildgrim.com
      8 points · 3 months ago

      Gemini Ultra will, in developer mode, have a 1 million token context length, so that would fit at least a medium-sized book. No word yet on what it will support in production mode, though.
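
      For a rough sense of scale, here is a sketch of counting a book's tokens. It assumes OpenAI's open-source tiktoken tokenizer as a stand-in (Gemini uses its own tokenizer, so the count will differ somewhat) and a hypothetical local file book.txt:

      ```python
      # Count how many tokens a book occupies, using tiktoken's cl100k_base
      # encoding as an approximation of whatever tokenizer the model uses.
      import tiktoken

      enc = tiktoken.get_encoding("cl100k_base")

      # Hypothetical path: the full text of the book as plain UTF-8.
      with open("book.txt", encoding="utf-8") as f:
          text = f.read()

      tokens = enc.encode(text)
      print(f"{len(tokens):,} tokens")

      # A ~90,000-word novel usually lands around 120k tokens, which would
      # fit comfortably inside a 1 million token context window.
      ```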

      • JackGreenEarth
        3 points · 3 months ago

        Cool! Are there any other models, even FOSS ones, with a context length longer than 4096 or 8192?