• KairuByte@lemmy.dbzer0.com · 9 months ago

    The only reason this would be an issue is if it’s sending that data off to a third party. If it’s fully local, who cares what data it sees?

    • DarkThoughts@kbin.social · 9 months ago

      If it ran locally it would be basically useless: most machines lack the computing power, and a search-engine chatbot would lack an index, so I doubt it. It would also have to be polished enough not to require further user knowledge or input, and that’s just not a thing with any local LLM I’ve come across. Mozilla can gladly prove me wrong, though. I certainly wouldn’t mind if they made the whole process of running local LLMs easier and more viable.

      • Pennomi@lemmy.world · 9 months ago

        The requirements to run good local LLMs have really been shrinking this past year… I have a lot of faith that there is a generally useful yet tiny AI tool within the grasp of Mozilla.
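        One concrete driver of those shrinking requirements is weight quantization. A back-of-envelope sketch (the 7B parameter count is just an illustrative figure, not tied to any particular model):

```python
# Rough memory footprint of an LLM's weights at different precisions.
# Weights only -- ignores KV cache, activations, and runtime overhead.

def weight_memory_gib(n_params: float, bits_per_weight: int) -> float:
    """GiB needed to hold n_params weights at the given precision."""
    return n_params * bits_per_weight / 8 / 1024**3

params = 7e9  # hypothetical 7B-parameter model

fp16 = weight_memory_gib(params, 16)  # ~13.0 GiB
q4 = weight_memory_gib(params, 4)     # ~3.3 GiB

print(f"fp16: {fp16:.1f} GiB, 4-bit: {q4:.1f} GiB")
```

        Dropping from 16-bit to 4-bit weights cuts the footprint roughly fourfold, which is a big part of why models that once needed a datacenter GPU now fit in the RAM of ordinary consumer hardware.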

      • KairuByte@lemmy.dbzer0.com · 9 months ago · edited

        I can understand your thinking, but it could be as simple as giving the user the option to outsource the computation to a secure something or other, if their machine can’t handle it.

        And yeah, the requirements are still quite high, but they are being reduced somewhat steadily, so I wouldn’t be surprised if average hardware could manage it in the long term.

        Edit: For the record, Mozilla is one of the only companies I would trust if they said “the secure something or other is actually secure.” And they’d likely show actual proof and provide an explanation as to how.