• coolin@beehaw.org · 1 year ago

    Current LLMs are manifestly different from Cortana (🤢) because they are actually somewhat intelligent. Microsoft’s Copilot can do web search and perform basic tasks on the computer, and because of their exclusive contract with OpenAI they’re going to have access to more advanced versions of GPT capable of higher-level control and automation on the desktop. It will 100% be useful for users to have this available, and I expect even Linux desktops will eventually add local LLM support (once consumer compute and the tech mature). It is not just glorified autocomplete; its outputs are actually fairly well correlated with real human language cognition.

    The main issue for me is that they take all the data you input and mine it to build better models without your explicit consent. This isn’t an area where open source can catch up without significant capital behind it, so we have to hope Meta, Mistral, and government-funded projects give us a real competitor.

    • SavvyWolf · 1 year ago

      Sure, all that may be true, but it doesn’t answer my original concern: is this something people want as a core feature of their OS? My point wasn’t “oh, this is only as technically sophisticated as voice assistants”; it was more “voice assistants never really took off as much as people thought they would”. I may be cynical and grumpy, but to me it feels like these companies are failing to read the market.

      I’m reminded of a presentation I saw showing off fancy AI technology. Basically, if you were in a 1-to-1 call with someone and had to step away to answer the doorbell or something, the other person could keep speaking, and when you got back an AI would summarise what they had said.

      It felt so out of touch with what people would actually want to do in that situation.

      • coolin@beehaw.org · 1 year ago

        I suppose, having worked with LLMs a whole bunch over the past year, I have a better sense of what I meant by “automate high-level tasks”.

        I’m talking about an assistant where, say, you need to edit a podcast video to add graphics and cut out dead space or mistakes you corrected in the recording. You could tell the assistant to do that, and it would open the video in Adobe Premiere Pro, do the necessary work, then ask you to review it and check whether it made mistakes.

        Or if you had an issue with a particular device, e.g. your display, the assistant would research the problem and perform the necessary steps to troubleshoot and fix it.

        These are hypothetical scenarios for now, but GPT-4 can already perform some of these tasks, and specifically training a model to be a desktop assistant and handle more agentic tasks could make this a reality within a few years.

        It’s also already useful for reading and editing long documents, and it will only get better on that front. You can already use an LLM to query your documents for summaries, or feed them in as instructions or research to aid in performing a task.
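        The “query your documents” workflow is usually retrieval-augmented generation: split the documents into chunks, rank the chunks against the question, and paste the best ones into the model’s prompt. A toy sketch of just the retrieval step, using naive word-overlap scoring as a stand-in for embedding similarity (all names and example text here are illustrative):

```python
def score(chunk: str, query: str) -> int:
    """Count query words that appear in the chunk (toy stand-in for embedding similarity)."""
    chunk_words = set(chunk.lower().replace(".", "").split())
    return sum(1 for w in query.lower().split() if w in chunk_words)

def top_chunks(chunks: list[str], query: str, k: int = 2) -> list[str]:
    """Return the k chunks most relevant to the query."""
    return sorted(chunks, key=lambda c: score(c, query), reverse=True)[:k]

docs = [
    "The display flickers when the refresh rate is set above 60 Hz.",
    "Podcast episodes are recorded every Tuesday.",
    "Setting the refresh rate back to 60 Hz fixed the flicker.",
]

# Build a prompt containing only the relevant chunks.
context = top_chunks(docs, "why does my display flicker")
prompt = "Answer using this context:\n" + "\n".join(context) + "\nQ: why does my display flicker"
```

        A real assistant would swap the overlap score for embeddings and send `prompt` to the model, but the shape of the pipeline is the same.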

        • fine_sandy_bottom@discuss.tchncs.de · 1 year ago

          I guess my understanding of an LLM must be way off base.

          I had thought that asking an LLM to edit a video was simply out of scope. Like asking your self-driving car to wash the dishes.

    • chicken@lemmy.dbzer0.com · 1 year ago (edited)

      A year ago local LLMs were just not there, but the models you can run now with 8 GB of VRAM are pretty amazing, even if not quite as good as GPT-4. Honestly, even if progress stopped right where it is, it’s still powerful enough to be a foundation for a more accessible and efficient way to interface with computers.
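      For scale, the reason 8 GB of VRAM is suddenly enough is quantization. The back-of-the-envelope arithmetic for the weights alone (ignoring the KV cache and runtime overhead, which add more on top) is just parameters × bits per weight:

```python
def weight_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate size of the model weights in GiB: params * (bits / 8) bytes each."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# A 7B-parameter model in fp16 vs. 4-bit quantization:
fp16 = weight_gib(7, 16)  # ~13 GiB: too big for an 8 GB card
q4 = weight_gib(7, 4)     # ~3.3 GiB: fits, with room left for context
```

      This is a rough sketch, not an exact memory model, but it shows why 4-bit 7B models run comfortably on consumer GPUs while full-precision ones don’t.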