• SavvyWolf
    link
    English
    120
    4 months ago

    Do people actually want this?

    Like, I know the megacorps that control our lives do (since it’s a cheap way of adding value to their products), but what about actual users? I think many see it as a novelty and a toy rather than a productivity tool, especially as public awareness of “hallucinations” and the plight faced by artists rises.

    Kinda feels like the whole “voice controlled assistants” bubble from a while ago. Sure, they’re relatively commonplace nowadays, but nowhere near as universal as people thought they would be.

      • EvilMonkeySlayer
        link
        fedilink
        25
        4 months ago

        I think it’s those stupid hard-coded buttons on my remote that I accidentally press every so often, and then have to repeatedly back/exit out of the stupid thing they launched, which I cannot remove/uninstall from my TV.

        • @nyan@lemmy.cafe
          link
          fedilink
          English
          5
          edit-2
          4 months ago

          If you can figure out how to get the remote open, you’ll probably find that the buttons are all part of the same flexible rubbery insert (unless it’s 10+ years old). Put a little tape on the bottoms of the ones causing you problems. The insulation should keep them from working, and it’s 100% reversible if you ever do find a use for them.

          If it’s one of the older, more expensive remotes with individual switches, then, yeah, pliers and superglue. 😅

        • Kaspar Houser
          link
          fedilink
          1
          4 months ago

          And then it has to load a hastily scribbled, overloaded UI that takes forever and shows no content, because you don’t have an account and/or aren’t connected to wifi.

    • Awhiskeydrunker
      link
      fedilink
      19
      4 months ago

      Maybe I’m a pessimist but this is going to really resonate with the people who are “looking forward to AI” because they read headlines, but haven’t actually used any LLMs yet because nobody has told them how.

    • Uranium3006
      link
      fedilink
      15
      4 months ago

      I want a voice controlled assistant that runs locally, is fully FOSS, and that I can just run on my bog-standard Linux PC, hardware minimum requirements notwithstanding.
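
      The "glue" layer such a local assistant needs can be sketched in a few lines: a local speech-to-text model (whisper.cpp, say) would sit in front of this, and a local LLM behind it. The intents and phrases below are made up for illustration, not from any real assistant.

```python
# Hypothetical intent router for a local voice assistant: map a
# transcribed utterance to a local action, falling back to the LLM
# for anything unrecognized. Trigger phrases here are invented.

INTENTS = {
    "set a timer": "timer",
    "play music": "media",
    "what time is it": "clock",
}

def route(transcript: str) -> str:
    """Return the intent whose trigger phrase appears in the transcript."""
    text = transcript.lower()
    for phrase, intent in INTENTS.items():
        if phrase in text:
            return intent
    return "fallback"  # hand unknown requests to the local LLM
```

      Everything in a design like this stays on the machine, which is the whole point of the wish above.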

    • @PixxlMan@lemmy.world
      link
      fedilink
      14
      4 months ago

      Not a single soul wants this. They just want to use every foul trick to get you to use Copilot (by accident, even), just like they do with Bing and their other garbage.

    • @coolin@beehaw.org
      link
      fedilink
      5
      4 months ago

      Current LLMs are manifestly different from Cortana (🤢) because they are actually somewhat intelligent. Microsoft’s Copilot can do web search and perform basic tasks on the computer, and because of their exclusive contract with OpenAI they’re gonna have access to more advanced versions of GPT which will be able to do more high-level control and automation on the desktop. It will 100% be useful for users to have this available, and I expect even Linux desktops will eventually add local LLM support (once consumer compute and the tech matures). It is not just glorified autocomplete; it is actually fairly correlated with outputs of real human language cognition.

      The main issue for me is that they get all the data you input and mine it for better models without your explicit consent. This isn’t an area where open source can catch up without significant capital in favor of it, so we have to hope Meta, Mistral and government funded projects give us what we need to have a competitor.

      • SavvyWolf
        link
        English
        8
        4 months ago

        Sure, all that may be true, but it doesn’t answer my original concern: Is this something that people want as a core feature of their OS? My comment wasn’t that “oh, this is only as technically sophisticated as voice assistants”; it was more that voice assistants never really took off as much as people thought they would. I may be cynical and grumpy, but to me it feels like these companies are failing to read the market.

        I’m reminded of a presentation I saw where they were showing off fancy AI technology. Basically, if you were in a one-to-one call with someone and had to leave to answer the doorbell or something, the other person could keep speaking and an AI would summarise what they had said when you got back.

        It felt so out of touch with what people would actually want to do in that situation.

        • @knightly
          link
          3
          4 months ago

          I hope the LLM bubble pops this year. The degree of overinvestment by megacorps is staggering.

        • @coolin@beehaw.org
          link
          fedilink
          1
          4 months ago

          I suppose having worked with LLMs a whole bunch over the past year I have a better sense of what I meant by “automate high level tasks”.

          I’m talking about an assistant where, let’s say, you need to edit a podcast video to add graphics and cut out dead space or mistakes that you corrected in the recording. You could tell the assistant to do that, and it would open the video in Adobe Premiere Pro, do the necessary tasks, then ask you to review the result to check if it made mistakes.

          Or if you had a problem with a particular device, e.g. your display, the assistant would research it and perform the necessary steps to troubleshoot and fix the issue.

          These are currently hypothetical scenarios, but the current GPT-4 can already perform some of these tasks, and specifically training it to be a desktop assistant and to do more agentic tasks will make this a reality in a few years.
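
          The control flow behind such "agentic" tasks can be sketched as a loop: the model proposes a tool call, the host runs it, and the result is fed back until the model says it is done. This is a minimal illustrative skeleton, not any vendor's actual API; all the names are made up.

```python
# Hypothetical agent loop: `model` is any callable that, given the
# history so far, returns an action dict like
#   {"tool": "cut_silence", "args": {...}}  or  {"tool": "done"}.
# `tools` maps tool names to the host-side functions that do the work.

def run_agent(model, tools, task, max_steps=5):
    history = [task]
    for _ in range(max_steps):
        action = model(history)
        if action["tool"] == "done":
            break
        result = tools[action["tool"]](**action["args"])
        history.append(result)  # feed the tool result back to the model
    return history
```

          The hard part in practice is not this loop but making the model reliably emit valid tool calls and recover from its own mistakes, which is why the review step mentioned above matters.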

          It’s also already useful for reading and editing long documents, and will only get better on that front. You can already use an LLM to query your documents and give you summaries, or use them as instructions/research to aid in performing a task.
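
          The retrieval step behind "query your documents" is conceptually simple: split the document into chunks, score each chunk against the question, and hand the best ones to the LLM as context. Real systems score with embeddings; this sketch uses plain word overlap just to make the idea visible.

```python
# Illustrative retrieval sketch (keyword overlap, not embeddings).

def chunk(text: str, size: int = 40) -> list[str]:
    """Split text into chunks of roughly `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def top_chunks(question: str, document: str, k: int = 2) -> list[str]:
    """Return the k chunks sharing the most words with the question."""
    q = set(question.lower().split())
    scored = sorted(chunk(document),
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:k]
```

          The selected chunks would then be pasted into the LLM prompt ahead of the question, which is how summaries of documents far larger than the context window become possible.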

          • I guess my understanding of an LLM must be way off base.

            I had thought that asking an LLM to edit a video was simply out of scope. Like asking your self driving car to wash the dishes.

      • @chicken@lemmy.dbzer0.com
        link
        fedilink
        2
        edit-2
        4 months ago

        A year ago local LLMs were just not there, but the stuff you can run now with 8 GB of VRAM is pretty amazing, if not quite as good yet as GPT-4. Honestly, even if it stops right where it is, it’s still powerful enough to be a foundation for a more accessible and efficient way to interface with computers.
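
        A back-of-the-envelope check shows why 8 GB of VRAM became viable: a model's weight footprint is roughly parameter count × bits per weight / 8, plus headroom for the KV cache and activations. The numbers below are illustrative arithmetic, not benchmarks of any particular model.

```python
# Rough weight-memory estimate for a quantized model.

def weight_gb(params_billion: float, bits: int) -> float:
    """Approximate weight footprint in GB (ignores KV cache/activations)."""
    return params_billion * 1e9 * bits / 8 / 1e9

# A 7B-parameter model at 4-bit quantization needs about
# weight_gb(7, 4) = 3.5 GB of weights, leaving room in 8 GB for
# context; the same model at fp16 would need weight_gb(7, 16) = 14 GB
# and would not fit.
```

        That gap between 3.5 GB and 14 GB is essentially what quantization bought local inference over the past year.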