• @snek_boi@lemmy.ml
    9
    edit-2
    1 month ago

    I can’t see how AI can’t be done in a privacy-respecting way [edit: note the double negative there]. The problem that worries me is performance. I have used text-to-speech AI and it absolutely destroys my poor processors. I really hope there’s an efficient way of adding alt text, or of turning the feature off for users who don’t need it.

    • @MangoPenguin@lemmy.blahaj.zone
      21
      1 month ago

      If it runs locally then no data ever leaves your system, so privacy would be respected. There are tons of good local-only LLMs out there right now.
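
      For example, something like this (just a rough sketch using the Hugging Face transformers library with a BLIP captioning model as a stand-in; the model name, image path, and caption length are only illustrative) could generate alt text entirely on your own machine:

          # Rough sketch: fully local alt-text generation, nothing leaves the machine.
          # Assumes `pip install transformers torch pillow`; model choice is illustrative.
          from transformers import BlipProcessor, BlipForConditionalGeneration
          from PIL import Image

          processor = BlipProcessor.from_pretrained("Salesforce/blip-image-captioning-base")
          model = BlipForConditionalGeneration.from_pretrained("Salesforce/blip-image-captioning-base")

          image = Image.open("photo.jpg").convert("RGB")  # placeholder path
          inputs = processor(images=image, return_tensors="pt")
          out = model.generate(**inputs, max_new_tokens=30)  # short caption to use as alt text
          print(processor.decode(out[0], skip_special_tokens=True))

      The first run downloads the model weights, but after that everything runs offline on the local CPU or GPU.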

      As far as performance goes, current x86 CPUs are pretty bad at local inference, but newer ARM chips, and likely future Intel/AMD parts, will be much better at running this stuff.