Couldn’t make this shit up if I tried.

  • brucethemoose@lemmy.world · 4 hours ago

    In case anyone missed it, DeepSeek just released models that make OpenAI’s best nearly irrelevant… in the open, for anyone to host. For a tiny fraction of the hosting cost.

    Even the small distillation that fits on a 24GB VRAM desktop is incredible. And you can host it for others to use for free, with room for batching, like I’m doing right now. And there is so much that’s awesome about it, like the SFT training pipeline/code being published and the smaller models being built on top of models from another company (Qwen 2.5).
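    For anyone curious what hosting with batching looks like in practice, here’s a minimal sketch using vLLM. The 7B distill ID is the real Hugging Face repo and is used because it fits almost any modern card; the sampling settings and prompts are illustrative placeholders, and a 24GB card would run a quantized 32B distill instead (exact quant/repo varies):

    ```python
    # Minimal sketch: batched offline generation with vLLM.
    # The 7B distill is shown because it fits comfortably; a 24GB card
    # can instead run a quantized 32B distill (repo/quant choice varies).
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",
        gpu_memory_utilization=0.90,  # leave headroom for the desktop/OS
        max_model_len=8192,           # shorter context = more room for batching
    )

    params = SamplingParams(temperature=0.6, top_p=0.95, max_tokens=512)

    # vLLM batches a list of prompts automatically (continuous batching),
    # which is what leaves "room" to serve other people at the same time.
    prompts = [
        "Explain model distillation in two sentences.",
        "Why is hosting open weights cheaper than paying for an API?",
    ]
    for out in llm.generate(prompts, params):
        print(out.outputs[0].text)
    ```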

    I don’t even care what he’s saying now, but don’t believe a word that comes out of Altman’s mouth. He’s just as much of a greedy con man as Musk, trying to gaslight everyone into thinking OpenAI will be relevant in a year, not a hollow, closed shell that sold out its research directive for cheap short-term profit.

    • unmagical@lemmy.ml · 3 hours ago

      24GB VRAM desktop

      That’s a $1000 GPU minimum if you go red team, or $1500 for green.

      • brucethemoose@lemmy.world · 3 hours ago

        Dual 3060s are an option. LLMs can be split across GPUs reasonably well.
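        A minimal sketch of what that split looks like with vLLM’s tensor parallelism (the engine and the 7B model here are illustrative; in practice you’d split a larger quantized distill across the two 12GB cards):

        ```python
        # Sketch: shard one model across two GPUs with tensor parallelism.
        # Each layer's weights are split between the cards, so their VRAM
        # is roughly pooled; a quantized larger distill is the realistic use.
        from vllm import LLM

        llm = LLM(
            model="deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",
            tensor_parallel_size=2,  # one shard per GPU (e.g. dual 3060s)
        )
        print(llm.generate(["Hello from two GPUs."])[0].outputs[0].text)
        ```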

        3090s used to be like $700 used, but ironically they’ve gone up in price. I got mine for around $800 a while ago and stuffed it into a 10L PC.

        Some people buy used P40s. There are rumors of a 24GB Arc B580. Also, AMD Strix Halo APU laptops/mini PCs can host it quite well, with the right software setup… I might buy an ITX board if anyone ever makes one.

        There are 12GB/6GB VRAM distillations too, but 24GB is a huge step up in intelligence.

        • unmagical@lemmy.ml · 3 hours ago

          Totally forgot the 3090 had 24GB. It’s definitely still enthusiast territory though.

          • brucethemoose@lemmy.world · 3 hours ago

            For sure.

            The 14B distillation is still quite good, and usable on like 10GB GPUs. Maybe 8 with the right settings.
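            A minimal sketch of the “right settings” with llama-cpp-python (the GGUF filename is a placeholder; a ~Q4 quant of the 14B plus layer offloading is what makes 8–10GB workable):

            ```python
            # Sketch: run a Q4-quantized 14B distill on a ~10GB GPU.
            from llama_cpp import Llama

            llm = Llama(
                model_path="DeepSeek-R1-Distill-Qwen-14B-Q4_K_M.gguf",  # placeholder
                n_gpu_layers=-1,  # offload all layers; lower this to fit an 8GB card
                n_ctx=4096,       # smaller context also trims VRAM use
            )

            out = llm("Why does quantization shrink VRAM needs?", max_tokens=128)
            print(out["choices"][0]["text"])
            ```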

  • The Quuuuuill@slrpnk.net · 4 hours ago

    WHO CARES!? WHO ACTUALLY GIVES A SHIT!? WHO THE FUCK IS LETTING THIS FUCKING 400 BILLION DOLLAR BURDEN ON THE WORLD DOMINATE NEWS CYCLES FOR ANYTHING OTHER THAN “this guy fucking sucks” WHAT THE FUCK ARE WE ALL DOING!?

    • cm0002@lemmy.world · 3 hours ago

      Well he’s kinda busy on the higher plane of existence, or is he back again…or did he come back and then go back again‽

      Where in the world planes of existence is Dr Daniel Jackson‽‽

      • 667@lemmy.radio · 3 hours ago

        Subtly influencing events to improve the outcome of the Daedalus in its bid to leap between galaxies.