• Pennomi
    39
    5 months ago

    Woah there, I’m not sure I’m ready for that level of commitment.

      • ɐɥO
        4
        5 months ago

        You don't need that much power. Something like an RX 6600 XT, RTX 3060, or RX 580 is plenty.

        • @neutron@thelemmy.club
          3
          5 months ago

          Is support for AMD cards better these days? Last time I checked, it involved verifying ROCm compatibility, because CUDA works only on Nvidia cards.
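
          For reference, this is the kind of check I mean (a rough sketch, assuming PyTorch is installed; on AMD it needs the ROCm build, on Nvidia the CUDA build):

          ```python
          # Does this PyTorch build actually see a GPU?
          import torch

          print("GPU available:", torch.cuda.is_available())  # True on both CUDA and ROCm builds
          if torch.cuda.is_available():
              print("Device:", torch.cuda.get_device_name(0))
          print("ROCm/HIP:", getattr(torch.version, "hip", None))  # set only on ROCm builds
          print("CUDA:    ", torch.version.cuda)                   # set only on CUDA builds
          ```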

          • ɐɥO
            2
            5 months ago

            GPT4All worked out of the box for me.
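
            In case it helps, this is roughly all it took with the Python bindings (a minimal sketch; the model filename is just an example from the GPT4All catalog and is downloaded automatically on first run):

            ```python
            # pip install gpt4all
            from gpt4all import GPT4All

            # Example model; any .gguf model from the GPT4All catalog works.
            model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

            with model.chat_session():
                reply = model.generate("Why is the sky blue?", max_tokens=200)
                print(reply)
            ```

            As far as I know, GPT4All's GPU backend is Vulkan-based, which is why it can use an AMD card without ROCm or CUDA.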