
  • I run a lot of LLMs locally, as well as doing image generation locally with Stable Diffusion.

    The most important factor is the GPU. If you’re gonna do AI stuff with your GPU, it basically has to be a CUDA-capable (i.e. NVIDIA) GPU. You’ll get the most bang for the buck with an RTX 3090 Ti — the amount of VRAM (24 GB on that card) matters a lot. And get at least 64 GB of system RAM.

    If you get this you’ll be set for a year until you learn enough to want better hardware.

    A lot of people try to buy their way out of a lack of knowledge and skill about these things; don’t do that. I’m able to get better results with 7B models than many get with 70B models.
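    As a back-of-the-envelope sanity check on why VRAM decides which models fit, here's a rough sketch. The multipliers (bytes per parameter at a given quantization, ~20% overhead for KV cache and activations) are assumptions for illustration, not exact figures:

    ```python
    def approx_vram_gb(params_billion: float, bytes_per_param: float) -> float:
        """Rough estimate of VRAM (GB) needed to load a model's weights.

        Rule of thumb (an assumption): weights = params * bytes/param,
        plus ~20% headroom for KV cache and activations.
        """
        weights_gb = params_billion * bytes_per_param
        return round(weights_gb * 1.2, 1)

    # A 7B model at 4-bit quantization (~0.5 bytes/param) fits easily
    # in a 24 GB card:
    print(approx_vram_gb(7, 0.5))    # ~4.2 GB
    # A 70B model at the same quantization wants ~42 GB — more than
    # a single 3090 Ti has:
    print(approx_vram_gb(70, 0.5))   # ~42.0 GB
    ```

    That gap is why skill with smaller models often beats throwing hardware at bigger ones.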

    Get LM Studio for the LLMs, and get A1111 (or ComfyUI or Fooocus) for image generation.
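    Once a model is loaded, LM Studio can expose an OpenAI-compatible HTTP server, so you can script against it from anywhere. A minimal stdlib-only sketch — the URL assumes the default local port 1234 (check the app's local-server settings; that port is an assumption here):

    ```python
    import json
    import urllib.request

    # Assumed default endpoint for LM Studio's local OpenAI-compatible server.
    BASE_URL = "http://localhost:1234/v1/chat/completions"

    def build_chat_request(prompt: str, temperature: float = 0.7) -> dict:
        """Build an OpenAI-style chat-completion payload for the local server."""
        return {
            "messages": [{"role": "user", "content": prompt}],
            "temperature": temperature,
        }

    def ask(prompt: str) -> str:
        """POST the payload to the local server and return the reply text."""
        req = urllib.request.Request(
            BASE_URL,
            data=json.dumps(build_chat_request(prompt)).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]
    ```

    Same payload shape works for most local servers that mimic the OpenAI API, so the script is portable between backends.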