theangriestbird,

My experience is with gpt4all (which also runs locally), but I believe the GPU doesn't matter much, since you aren't training the model yourself - you download a pre-trained model and run it locally. The only cap they warn you about is RAM: you'll want at least 16 GB, and even then you might want to stick to a lighter model.
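
If you'd rather script it than use the GUI, the Python bindings are only a few lines. Rough sketch below - the model filename is just one example from their catalog (not a recommendation), and it gets downloaded on the first run:

```python
# pip install gpt4all
from gpt4all import GPT4All

# Example model filename from the GPT4All catalog; the library
# downloads it automatically on first use if it isn't cached.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    reply = model.generate("Summarize why RAM matters for local LLMs.", max_tokens=200)
    print(reply)
```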
