theangriestbird,

My experience is with gpt4all (which also runs locally), but I believe the GPU doesn’t matter because you aren’t training the model yourself. You download a trained model and run it locally. The only cap they warn you about is RAM - you’ll want at least 16 GB, and even then you might want to stick to a lighter model. There's a rough sketch of what that looks like below.
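For anyone curious, here's a minimal sketch of running a downloaded model on CPU with the gpt4all Python bindings (pip install gpt4all). The model filename is just an example - swap in whichever model from the gpt4all list fits your RAM.

```python
from gpt4all import GPT4All

# device="cpu" keeps inference on the CPU, which is why system RAM
# (not GPU VRAM) is the practical limit for model size.
# The filename below is an example; the library downloads it on first use.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="cpu")

with model.chat_session():
    reply = model.generate("Summarize how local LLM inference works.", max_tokens=200)
    print(reply)
```

Smaller quantized models (the "lighter" ones) use correspondingly less RAM, which is the usual workaround on 16 GB machines.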
