shark@lemmy.org to Selfhosted@lemmy.world · English · 4 days ago
What's your self-hosting success of the week?
Shimitar@downonthestreet.eu · English · 4 days ago
NVIDIA Corporation GA104GL [RTX A4000] (rev a1)
From lspci
It has 16 GB of VRAM, not a huge amount, but enough to run gpt-oss 20B and a few other models pretty nicely.
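As a rough sanity check on why 16 GB can fit a 20B-parameter model (my own back-of-envelope arithmetic, not from the comment; the 4-bit weight size and 2 GB runtime overhead are assumptions):

```python
# Back-of-envelope VRAM estimate for a 20B-parameter model.
# Assumptions (not from the comment): ~4-bit quantized weights,
# ~2 GB of runtime overhead (KV cache, CUDA context, buffers).
params = 20e9
bits_per_weight = 4

weights_gb = params * bits_per_weight / 8 / 1e9  # bytes -> GB
overhead_gb = 2.0
total_gb = weights_gb + overhead_gb

print(f"weights ~{weights_gb:.1f} GB, total ~{total_gb:.1f} GB")
# Comfortably under the A4000's 16 GB, which is why it runs,
# while a second resident model of similar size would not fit.
```

This also matches the observation below: with most of the 16 GB occupied by one model, switching models forces a full unload/reload from disk rather than keeping both resident.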
I noticed that it’s better to stick to a single model; I imagine unloading and reloading a model in VRAM takes time.