AMillionMonkeys@lemmy.world to Selfhosted@lemmy.world • Consumer GPUs to run LLMs · 2 days ago

> I would prefer to have GPUs for under $600 if possible

Unfortunately not possible for a new Nvidia card (you want CUDA) with 16GB VRAM. You can get them for ~$750 if you're patient. This deal was available for a while earlier today:
https://us-store.msi.com/Graphics-Cards/NVIDIA-GPU/GeForce-RTX-50-Series/GeForce-RTX-5070-Ti-16G-SHADOW-3X-OC

Or you could try to find a 16GB 4070 Ti Super like I got. It runs Deepseek 14B and stuff like Stable Diffusion no problem.
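Rough numbers back up why 16GB is comfortable for a 14B model. Here's a back-of-the-envelope VRAM sketch, assuming 4-bit quantized weights and ~2 GB of runtime overhead (both figures are my assumptions, not from the comment):

```python
# Back-of-the-envelope VRAM estimate for a locally run, quantized LLM.
# Assumptions (mine, not from the comment): weights quantized to 4 bits,
# plus ~2 GB of overhead for KV cache, activations, and runtime buffers.

def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 2.0) -> float:
    """Approximate GB of VRAM needed to hold the model and run inference."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1024**3 + overhead_gb

# A 14B-parameter model at 4-bit quantization:
print(round(estimate_vram_gb(14, 4), 1))  # comfortably under the card's 16 GB
```

Longer context windows grow the KV cache beyond the flat overhead assumed here, so treat this as a floor, not a guarantee.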
AMillionMonkeys@lemmy.world to Selfhosted@lemmy.world • Rant! 100GB Log file in Nextcloud. · 1 month ago

Everything I hear about Nextcloud scares me away from messing with it.
AMillionMonkeys@lemmy.world to Selfhosted@lemmy.world • What do you guys do about usernames / passwords for your local services? · 8 months ago