applepie@kbin.social to Selfhosted@lemmy.world • Self hosting an LLM for research • 6 months ago
I didn't say any. Based on what he is asking, he can't just run this on an old laptop.
applepie@kbin.social to Selfhosted@lemmy.world • Self hosting an LLM for research • 6 months ago
You would need a 24 GB VRAM card just to start this thing up, and it would probably still yield shoddy results.
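For context on where a figure like 24 GB comes from, here is a rough back-of-envelope sketch. It assumes the common rule of thumb that inference VRAM is roughly parameter count times bytes per parameter, plus an overhead factor (the 1.2x here is an assumption) for activations and KV cache:

```python
def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     overhead_factor: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weights * precision, plus ~20% overhead
    (assumed) for activations and KV cache."""
    return params_billion * bytes_per_param * overhead_factor

# A 13B-parameter model at fp16 (2 bytes/param) needs roughly:
print(round(estimate_vram_gb(13, 2), 1))    # ~31 GB -> over a 24 GB card
# The same model quantized to 4-bit (0.5 bytes/param):
print(round(estimate_vram_gb(13, 0.5), 1))  # ~8 GB -> fits comfortably
```

This is why quantized models (e.g. GGUF/4-bit) are the usual route for consumer hardware, while full-precision weights of mid-size models already exceed a single 24 GB card.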