That’s the technical explanation for the changes, not an explanation for closing the discussion altogether.
> @bitwarden bitwarden locked and limited conversation to collaborators
They also locked the thread 16 hours ago (as of writing this comment), with no explanation.
A very useful tip for technical images (e.g., lab reports or research figures): export whatever graph you created as .svg, then do the prettifying touches in Inkscape. It is faaaar easier than doing it in code.
Also, always export the .svg, even if you’re not gonna use it. You never know when you want to do a very small correction, and it will save you quite some time.
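If you’re plotting with matplotlib, for instance, exporting both the .svg and a raster copy is a one-liner each. A minimal sketch (the data and filenames here are just placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder data just for the example
x = np.linspace(0, 10, 200)
y = np.sin(x)

fig, ax = plt.subplots()
ax.plot(x, y, label="sin(x)")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.legend()

# Keep the vector version for later touch-ups in Inkscape,
# plus a raster version for quick previews or slides.
fig.savefig("figure.svg")
fig.savefig("figure.png", dpi=300)
```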
The 1.5B version can be run on basically anything. A friend of mine runs it on a shitty laptop with a 512MB iGPU and 8GB of RAM (inference takes about 30 seconds).
You don’t even need a GPU with a lot of VRAM, since you can offload part of the model to system RAM (slower inference, though).
I’ve run the 14B version on my AMD 6700XT GPU and it only takes ~9GB of VRAM (inference over 1k tokens takes 20 seconds). The 8B version takes around 5-6GB of VRAM (inference over 1k tokens takes 5 seconds)
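For anyone who wants to try the offloading approach, here’s a minimal sketch using the llama-cpp-python bindings with a quantized GGUF file. The model path and the number of GPU layers are placeholders you’d adjust to whatever fits your VRAM:

```python
from llama_cpp import Llama

# Hypothetical quantized model file; substitute the GGUF you actually downloaded.
llm = Llama(
    model_path="./model-14b-q4_k_m.gguf",
    n_gpu_layers=30,   # layers kept in VRAM; the rest run from system RAM
    n_ctx=2048,        # context window size
)

output = llm(
    "Explain why the sky is blue in one sentence.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```

Lowering `n_gpu_layers` trades speed for VRAM: fewer layers on the GPU means less VRAM used but slower inference.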
The numbers in your second link are waaaaaay off.