

Unless OP is running a data center, there’s not really much of a power increase from running a local Ollama.
Hi! I’m Katherine, or webkitten. I’ve been on the internet since our family got our first computer - a Tandy Sensation.
Yes, I went to computer camp as a kid and learned how to program in BASIC on a Radio Shack TRS-80 Model 4.
I’m trans, queer, and bisexual. #actuallyautistic
I started programming with PHP in the mid-90s and haven’t stopped. I’m an advocate for the open web; I used Netscape for as long as I can remember.
I have an obsession with Hello Kitty, Moogles, and Squishmallows.


This is late, but I switched from AWS to Hetzner and Domeneshop for VPS hosting and DNS, and I’ve been quite happy. I have a dedicated VPS with far more power for far less than I paid through AWS.
Also using Hetzner’s storagebox as a Google Drive replacement.


There’s a really good docker image I use for Rustdesk at home. I’m thinking of just setting it up on my mom’s laptop and then dropping a script on her desktop to toggle it on or off, depending on whether she needs help (so she doesn’t have to fiddle with the commands).
But, yeah, the Rustdesk docker image is super easy to use along with the client. Then I just set up Tailscale on my mom’s computer and invite her to my network.
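For anyone curious, the self-hosted setup is roughly two containers from the official `rustdesk/rustdesk-server` image: hbbs (the ID/rendezvous server) and hbbr (the relay). A minimal sketch, assuming host networking and a local `./rustdesk` directory for keys:

```shell
# ID/rendezvous server (hbbs)
docker run -d --name hbbs --net host \
  -v ./rustdesk:/root rustdesk/rustdesk-server hbbs

# Relay server (hbbr)
docker run -d --name hbbr --net host \
  -v ./rustdesk:/root rustdesk/rustdesk-server hbbr
```

Then point the client’s “ID server” setting at the host (or its Tailscale address) and paste in the public key hbbs writes to that directory.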


Pretty flawless update from the apt repo on my end.
Server version 10.11.7
Oh my bad; I thought it wasn’t developed anymore! Makes sense!
Out of curiosity, is there a reason you picked K-9 over Thunderbird?
Aren’t they the same thing these days (genuine question)?
I have a GL-AX1800 and I’ve been happy with it; going to get another for my mum.


What troubles were you having with Baikal? I generally let mine just sit with a checked out tag from Git.


When you go to a shelf of recommendations, you’re not picking from a human; you’re picking from a shelf.


Seriously; local AI use is what everyone should strive for, not only for privacy but because it’s better than relying on a large data centre, and the power use for Ollama is negligible.


Is it any different than getting movies based on recommendations from employees at video stores?
You could probably get away with using gemma3:4b or phi3.5.
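Trying one of those small models is just a pull and a run with the Ollama CLI; a quick sketch (model tags assumed available in the Ollama library):

```shell
# Download a small model, then ask it a one-off question
ollama pull gemma3:4b
ollama run gemma3:4b "Summarize the plot of Casablanca in one sentence."
```

Either of those should fit comfortably in a few GB of RAM, which is why the power draw stays negligible on ordinary hardware.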