

Either or both could also be accomplished with this setup through encrypted partitions or flash drives, or just manually encrypted files plus manual backups. Personally, I’d just use KDE Vaults for ease of use.
Just another reason to grow weed with solar.
A bit of both for small decisions. I’d trust it with little things, and for more important stuff it could work like the coin-flip trick, where you flip a coin and figure out which option you actually wanted by gauging your reaction to the result.
Nvm, the hyperlink had it, but I didn’t see.
Server
It’s down for me.
An interesting model that I don’t know of being implemented: a bounty system where you donate to a feature request or issue and whoever implements or patches it gets the money, plus a “tax” so that some percentage of every donation goes to maintenance, server costs, etc.
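A minimal sketch of how that split could work (the 10% tax rate and the function name are just assumed examples, not part of any real platform):

```python
def split_donation(amount: float, tax_rate: float = 0.10) -> tuple[float, float]:
    """Split a donation into a bounty for whoever patches the issue
    and a maintenance 'tax' (tax_rate is a hypothetical example value)."""
    tax = amount * tax_rate
    bounty = amount - tax
    return bounty, tax

bounty, tax = split_donation(100.0)
print(bounty, tax)  # bounty ≈ 90.0, tax ≈ 10.0 with a 10% tax
```

The platform would hold each bounty in escrow and pay it out when a maintainer accepts the patch, while the tax pool covers hosting.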
Eh, I sometimes spin up a temporary docker container for some nonsense on a separate computer. I usually just go for it after checking no one is on and backing up necessary data.
A full git commit is what I’m thinking of.
From what I’ve heard, it’s made by someone who really doesn’t know what they’re doing, and it has a ton of obvious security issues that the dev refuses to acknowledge. It really isn’t something people should actually use.
Considering my boot drive just died: backups. I also wanna get a Fractal Node 804 and cram tons of HDDs in it, probably as a new build with ECC, perhaps transitioning the current server to a backup server. Also, my directory structure for media is a jumbled mess of incomprehensible nonsense; I should fix that. And I lost all the torrents I was uploading — I still have the media, but can’t keep seeding after the drive failure.
I misread that and thought Wayland went in a very different direction for a second.
I don’t like that all of the questions are mandatory. For some of them, I just haven’t done enough with whatever it is to have an opinion and would not be able to provide good data.
joplin
I have an Arc for transcoding, and I had to set the device to /dev/dri without the renderD128 part. If I were you, I’d just use the 2060. If it’s there for llama or something, I’d still try it and see how it handles doing both at once, since separate parts of the GPU should be handling each workload.
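In a Docker Compose setup, that device mapping might look something like this (the service name and image are assumptions — adjust for whatever media server you run):

```yaml
services:
  jellyfin:                    # hypothetical service name; yours may differ
    image: jellyfin/jellyfin
    devices:
      # Pass the whole DRI directory instead of just /dev/dri/renderD128;
      # this is what worked for the Arc in my case.
      - /dev/dri:/dev/dri
```

Passing the whole /dev/dri directory exposes both the card node and the render node, which sidesteps driver setups where renderD128 alone isn’t enough.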
Yeah, I like it too. My only issue is Ollama’s lack of Intel support; I’ve been watching issue 1590 on their GitHub. For now I have a 1050 Ti in a cardboard-box PC, with the rest of the hardware being 10+ years old and a mixed set of RAM totaling 12G. It also has a 100Mbit NIC, so I can’t take advantage of my full internet speed when downloading models. The worst part is they could support Intel, but haven’t merged the solution because of an issue with the Windows Intel drivers. Linux is fine, but I can’t have it. I wasn’t planning to rant, but I already typed it, so… enjoy?
I thought the game was 2048.
But it isn’t. It sends me an nginx error, and the nginx instance is on that server, so the server isn’t completely down.
Those aren’t worrying temps for most stuff. If you have mechanical HDDs, you may want to check those specifically.