

Ansible.
I use Docker for most of the services and Ansible to configure them. In the future I’ll migrate the server system to NixOS and might slowly port my Ansible setup over as well, but for the time being Ansible works with relative ease.


It would be very tedious to type all of that on my TV, even if I could get mpv on it and my TV/projector had the hardware to decode the media, not to mention the difficulty of keeping my watch history in sync between different devices or for different people. You’re clearly not understanding the problem Jellyfin solves; it’s like someone saying “why do we need Lemmy when we can write files on our Samba shares” (which, by the way, you should definitely not expose to the internet).


Yes, Google has misreported my websites in the past, all of which were valid, but the person I’m replying to seemed to assume no SSL is a requirement of the feature, and doesn’t seem to understand that a wrong/missing SSL certificate is indistinguishable from a phishing attack, and that the SSL error page is the one that warns you about phishing (with reason).


It is for pull requests. A user makes a change to the documentation, they want to be able to see the changes on a web page.
So? What does that have to do with SSL certificates? Do you think GitHub loses SSL when viewing PRs?
If you don’t have them on the open web, developers and pull request authors can’t see the previews.
You can have them in the open, but without SSL you can’t be sure what you’re accessing, i.e. it’s trivial to stand up a malicious site in its place and MITM whoever tries to access the real one.
The issue they had was being marked as phishing, not the SSL certificate warning page.
Yes, a website without SSL could very well be a phishing attack: it means someone might be impersonating the real website, so it shouldn’t be trusted. Even if by a fluke of chance you hit the right site, all of your communication with it is unencrypted, so anyone on the path can read it in the clear.


While YunoHost is a great way to get started, I strongly encourage you to understand basic concepts, like Docker, and maybe try to run something outside of it for fun. While it’s not even remotely the same thing, since YunoHost is just the OS and “app store”, you would be tied to that ecosystem much the same way you are to Google now. Not to say that YunoHost would have any control over your stuff, but you would be dependent on them for what you can self-host.


Ok, so, there are multiple things you should be aware of.
First of all, you’ve set that DNS record to 10.0.0.41; that IP range is reserved for LAN use, just like 192.168.0.41 would be. Only people on the same local network as you will be able to access it.
Also, your home router usually doesn’t use the 10.x.x.x range, but some ISPs do in their internal network, which means your router doesn’t get a public internet IP; your ISP’s router does, and it shares the same external IP between different houses. You would need to use something like https://www.whatsmyip.org/ to find out what your external IP is.
But there’s more: since you don’t control that router, putting that external IP in the DNS won’t work either.
You need something more complicated; I recommend you read up on Cloudflare Tunnels, for example.
And one final piece of advice: don’t share your URLs with randoms on the internet. Security by obscurity is not security and all that, but publicly advertising your URL is asking for trouble; even without doing that you will see constant attempts to log into your servers.
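If you want to quickly check whether an address falls in one of those LAN-reserved ranges, Python’s standard library can do it for you:

```python
# Check whether an IP is in a private (LAN-reserved) range
# using Python's stdlib ipaddress module.
import ipaddress

for addr in ["10.0.0.41", "192.168.0.41", "8.8.8.8"]:
    ip = ipaddress.ip_address(addr)
    kind = "private (LAN only)" if ip.is_private else "public"
    print(f"{addr}: {kind}")
```

The first two addresses report as private, so a public DNS record pointing at them is useless to anyone outside your network.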


On paper I should love Authelia; I’m a sucker for YAML-configured services: I can write a couple of files in my Ansible and boom, everything works… However, I never had much luck setting Authelia up. Authentik, on the other hand, was very painless, albeit with manual (via UI) configuration. I don’t do anything crazy, so either of them would work for me; I just failed at setting up Authelia, tried Authentik, and have had no reason to change.


What problem are you having? Docker is very straightforward, just copy the compose file and run a command.
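As a sketch, a compose file for something like Jellyfin is roughly this (image tag, ports, and paths here are illustrative, adjust to your setup):

```yaml
# docker-compose.yml - illustrative example, not an official config
services:
  jellyfin:
    image: jellyfin/jellyfin:latest
    ports:
      - "8096:8096"            # web UI
    volumes:
      - ./config:/config       # app configuration
      - /path/to/media:/media:ro
    restart: unless-stopped
```

Then `docker compose up -d` and the service is running.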


Kodi is a graphical app, like Firefox, so you won’t use Docker for it.
I’ve had Jellyfin running for years too and it has never broken for me. I use the LinuxServer image, so maybe they delay the updates a bit?.. Immich, on the other hand, has broken so many times that nowadays it’s the only container I don’t keep at latest (and I know using latest is bad practice, I understand the reasons, but the convenience of not worrying about versions beats all that for me).
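Pinning is just a matter of swapping the tag in the compose file, something like this (the version number is made up, check the actual releases):

```yaml
services:
  immich-server:
    # pinned to a specific release instead of :latest,
    # so updates only happen when you bump this line
    image: ghcr.io/immich-app/immich-server:v1.0.0
```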


Configuration is much easier; e.g. this is the full config you need to expose Nextcloud on nextcloud.example.com (assuming Caddy can reach Nextcloud using the hostname nextcloud):
nextcloud.example.com {
reverse_proxy nextcloud
}
Compare that to nginx configs, which need a template for each different service (although, to be fair, they’re mostly the same).
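For reference, the rough nginx equivalent of the Caddy config above looks something like this (hostnames assumed as before, and you’d still need to set up TLS certificates separately, which Caddy handles automatically):

```nginx
server {
    listen 80;
    server_name nextcloud.example.com;

    location / {
        proxy_pass http://nextcloud;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```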


My point is that of those 120, probably 110 have never been compromised nor forced you to change the password due to expiration policies. The remaining 10 are the ones that require some mental gymnastics, so while the problem exists, it’s not as serious as it sounds. I probably have more than 120 identities using this method since I’ve been using it for years, and I don’t think I ever had to use the counter. It’s a matter of being consistent in how you think about websites; for example, take how you refer to a site, slugify it, and use that for the field, so you would use spotify, netflix, amazon-prime.
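The general idea behind this kind of deterministic generator can be sketched in a few lines (a simplified illustration of the concept, not LessPass’s actual algorithm):

```python
# Simplified sketch of deterministic password generation: the same
# (site, username, master password, counter) always yields the same
# password, so nothing needs to be stored anywhere.
# NOT LessPass's real algorithm, just the idea.
import hashlib
import string

def derive_password(site, username, master, counter=1, length=16):
    # Stretch the inputs into raw bytes with PBKDF2.
    raw = hashlib.pbkdf2_hmac(
        "sha256",
        master.encode(),
        f"{site}:{username}:{counter}".encode(),
        100_000,
    )
    # Map the bytes onto a printable alphabet.
    alphabet = string.ascii_letters + string.digits + "!@#$%"
    return "".join(alphabet[b % len(alphabet)] for b in raw[:length])

print(derive_password("spotify", "me@example.com", "correct horse"))
```

Changing any field, including the counter, produces a completely different password, which is why consistent naming (the slugified site name) matters so much.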


Yeah, it’s probably a legal thing; rreading-glasses is just metadata for books, completely legal, but readarr’s legality is less clear, so maybe they’re trying to prevent issues.
Also I didn’t understand what rreading-glasses is and why you need it
Say you want to grab a book by Isaac Asimov. You type the name of the book in readarr’s search bar, and readarr contacts a metadata provider to show you cover images, author, date, etc. Then when you select the book, readarr uses that metadata to search for downloads and ensure you’re getting the correct book and not another random book with the same name.
The problem is that readarr uses a closed-source API for its metadata, and it’s constantly offline, which makes readarr impossible to use. Luckily they allow you to customize the URL for the API, and rreading-glasses is an open-source implementation of that API that you can use as a drop-in replacement.


Yup, but most of that is easily solvable by being consistent, e.g. always use lowercase and your email (even if it’s not the login for that site). But yes, you need to know to be consistent so it’s a good point to make.


I noticed that my Ansible playbook failed to do a docker pull for readarr; I just commented it out and was going to investigate further today. This sucks, especially because rreading-glasses did in fact completely solve the issue they’re facing. Not sure why they didn’t consider migrating to it officially; it’s only a config change.


It’s strange how I never see this mentioned anywhere, but there’s a way to get unique, secure passwords for every site/app without needing to store them anywhere. It’s called LessPass, and it essentially generates passwords based on 3 fields (site, username, master password). It works relatively well, and because the advantages are quite obvious I’ll list the potential downsides:


Yup, just drop the markdown files in the folder where SilverBullet stores its markdown files and they’re accessible. BTW the format to reference a file is [[path/to/file]], and you can reference nonexistent files and they will get created when you navigate to them.


I used to do that, but eventually found out about https://silverbullet.md/. It’s still just markdown files, but I can edit them on the go with my phone.
Plus it has some nice extra syntax to query documents that’s quite handy.


Sure, but they have a setting to fix this by letting Plex know that the 192.168.0.x range is your local network (as if it needed that), except it’s behind a paywall.


How does it work on Android? One of my main use cases for Nextcloud is being able to access some of my PDFs on my phone; the app seems to be focused on uploading, which I do sometimes from my phone but much less often.