• 0 Posts
  • 14 Comments
Joined 2 years ago
Cake day: July 1st, 2023

  • WireGuard doesn’t necessarily need to have those limitations, but it will depend in part on how your VPN profile is set up.

    If you configured your WireGuard profile to always route all traffic over the VPN, then yeah, you won’t be able to access local networks. And maybe that’s what you want, in which case fine :)

    But you can also set the profile to only route traffic that is destined for an address on the target network (i.e. your home network), and the rest will route as normal.

    However, this second type of routing only works properly when there are no address conflicts between the network you are on (i.e. someone else’s WiFi) and your home network.

    For this reason, if you want to do this, it’s best to set up your own home network to avoid the common default ranges that almost everyone uses, i.e. 192.168.0.* and 10.0.0.*

    I reconfigured my home network to 192.168.22.* for that reason. Now I never hit conflicts, and the VPN can stay on all the time but only gets used when needed :)
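
    As a rough sketch, a split-tunnel wg-quick profile looks something like this - the keys, endpoint, and tunnel subnet below are placeholders, and the 192.168.22.* LAN is just my setup:

        [Interface]
        # Client's address on the tunnel itself (a separate small subnet - placeholder)
        Address = 10.222.0.2/32
        PrivateKey = <client-private-key>

        [Peer]
        PublicKey = <home-server-public-key>
        # Placeholder dynamic DNS name and port for the home server
        Endpoint = home.example.net:51820
        # Split tunnel: only the tunnel subnet and the home LAN route over the VPN.
        # A "route everything" profile would use AllowedIPs = 0.0.0.0/0, ::/0 instead.
        AllowedIPs = 10.222.0.0/24, 192.168.22.0/24
        PersistentKeepalive = 25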


  • I agree that it’s a huge fuck up, my comment wasn’t in defence of the post office, just a related story :)

    Whenever I have delivered code for a client, it has always been done in a way where the client has complete ownership of the code and can maintain it themselves later (or ask a different company that isn’t us to come and do it). That’s the only sustainable approach, and all companies should absolutely demand that work done for them is done this way.


    I did consultancy work as part of renewing and replacing ancient software systems for an insurance company, and it’s amazing how little people actually know about how their own business processes are supposed to work.

    Orgs are in the position where everyone who built a system is gone, and all the current people who work there defer to the system for how the processes work, without properly understanding the rules. And so the system itself becomes the arbiter of correctness.

    This is obviously horrible, because it ends up with nobody daring to touch the current system in case they break it in some way nobody understands.

    We ended up speaking to people across the whole business to painstakingly work out what the rules really were, putting together a new system and effectively “dual running” it side-by-side with the old system, so we could compare outputs and make sure they were the same. In some cases they were different, and in some of those cases it was because the old system was wrong and nobody had noticed!
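
    The comparison side of the dual running was conceptually nothing fancy - roughly this sketch (the names here are made up for illustration): replay the same cases through both systems and flag any disagreement.

        # Illustrative sketch only - the real systems, cases and plumbing were
        # more involved. The idea: run identical inputs through both systems
        # and collect every case where the outputs disagree.
        def run_dual_comparison(cases, legacy_system, new_system):
            mismatches = []
            for case in cases:
                old_result = legacy_system.process(case)
                new_result = new_system.process(case)
                if old_result != new_result:
                    mismatches.append((case, old_result, new_result))
            return mismatches

        # Each mismatch then gets investigated by hand - sometimes the new system
        # had a bug, and sometimes it was the old system that had been wrong for years.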

    It’s a mess.


    Back in my days working as a .NET developer on Windows 7, I came into work one morning to find a colleague fuming that his machine had died on him.

    He spent the whole morning reinstalling Windows and getting his environment set back up, and then finally pulled the branch he was working on and got back to work. Ran his test suite and bam, machine crashes.

    It was only at that point that the penny dropped. We took a look at his branch, and sure enough he’d accidentally written a test that, when run, deleted his entire C: drive!

    That particular lesson made me very careful when writing code that does things with the filesystem.
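
    These days, if a test has to touch the filesystem at all, I keep everything inside a throwaway directory and refuse to delete anything outside it - a rough sketch of the habit (Python here purely for illustration, the original was .NET):

        import shutil
        import tempfile
        from pathlib import Path

        def test_cleanup_stays_inside_sandbox():
            # Work inside a dedicated temporary directory instead of real paths,
            # so a bad path calculation can't wipe anything outside it.
            sandbox = Path(tempfile.mkdtemp())
            try:
                target = sandbox / "output"
                target.mkdir()
                (target / "report.txt").write_text("hello")

                # Belt and braces: only delete paths that live under the sandbox.
                assert sandbox in target.parents
                shutil.rmtree(target)
                assert not target.exists()
            finally:
                shutil.rmtree(sandbox, ignore_errors=True)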



  • Personally, I don’t feel that analogy is a fair comparison.

    Begging a dev for new features for free would definitely be entitlement, because it’s demanding more, but what OP is upset about is a reduction in the service they already had.

    I don’t think any free tier user of any service would have a right to be upset if new features were added only for paying customers, but changing the existing free tier level is different.

    In my opinion, even if you aren’t paying for it, the free tier is a service level like any other. People make decisions about whether or not to use a service based on whether the free tier covers their needs. Companies will absolutely try to upsell you to a higher tier and that’s cool, that’s business after all, but they shouldn’t mess around with what they already offered you.

    When companies offer a really great free tier and then suddenly reduce what’s in it, in my opinion that’s a baiting strategy. They used a compelling offering to intentionally draw in a huge userbase (from which they benefit) and build up the popularity and market share of the service, and then chopped it to force users - who at this point may be embedded and find it difficult to switch - to pay.

    So yeah, it doesn’t matter in my opinion that the tier is free. It’s still a change in what you were promised after the fact, and that’s not cool regardless of whether there is money involved or not.




  • The fade should be slow and subtle. At first the client thinks they are just imagining it, but then they start getting customer support calls about the site being faded, and their bosses are pointing it out too in meetings, and as it happens more and more the panic really begins to set in.

    Finally, when there’s barely anything left of the site, they reach out to you in desperation and ask you to urgently fix the problem, and you just shrug your shoulders sympathetically and explain that it’s happening because they haven’t paid - not in a way that suggests you’re doing it on purpose, but as if it’s simply an unavoidable natural consequence: just like your power gets cut if you don’t pay your electricity bill, the site is slowly “dying” and fading away.

    They’d pay so fast.


  • My biggest problem is security updates.

    The “x years of upgrades” model is okay when it’s for an app, where you can just keep using it with the old feature set and no harm is done.

    But Unraid isn’t an app, it’s a whole operating system.

    With this new licensing model, over time we will see many people sticking with old versions because they don’t want to pay to renew - and then what happens when critical security vulnerabilities are found?

    The question was already asked on the Unraid forum thread, and their answer on whether they would provide security updates for non-latest versions was basically “we don’t know” - because of how much effort they would need to spend to individually fix all those old versions, and the team size it would require.

    It’s going to be a nightmare.

    Any user who cares about good security practice is effectively going to be forced to pay to renew, because the alternative is to leave themselves potentially vulnerable.



  • I agree as far as the feature set is concerned, but software unfortunately doesn’t exist in a vacuum.

    A vulnerability could be discovered that needs a fix.

    The operating system could change in such a way that eventually leads to the software not functioning on later versions.

    The encryption algorithms supported by the server could be updated, rendering the client unable to connect.

    It might be a really long time before any of that happens, but without a maintainer, that could be the end.


  • The clue with Unraid is in the name. The goal was all about having a fileserver with many of the benefits of RAID, but without actually using RAID.

    For this purpose, Unraid uses FUSE to present a virtual filesystem which brings together files from multiple physical disks into a single view.

    Each disk in an Unraid system just uses a normal single-disk filesystem on the disk itself, and Unraid distributes new files to whichever disk has space, yet to the user they are presented as a single volume. (You can also see the raw disk contents and manually move data between disks if you want to - the fused view and the raw per-disk views are just different mounts in the filesystem.)
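
    Conceptually the merged view is doing something like this toy sketch (not Unraid’s actual implementation - the mount paths are just examples):

        from pathlib import Path

        # Toy illustration of the "one view over many disks" idea. Each disk keeps
        # its own ordinary filesystem; the merged view just unions their contents.
        DISK_MOUNTS = [Path("/mnt/disk1"), Path("/mnt/disk2"), Path("/mnt/disk3")]

        def merged_listing(relative_dir: str) -> dict:
            """Map each file name under relative_dir to the disk that actually holds it."""
            merged = {}
            for disk in DISK_MOUNTS:
                candidate = disk / relative_dir
                if candidate.is_dir():
                    for entry in candidate.iterdir():
                        merged.setdefault(entry.name, entry)
            return merged

        # e.g. merged_listing("media/photos") shows files that physically live on
        # different disks as if they were all in one folder.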

    This is how Unraid allows you to easily add new drives of any size without a rebuild, while still tolerating the failure of a single disk via a parity disk - as long as the parity disk is at least as large as the biggest data disk.
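
    The parity mechanics are the classic XOR trick - here’s a tiny byte-level sketch of why one failed data disk can be rebuilt (real parity is computed across whole devices, hence the size requirement):

        from functools import reduce

        def xor_blocks(*blocks: bytes) -> bytes:
            # XOR corresponding bytes across all blocks.
            return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

        # Stand-ins for the contents of three data disks.
        disk1 = bytes([0x11, 0x22, 0x33])
        disk2 = bytes([0xAA, 0xBB, 0xCC])
        disk3 = bytes([0x01, 0x02, 0x03])

        parity = xor_blocks(disk1, disk2, disk3)

        # If disk2 dies, XOR-ing the parity with the surviving disks recovers it.
        rebuilt_disk2 = xor_blocks(parity, disk1, disk3)
        assert rebuilt_disk2 == disk2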

    Unraid has also now added ZFS zpool support, and as a user you have the choice of which sort of array you want - Unraid’s own or ZFS.

    Unraid is absolutely not targeted at enterprise where a full RAID makes more sense. It’s targeted at home-lab type users, where the ease of operation and ability to expand over time are selling points.


  • Been using Unraid for a couple of years now as well, and really enjoying it.

    Previously I was using ESXi and OMV, but I like how complete Unraid feels as a solution in itself.

    I like how Unraid has built-in support for spinning up VMs and Docker containers, with UI integration for both.

    I also like how Unraid’s FUSE filesystem lets me build an array from disks of mismatched capacities, and arbitrarily expand it. I’m running two servers so I can mirror data for backup, and it was much more cost-effective to keep some of the disks I already had rather than buying everything new.