Testing the Jellyfin photos thing out now. I don’t know if it’s working right yet, but when I first looked at it, the issue was that it seemed very video focused. Either way, I guess I’m learning Docker after all.
Fair enough. Last time I tried Docker, which was a long time ago, I had all sorts of issues with permissions and persistence. I guess it’s probably better now.
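For what it’s worth, the permissions and persistence parts mostly come down to two flags these days. A minimal sketch, assuming a generic gallery image and made-up paths (none of these names are from your setup):

```
# bind mounts keep the data outside the container (persistence),
# --user avoids root-owned files on the host (permissions)
# image name, paths, and UID:GID are placeholders
docker run -d \
  --name gallery \
  --user 1000:1000 \
  -v /mnt/photos:/photos:ro \
  -v /srv/gallery-config:/config \
  -p 8080:80 \
  some/gallery-image
```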
I don’t want a research project. I was just hoping there was an easy-to-use program to make viewing nicer than raw Samba shares. Maybe I just need a set of programs that will display thumbnails over Samba.
Well, what you could do is run a DNS server so you don’t have to deal with IPs. You could likely move whatever server to port 443 or 80, depending on whether you’re internal-only or need SSL. Also, something like ZeroTier won’t route your whole connection through your home internet if you set it up correctly - consider split tunneling. With something like ZeroTier, it’ll only route the ZeroTier network you create for your devices.
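To be concrete about the split-tunnel part, this is roughly what it looks like from the client side (the network ID here is just a placeholder). By default the client only accepts the managed routes for that network, not a default route:

```
# join the virtual network (ID is a placeholder)
zerotier-cli join 1234567890abcdef
zerotier-cli listnetworks

# allowDefault=0 is the default: don't accept a 0.0.0.0/0 route from the network,
# so only traffic for the ZeroTier subnet goes over ZeroTier (split tunnel)
zerotier-cli set 1234567890abcdef allowDefault=0
```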
Syncthing will work with pretty large amounts of data, unless you mean that having the storage space on each device is the “won’t work” issue.
Noise doesn’t matter in a data center, which is where these switches live. The power draw might be higher than a 1 Gbit switch, but it’s in line with any dual-power-supply enterprise switch, really.
I will have to see that. I would be concerned about pushing Cat5e that fast. I’m not sure about Cat6, but again, that speed isn’t fast enough to justify buying new cards for the computers, and if we were buying cards, 10G fiber cards are likely cost-competitive now that servers are dumping them as obsolete.
Yeah, I think 2.5G is really searching for a market that may not exist. For home use, 1 Gbit is in general plenty fast, and it maxes out most US customers’ Internet too. For enterprise use, 10G is common and cheap, and the cards to get an SFP+ port into any tower or server are really cheap. Enterprise is working out how to do 100G core cheaply enough, and looking for at least 25G on performance servers, if not 100G in some cases. If you’ve got the budget you can roll 400G core right now at “not insane” pricing.
2.5G to the generic office (which might well be remote these days) likely means re-wiring, and it’s unnecessary. And that’s only if you don’t find 802.11ac WiFi sufficient, i.e. sub-1G.
Fair enough. Last time I checked, I saw enough people warning against btrfs that I just figured it wasn’t going to catch up to ZFS and kind of forgot about it. I realize now that may have been a while ago, and since it’s not in RHEL, I haven’t considered it enterprise-ready - which has recently been changing with Red Hat / IBM losing their darn minds, but my “working knowledge” is limited on stuff I don’t watch all the time.
I like XigmaNAS if you want a simple NAS and ZFS.
I’m going to say avoid btrfs; it’s still basically in beta. I want to see wide use in industry, and the kind of functionality competitors have - a la mdadm and ZFS vdevs - before I’d trust it.
I agree with everyone saying don’t let the TV access the internet. Instead, get a Raspberry Pi or Le Potato or the like running LibreELEC (or whatever the current successor OS is) with Kodi. Point it at an SMB share and bam.
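If it helps, Kodi just takes a path like smb://nas/media when you add a source. You can sanity-check the share from any Linux box first; the hostname and share name here are made up:

```
# list the shares anonymously, then browse the one Kodi will point at
smbclient -L //nas.local -N
smbclient //nas.local/media -N -c 'ls'
```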
The “right answer” is to copy all the content to another device, upgrade the disks, then copy the content back to a new array that can take advantage of the larger disks. Or, even safer, set up a completely new device in parallel, copy the data over, and decommission the old device.
Some RAID systems will let you do what you describe but won’t let you expand the usable storage space. Some are even picky about which disks they’ll take. It really depends on the RAID system.
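If the arrays happen to be ZFS, the copy-out / copy-back dance above is basically a snapshot plus send/receive. A rough sketch, assuming an old pool called tank and a new pool called bigtank built on the larger disks (names are made up):

```
# snapshot everything and replicate it to the new pool
zfs snapshot -r tank@migrate
zfs send -R tank@migrate | zfs receive -F bigtank

# once the copy is verified, retire the old pool and take over its name
zpool export tank
zpool export bigtank
zpool import bigtank tank
```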
All you really have to do with ZeroTier is set up your network routing appropriately, and the end server does reverse NAT, presumably via iptables in most situations. ZeroTier is just a virtual cable; everything else is the same as if you had plugged an Ethernet cable between the two endpoints, and you can run network services over it just the same. Oh, and you do have to enable the ZeroTier client to route public IPs for this to work.
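A rough sketch of what that reverse NAT can look like on a public-facing box forwarding inbound web traffic across the ZeroTier link to a machine at home. Interface names, the ZeroTier address, and the port are all placeholders:

```
# let the box forward packets at all
sysctl -w net.ipv4.ip_forward=1

# inbound 443 on the public interface gets handed to the home server's ZeroTier IP
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 443 -j DNAT --to-destination 10.147.20.5:443
iptables -A FORWARD -p tcp -d 10.147.20.5 --dport 443 -j ACCEPT

# rewrite the source on the way out so replies come back through this box
iptables -t nat -A POSTROUTING -o ztabcdef12 -j MASQUERADE
```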
In your use case, it probably doesn’t matter. I usually suggest DIY with more disks and XigmaNAS for ZFS with RAIDZ2 or RAIDZ3, depending on disk size and count. The cost is usually in the disks, and I tend to prefer more, smaller disks: cheap replacements when they die, a cheaper initial purchase, and more spindles so I can use cheaper “everything” and still get decent performance. I usually wouldn’t consider a 2-bay NAS myself, mostly because I’d just do what you’re already doing and plug in a single large external disk. In the past (15 years ago now, though) the single disks lasted quite a long time, though I did buy internal disks and used my own enclosures. Recent Amazon reviews imply that’s still the better plan for that model today, because the prebuilt MyBooks etc. at 12TB or whatever are supposedly horrifically unreliable, but maybe you’ve had better luck.
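For a sense of what the “more, smaller disks” layout means in practice, the XigmaNAS GUI ends up doing roughly the equivalent of this; the pool name and disk names are placeholders:

```
# six-disk RAIDZ2 pool: any two disks can fail, roughly four disks' worth of usable space
zpool create tank raidz2 ada0 ada1 ada2 ada3 ada4 ada5
zpool status tank
```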
I guess I just don’t get how you share with a “large” group of people and stay private and secure? I mean, pastebin had a timeout anyway. And the paaster GitHub even says you have to run your own instance for security and privacy. If I’m running my own server, I presumably don’t need to encrypt my data from myself. And if I’m running my own server and the security is explicitly the link, it’s not actually secure, because the link itself grants access. But that’s to be expected; anything more and you get into needing to authenticate everyone, which is the exact opposite of easy or quick.
And for anything I’m concerned enough about to not share with the internet - I wouldn’t be posting it in a game lobby or forum anyway. Or I mean, if I trust the forum’s privacy, why not just… idk… post the text content to the forum?
My issue here isn’t that I don’t see the need for a pastebin sort of service; my issue is that, for the vast majority of the use cases you’ve listed and that I can imagine, you’re getting security theater, not actual security and privacy.
This just seems like a misunderstanding of the point of a pastebin, and of what tools are appropriate. At least to me, depending on the privacy demands, you’d use an existing cloud file-sharing service like Box for basic privacy, or, if your correspondent understood encryption and privacy, you’d use something like Signal to share either a compressed image or data, or the actual file.
Isn’t the point of a pastebin to be publicly accessible? Hence encryption seems irrelevant to me. I mean, I’ve only ever used them to share code or errors or logs on a forum or Stack Overflow or whatever. I have no reason to set up key exchange or password exchange with “everyone who might view the forum / Stack Overflow”. It’s effectively public anyway.
Are people using these pastebin services for something else?
Probably just so you don’t waste time unknowingly rereading a book.