Also relevant: Backblaze Storage Pod 6.0
https://www.backblaze.com/cloud-storage/resources/storage-po...
https://www.backblaze.com/blog/open-source-data-storage-serv...
As an aside, Backblaze has since switched to off-the-shelf Supermicro systems.
I dunno what the budget is, I've not had time to watch it all. However, getting a second-hand Dell MD3060 (they are rebadged from an OEM) for about £1k is also a good option.
It's 60 drives and mostly bulletproof. The downside is that you'll either need a SAS controller on your server, or find the vanishingly rare SATA controllers.
JBOD that bad boy into ZFS and you'll have something fast enough for most things (streaming).
How we used them was hardware RAID 7 in 4 groups of 13 with the rest as hot spares, then LVM RAID 0 on top and good to go (this was a time before production ZFS on Linux).
I'm not sure what the compatibility is with larger SATA drives given how old it is. I suspect you might be limited to JBOD.
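If it helps, here is a rough sketch of what "JBOD into ZFS" looks like in practice. The pool layout and device paths below are purely illustrative assumptions (not what we ran), just building the zpool command from Python:

    # Build a zpool create command for a 60-bay shelf as six 10-disk raidz2 vdevs.
    # Device paths and the 6x10 layout are illustrative assumptions, not a recommendation.
    import subprocess

    drives = [f"/dev/disk/by-id/scsi-drive{i:02d}" for i in range(60)]  # placeholder IDs

    vdevs = []
    for i in range(0, len(drives), 10):
        vdevs += ["raidz2"] + drives[i:i + 10]   # each raidz2 vdev survives 2 drive failures

    cmd = ["zpool", "create", "-o", "ashift=12", "tank"] + vdevs
    print(" ".join(cmd))                          # inspect the command first
    # subprocess.run(cmd, check=True)             # uncomment to actually create the pool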
I believe they're NetApp boxes, in fact.
I have a ZFS JBOD supporting a 40ish machine cluster, and it works really well, 99.99% of the time, which is good enough.
Mine is not very dense. 150TB/box.
Yeah I've deffo seen them on NetApps, but I'm not entirely convinced they are made by them. I saw some OEM versions of them "naked", as it were, but I can't remember what the company was called.
When you look at them, they really don't have the same style as NetApp did at the time.
(or i'm wrong and senile.)
I watched this last week; it is a wonderful project. It took so much work and engineering.
Any suggestions for UK alternatives to the US company suggested in the video for cutting and bending the steel sheets needed for the chassis? I reached out to a couple for quotes a few days back but haven't heard back yet.
Here you go dude: - https://fractory.com/ - https://lasercutsend.uk/ - https://www.andoverlaser.co.uk/
The second is the most similar to the company in the video, but the first is a much more established company. If none of these can do what you need, I suggest looking around the large manufacturing cities (Sheffield/Derby/Birmingham), where there are still lots of small bespoke workshops that service large companies like Forgemasters, Rolls-Royce, JLR etc.
What would a typical household/homelab use such storage for? ("Games & Stuff")
Data hoarders. I'm in a Plex group on FB and there are people there with libraries they could never personally watch all of. It sometimes seems like it's more a game of collecting all the things than it is about actually enjoying the collection.
You hear of media companies that delete old music and video from their own archives. People saving what they can may have the only copy left in existence.
Another part of it is the ability to play with enterprise hardware. That level of hardware has so many features, which is cool for the technically inclined but useless for a normal home user. When enthusiasm meets resources and the desire to acquire knowledge, this sometimes happens.
I have seen a couple of guys who acquired older-generation storage "racks" which they "play with" on the weekends. Do they have the cooling? No. Does it affect their electricity bill? Very much. But they want to learn that thing and want to play with it, which is understandable, as long as it's kept in check.
No different from audiophiles who lose their way, actually.
I was a wannabe data hoarder by accident, but I understood why I was doing it and decided to slim down drastically. I'm merging, deduplicating and deleting data step by step, because much of it is my own files from the days of yore, and I want to preserve some of them. To be frank, at this very moment I'm verifying that I've copied a bunch of files without corruption so I can start working on them (sha256deep is an underappreciated tool).
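For anyone curious, the verification step boils down to something like this plain-Python sketch (the two paths are placeholders; in practice I just run sha256deep -r on both trees and compare the output):

    # Hash every file under two trees and report copies whose digests differ.
    # /mnt/original and /mnt/copy are placeholder paths.
    import hashlib
    from pathlib import Path

    def tree_hashes(root: Path) -> dict:
        # Map each file's path (relative to root) to its SHA-256 digest.
        out = {}
        for p in sorted(root.rglob("*")):
            if p.is_file():
                h = hashlib.sha256()
                with p.open("rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                        h.update(chunk)
                out[str(p.relative_to(root))] = h.hexdigest()
        return out

    src = tree_hashes(Path("/mnt/original"))
    dst = tree_hashes(Path("/mnt/copy"))
    bad = [p for p in src if dst.get(p) != src[p]]
    print("corrupt or missing:", bad or "none")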
Some of the data hoarders give me weird looks when I say I'd rather have a single NUC with a couple of spinning drives for backing up what I care about than a cabinet full of RAID arrays, but I already have those at work. I don't want another server at home (not because I don't enjoy it, but because I want to have some time touching actual grass).
A couple of decades ago I came into possession of some late-model Compaq servers, some Fibre Channel equipment, and a stack of small FC disks. Thanks to my MSDN sub I then had the necessary bits to build a proper MS server cluster. Thanks to that home lab I built the experience necessary to land a very good job and eventually ended up as an MS Server Clustering SME for a giant tech company doing work for one of the major CC companies. A home lab can be great because you can just break stuff on purpose to see how things work and what system resiliency looks like.
Fwiw you don't _need_ to leave the enterprise stuff on 24/7, or have a huge HDD capacity (vs say $n enterprise drives of very limited capacity). It's still gonna be expensive, but not silly expensive (and the ROI when you get promoted probably makes it worth it).
> you don’t _need_ to leave the enterprise stuff on 24/7
If you are using enterprise SSDs then you need to be aware that the JEDEC standards [1] assume that enterprise SSDs operate 24/7.
Which is why, for example, the standards specify "power off data retention" of 3 months for enterprise SSDs vs 1 year for client SSDs.
And conversely, for reliability, the standards specify "active use" 24/7 for enterprise vs 8 hours/day for client SSDs.
Like many things in IT, the choice of client vs enterprise SSDs is a 'pick two' scenario.
[1]https://files.futurememorystorage.com/proceedings/2011/20110...
In the post I saw, where the guys got a single full rack and played with it on the weekends, running it for a day added a significant amount to their bill. So yes, newer systems are more efficient (generally due to compute efficiencies), but disks are disks; spindles are not much more efficient than before.
On the ROI part, this is a case-by-case issue. I for one can do the "play" part at work, too. Also, I don't want to spare the space for a 1U or 2U full-depth server at home. I'm not even adding disk boxes to this. I have neither the space, nor the desire.
Traditionally, hobbies cost money. I'm yet to hear anyone harangue folk about the ROI of their sourdough, blacksmithing or Storm Trooper cosplay hobbies. Perhaps this hobby is a little too close to sysadmin work for some, but I've yet to see single-user SaaS weekend projects catch any flak on HN; instead, they are celebrated in "Ask HN: What are you working on" threads.
The point is the satisfaction you get in return for the effort you put in, and perhaps kudos from like-minded folk when you execute particularly well.
Actually, I can reliably say that hobbies have some ROI regardless of the hobby, because you gain experience in what you do and in the subjects around it. On the other hand, if you do a hobby for its ROI, it's not a hobby anymore. It's just training. I prefer to have fun, not to train like a robot for some stats.
Recently I watched a couple of Venus Theory's [0] videos. In one of them he asks why you're doing the thing you're doing, questioning the intention behind creation. Is it self-satisfaction, or validation? I'm personally in the former camp. I used to share what I do just to put it out there, adding a couple of pointers to it. If anyone commented on it, great (hint: nobody ever did). Otherwise I don't care. Having no feedback doesn't stop me, because I do what I do, enjoy the process and just put it out there (now less so because of the AI crawlers, alas).
While I like working/playing with computers, I have other hobbies, too, and I find them equally rewarding, and I don't care about their costs.
I also do not belittle the people who buy racks of hardware for their home. If I were not at the point I am currently, I'd probably do it, too. I'm just lucky to have access to it already, not needing these screeching hot banshees at home. Trying to scale down into pragmatic minimalism is also both a result of, and a reaction to, swimming in cables and big equipment in a small space when I was a teenager.
So, I've had enough of these things at home, and I prefer to use them in their natural habitat. That's all.
Good for you.
This is correct. I personally aim to have the highest-quality versions of all movies, i.e. original Blu-ray. I have plenty of people who make use of it; it's a hobby.
> It sometimes seems like it's more a game of collecting all the things than it is about actually enjoying the collection.
Aren't all collecting hobbies like this? Stamps, music on vinyl, movie posters, retro computers, cars, etc. all have very little additional utility for size > n.
Either that or they share their library with others (or maybe a bit of both)
How much does it cost in terms of electricity per month?
If you run all the drives 24/7 I would guess you are looking at somewhere around 400W, assuming a power-sipping mini PC as the host. This can be heavily optimized if you intelligently spin down disks when idle, probably down to below 50W average.
Cost will depend on your electricity contract, but will probably not be a thing that would stop you if you want to do this.
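For a rough sense of scale (the tariff below is a made-up assumption; plug in your own):

    # Back-of-envelope monthly electricity cost for a constant average draw.
    def monthly_cost(avg_watts, price_per_kwh=0.30):   # price is an illustrative assumption
        kwh = avg_watts / 1000 * 24 * 30               # watts -> kWh over a 30-day month
        return kwh * price_per_kwh

    print(monthly_cost(400))   # ~86/month at 400 W average
    print(monthly_cost(50))    # ~11/month with idle disks spun down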
From "negligible" to "I have another house inside my house" levels, depending on your hardware.
A close friend of mine runs a single beefy server at home, which is currently ~35% of his monthly bill if I'm not making mental-math mistakes.
As far as serious hobbies go, 35% of a monthly electric bill is not breaking the bank. How much do people with other hobbies spend monthly on parts for their project cars, race entry fees, gym fees, "high performance" gear, track fees, etc.?
I didn't imply that it's a high amount. It's just an example I can refer to reliably.
I'm not a stranger to expensive hobbies. I have at least a couple of them. Photography and high end stationery being two of them.
Knowledge archive. Everything I need to know to practice my job and life, I have an offline copy of. Wikipedia, SO, mediawikis, devdocs, git repos of dependencies etc.
If Google decides to shove AI-generated results down our throats, that's the reasonable alternative.
Currently I am building zimdex, as an alternative to the zim tools.
Also if that's your thing, check out the kiwix.org project. It's really nice.
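If anyone wants to try it, serving one of their ZIM dumps locally is roughly this much work (the .zim file name below is a placeholder for whichever dump you download):

    # Serve an offline Wikipedia dump at http://localhost:8080 via kiwix-serve,
    # which ships with the Kiwix tools. The file name is a placeholder.
    import subprocess

    subprocess.run(
        ["kiwix-serve", "--port", "8080", "wikipedia_en_all_maxi.zim"],
        check=True,
    )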
Most of them should be pretty compressible though. How do you store them? Currently I'm running TrueNAS on a small NUC w/ 4 SSDs and working on adding a mirrored pool via an external enclosure, but I'll be doing some bug fixing, it seems.
Store the Sentinel-2 imagery for a year - about 500TB.
Now I just have to find a way to avoid the $50k egress cost from AWS.
I know a person who just "collects" games. They don't play them, they don't distribute them; it's just downloaded and (poorly) cataloged like a Pokemon collection, unironically trying to catch them all.
Z-library mirror maybe.
Memes
That's no meme - it's a fully operational crazy custom build. A remarkable video that doesn't end in "like and mash subscribe".
I think they meant "store memes on it".
That too! There seem to be a lot of YT videos that are memes on this theme as well though. He alludes to the fact that people make build videos that are essentially adding a couple of pre-existing more or less built components together. This is another level.
Amazed by how many things he built in the process of making his NAS.
High quality vid!
Chia farming.
"Linux ISOs"
The funny thing is, I keep a set of historical Linux ISOs to be able to work with older servers in my fleet.
Needing Debian 8, for example, because that Lights-Out connection requires JVM-something for the system's Java Web Start based console.
Moreover, funnily, some newer servers get wonkier with more modern ipmitool and browser versions when connecting remotely. Intricacies of older embedded systems.
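"Connecting remotely" here means roughly this kind of thing, attaching a serial-over-LAN console via the BMC instead of the Java applet (host and credentials below are placeholders):

    # Attach a Serial-over-LAN text console through the BMC with ipmitool.
    # Host, user and password are placeholders.
    import subprocess

    subprocess.run(
        ["ipmitool", "-I", "lanplus",
         "-H", "10.0.0.42", "-U", "ADMIN", "-P", "changeme",
         "sol", "activate"],
        check=True,
    )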
I do have an archive of Linux ISOs, but it is not anywhere near petabyte-sized. Well, I'm not trying to be comprehensive; every one I download (for many years now) gets archived, and I'm not sure it has reached a terabyte yet.
Full captures of VMs, snapshotting & ISOs eat up lots of storage.
Previously I ran an MP3 scraper recording 30 stations simultaneously.
Good exposure to more music
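The core of a scraper like that is tiny; a toy single-station version looks roughly like this (stream URL and file name are placeholders), and 30 stations is just 30 of these running in parallel:

    # Record one internet radio stream to disk for a fixed duration.
    # The stream URL and output file name are placeholders.
    import time
    import requests

    def record(stream_url, out_path, seconds=3600):
        deadline = time.time() + seconds
        with requests.get(stream_url, stream=True, timeout=10) as r, open(out_path, "wb") as f:
            for chunk in r.iter_content(chunk_size=8192):   # raw MP3 frames from the stream
                f.write(chunk)
                if time.time() > deadline:
                    break

    record("http://example.com/station.mp3", "station.mp3")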
[flagged]
Note that here DIY means designing the chassis, the drive loader mechanism, PCBs for the backplane, power distribution (including crimping dozens of power cables), and modding PCIe cards to fit.
It's not a boring "I bought a Chia mining server and inserted lots of hard drives" build.
You should watch the video before commenting; in the first 10 seconds it’s explained that everything is built from scratch, including chassis.
I would if I could, but like I said, sometimes watching a video isn't an option. The move in some areas to publishing everything as a video instead of writing a proper article is problematic for multiple reasons. I am glad that most things on HN are text and much more discoverable and consumable in all situations.
Everything?
Well not everything of course, but at least the chassis and PCB boards.
I don’t think the most upvoted comment on this thread should be dismissing this as “not impressive”, not even having watched the video.
nit: the last letter of PCB stands for "board"
I’ll just leave this here: https://news.ycombinator.com/item?id=46512881