Back in the day, I did it (semi-)manually: I prepared a few directories of symlinks to feed into mkisofs/mkhybrid, balanced them by size and grouped them by relevance, then ran wodim/cdrecord in a shell loop.
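The size-balancing step can be sketched as a simple first-fit-decreasing pass. This is a hypothetical illustration, not the exact script I used; the 25 GB single-layer BD-R capacity and the function name are assumptions:

```python
# Sketch: pack files into disc-sized groups, largest first (first-fit decreasing).
# Each resulting group would become one directory of symlinks fed to mkisofs.
BD_R_BYTES = 25_000_000_000  # roughly one single-layer BD-R (assumption)

def pack_into_discs(files, capacity=BD_R_BYTES):
    """files: list of (path, size_in_bytes) tuples -> list of lists of paths."""
    discs = []  # each entry: [remaining_bytes, [paths]]
    for path, size in sorted(files, key=lambda f: f[1], reverse=True):
        for disc in discs:
            if disc[0] >= size:      # file fits on an existing disc
                disc[0] -= size
                disc[1].append(path)
                break
        else:                        # no disc had room: start a new one
            discs.append([capacity - size, [path]])
    return [paths for _, paths in discs]
```

Grouping "by relevance" is the part that resists automation; in practice you'd run something like this per topic-directory rather than over the whole tree.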
I also tried using `dump` and storing the output on an RW disc… but I couldn't `restore` it later, so I can't recommend that route.
Anyway, optical discs are pretty expensive nowadays (maybe only in my area, I don't know), and I presume they're no more reliable than they used to be, so the whole idea seems impractical to me.
The use case here is that I back up my server onto an 8TB external hard disk. Current overall size is ~5TB of backups.
Blu-Rays would be for either cold archival of content that's "important but not really very" or for an off-site version of my main backup.
Bare internal hard disks are the cheapest per-gigabyte storage on the market right now, but the comparison isn't really to those so much as to LTO ($$$$) or RDX ($$$, but poorly suited to archival).
I found out that Nero does it, although Nero also installs a very large amount of crapware on the machine while doing so.
The other thing worth noting is that I *could* use something like WinRAR/7z to split the backup into arbitrarily sized volumes, but I'd then still be responsible for making ISOs or manually burning the discs.
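The splitting half of that is mechanical, at least. A minimal sketch of what 7z's volume switch automates, splitting one large archive into fixed-size parts (the function name and part-naming scheme are my own, for illustration):

```python
# Sketch: split one file into fixed-size parts, like 7z's -v switch does.
# A real implementation would read in smaller buffers instead of one
# whole part at a time; this keeps the sketch short.
def split_file(src, part_bytes, dst_template="{src}.{n:03d}"):
    """Write src out as numbered parts; return the list of part filenames."""
    parts = []
    with open(src, "rb") as f:
        n = 1
        while chunk := f.read(part_bytes):
            name = dst_template.format(src=src, n=n)
            with open(name, "wb") as out:
                out.write(chunk)
            parts.append(name)
            n += 1
    return parts
```

The missing piece is still exactly what the comment says: turning each part into an ISO and getting it onto a disc.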
Part of the idea was to combine it with something like an Acronova Nimbie, at which point it's most of the way to being a "$$$" solution, but the incremental cost is still lower than, say, RDX, where a 4TB cartridge is $500. (5TB of BD-RE DLs is around $150 on Newegg in the USA, just looking.)