MakeMKV can rip the DVDs without touching the contents. I’d suggest either an ISO or, more helpfully, the contents in folder layout, preserved under a top-level folder named after the disc with the .vob files at the bottom level.
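For what it’s worth, a quick sanity check like this (Python, with a made-up rip path) will confirm a folder-mode backup has the expected DVD-Video structure, i.e. a VIDEO_TS directory with the .vob files and their .ifo/.bup metadata inside:

```python
from pathlib import Path

# Hypothetical location of the rip -- the top-level folder named after the disc.
rip = Path("/rips/MOVIE_NAME_2001")

# A folder-mode DVD backup keeps everything under VIDEO_TS; the .VOB files
# hold the actual video, the .IFO/.BUP files are the menus/navigation data.
video_ts = rip / "VIDEO_TS"
vobs = sorted(video_ts.glob("*.VOB")) if video_ts.is_dir() else []

if vobs:
    total_gb = sum(f.stat().st_size for f in vobs) / 1e9
    print(f"Looks like a full rip: {len(vobs)} VOB files, {total_gb:.1f} GB")
else:
    print("No VIDEO_TS/*.VOB found -- this doesn't look like a folder backup")
```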
You certainly can use Handbrake, but it re-encodes, and if you have no experience it’s easy to mess up (among other things, de-interlacing doesn’t always work right without tweaking), so for archiving it’s typically best not to re-encode DVDs before sharing.
-If- you do choose to use Handbrake (again, I wouldn’t recommend it for archiving; it takes skill, and there’s a reason full DVD rips are still useful today to people who want a copy, while someone’s best-attempt AVI made 15 years ago looks awful and is considered useless given the low bitrates and old codecs), I’d plead with you to use software encoding, not hardware: choose x264 or x265 (10-bit for x265), use the slow preset at constant quality (CRF) 16, make sure de-interlacing is set right on auto, and pass through the audio in its original Dolby Digital as well as the VobSub subtitles. But it really is best not to re-encode and just copy.
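If you go that route anyway, here’s roughly what those settings look like as a HandBrakeCLI call driven from Python. This is a sketch, not gospel: it assumes HandBrakeCLI is installed and on your PATH, the paths and title number are placeholders, and you should double-check the flag names against `HandBrakeCLI --help` for your build.

```python
import subprocess

# Hypothetical input/output paths -- point these at your rip and destination.
SOURCE = "/rips/MOVIE_NAME_2001/VIDEO_TS"
OUTPUT = "/encodes/Movie Name (2001).mkv"

cmd = [
    "HandBrakeCLI",
    "-i", SOURCE,
    "-o", OUTPUT,
    "-t", "1",                            # main title; scan the disc first to confirm
    "-e", "x265_10bit",                   # or "x264" if you prefer 8-bit H.264
    "--encoder-preset", "slow",
    "-q", "16",                           # constant quality (CRF) 16
    "--comb-detect", "--decomb",          # only de-interlace frames that need it
    "--all-audio", "--aencoder", "copy",  # pass through the original Dolby Digital
    "--all-subtitles",                    # keep the VobSub subtitle tracks
]

subprocess.run(cmd, check=True)
```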
You can share directly to the DHT swarm by just creating your own torrent, and eventually people will find it, assuming it’s named correctly in the format <Movie Name (Year)>.
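If you want to script the torrent creation, here’s a minimal sketch using the third-party torf package (pip install torf); the attribute names are from memory, so treat it as a starting point and check torf’s docs rather than taking it as the one true way. The paths are placeholders.

```python
from torf import Torrent

# Hypothetical path -- the folder holding the untouched rip, named <Movie Name (Year)>.
t = Torrent(path="/rips/Movie Name (2001)")

t.private = False          # must stay non-private so DHT/PEX can spread it
t.comment = "Untouched DVD rip"
t.generate()               # hashes every piece; takes a while for a full DVD
t.write("Movie Name (2001).torrent")
```

After that, load the .torrent into your client pointed at the same folder and keep seeding; the DHT only helps people find peers, someone still has to be online with the data.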
Don’t duplicate other people’s work if you can help it. There are various sites for sharing this type of stuff; I don’t want to get in trouble so I won’t name it, but there is one listed in the piracy community megathread, a semi-private Russian one. I would search disc titles first to make sure what you’re doing doesn’t already exist, and focus on archiving and sharing original, non-re-encoded copies of discs that don’t presently exist elsewhere.
If you’re just backing up and not serving this data, just get 2-3 4TB drives (new, recertified, or used) and an external dock. Test each drive, back up to it, then test again, checking SMART both times. Place one drive with a relative or trusted friend. Connect and power up each of the drives at least once annually: refresh the data with anything new, check the SMART stats, and consider running at least a quick SMART test to make sure none are mechanically failing, then put them back to being unplugged. Really, every 3-6 months would be ideal for powering on and checking SMART, but I wouldn’t pester a relative that often for the off-site one; 1-2 times a year should be fine for that.
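Here’s a minimal sketch of that periodic check on a Linux box, assuming smartmontools is installed and using Python just to drive it; /dev/sdX is a placeholder for whichever drive you’ve plugged in.

```python
import subprocess

# Hypothetical device node -- substitute the drive you just connected.
DEVICE = "/dev/sdX"

def smartctl(*args):
    """Run smartctl (from smartmontools) against the drive and return its output."""
    result = subprocess.run(["smartctl", *args, DEVICE],
                            capture_output=True, text=True)
    return result.stdout

# Overall health verdict plus the full SMART attribute table
print(smartctl("-H"))
print(smartctl("-A"))

# Kick off a quick self-test; the result shows up in the self-test log later
subprocess.run(["smartctl", "-t", "short", DEVICE], check=True)
# ...wait a couple of minutes, then:
print(smartctl("-l", "selftest"))
```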
This cold-storage strategy also protects you from cryptolocker malware, since none of the copies are left live and accessible.
Cheapest or most flexible, choose one. If you want absolute cheapest but not that flexible, you can buy a used office PC; a ThinkCentre or Dell OptiPlex are the most reliable ones, though depending on the model they may accommodate anywhere from 1 to, if you’re lucky, 4 drives (commonly only 2) via that many SATA ports (often half the SATA ports are 1.0/2.0 meant for DVD drives, so you may not get full speed). Finding space inside for more than 1 drive can also be a problem depending on form factor, but mid-tower models often have room for 2, with space for a third lying on the case itself if you really want to push it.
Most flexible, I suppose, would be someone else’s old NAS build: a used case with room for at least four 3.5" drives gives you a little room to expand.
You don’t need RAID; it’s not a backup solution, RAID is for high data availability and integrity. If you really want to, you can set up RAID 1, I suppose, though know that with your data spanning two drives, mirroring means you’d need at minimum 4 disks for your data plus one live copy, and 6 disks for two copies.
As for sourcing the drives, there are various companies; Server Part Deals is one that’s well known and decent, though the sizes they presently stock may be larger than what you’re after. No matter whether the drive is brand new, recertified, or bought used on eBay, the recommendation is test, test, test; even new drives can be bad. Run a full SMART test at least once, check the SMART data, and make sure there are no failure indicators. If you want to be really thorough: check the SMART data when you get the drive and note anything concerning, run an extended/full SMART test, then format the drive with quick format unchecked (the slower format option that writes zeros across the whole drive), fill it with your data, run another extended/full SMART test, and check the SMART values again before putting it away. Re-test and check SMART at least annually if you’re keeping the drives cold.
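Here’s a rough sketch of that routine, again Linux + smartmontools + Python, with the device node as a placeholder; the surface write itself (the non-quick format / zero fill) is deliberately left out of the script so you can’t nuke the wrong drive by accident.

```python
import subprocess, time

# Hypothetical device node -- triple-check this before touching anything.
DEVICE = "/dev/sdX"

def smart(*args):
    """Return smartctl's text output for the drive."""
    return subprocess.run(["smartctl", *args, DEVICE],
                          capture_output=True, text=True).stdout

# 1. Baseline: note reallocated/pending sectors, CRC errors, power-on hours, etc.
print(smart("-A"))

# 2. Extended (long) self-test; "smartctl -c" reports the expected duration.
subprocess.run(["smartctl", "-t", "long", DEVICE], check=True)

# 3. Crude poll of the self-test log until the test is no longer in progress.
while "in progress" in smart("-l", "selftest").lower():
    time.sleep(600)
print(smart("-l", "selftest"))

# 4. Do the full-surface write outside this script (non-quick format on Windows,
#    or a zero fill on Linux), copy your data on, then repeat steps 2-3 and
#    compare the attribute table against the baseline from step 1.
```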
At least two copies, ideally three, with at least one copy off-site for things such as fire. If you don’t have a relative, friend, or workplace where you can stash an off-site copy, your option is basically cloud backup, which for 4TB wouldn’t be too costly. Backblaze Personal would allow this much IF you keep one copy connected to a computer that has their app and is turned on at least monthly, and they’re about $100 a year; note, though, that they will delete your data if you go more than 30 rolling days without syncing, so if there is a disaster you have a limited time to either get another drive and download everything again, or contact them and pay to have a copy shipped to you before it’s deleted.
You could also, I suppose, invest in a fireproof safe, though that doesn’t protect against a burglary where someone steals the safe thinking it has valuables in it. You really need a copy off-site. Another option is a bank safe deposit box, though that’s probably more costly.
One way to get friends to help is to buy more storage than you need, say two 8TB drives, and offer to back up a copy of their stuff at your house, so you have a copy of their stuff plus yours at your place and they have the same copy at theirs. You could also use separate drives.
All re-encoding, unless it’s from lossless to lossless, introduces degradation. For archival purposes I’d suggest against re-encoding unless it’s to another lossless format, or unless the sources are lossless or very high bitrate (>20 Mbps video for SD or 1080p HD) and you’re keeping a high bitrate in the new encode. Also avoid hardware encoding; it’s faster but introduces more degradation and is less precise than software encoding. Removing duplicates is another matter.
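If you’re unsure whether something even counts as high bitrate, a quick check with ffprobe (part of FFmpeg) settles it; this sketch uses the overall container bitrate as a rough proxy since per-stream figures aren’t always populated, and the file path is a placeholder.

```python
import subprocess

# Hypothetical file -- anything ffprobe can read.
VIDEO = "/encodes/example.mkv"

out = subprocess.run(
    ["ffprobe", "-v", "error",
     "-show_entries", "format=bit_rate",
     "-of", "default=noprint_wrappers=1:nokey=1",
     VIDEO],
    capture_output=True, text=True, check=True)

raw = out.stdout.strip()
if not raw.isdigit():
    print(f"{VIDEO}: container doesn't report an overall bitrate")
else:
    mbps = int(raw) / 1_000_000
    print(f"{VIDEO}: {mbps:.1f} Mbps")
    print("high-bitrate source" if mbps > 20
          else "already heavily compressed -- leave it alone")
```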