
NAS for Dummies


Comments

  • Registered Users Posts: 36,164 ✭✭✭✭ED E


    Must actually put mine on a meter now it's fully populated.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    OK, trying to begin a preliminary build list here & am running into trouble already. I'll need (at bare minimum) x3 SATA ports for this build (x2 data drives and a parity drive). This is bare minimum.

    I'd like to also use a cache drive for Unraid, preferably an M.2 drive for speed. This is where I'm running into trouble: the boards I've looked at seem to suggest that using an M.2 drive will disable 1 or 2 of the SATA ports on the board, presumably because the M.2 slot is using SATA resources? If using NVMe, it's sometimes the same, disabling x2 SATA ports.

    So I'm head scratching at the mo. Has anyone any ideas of an AM4 board that'd allow me to use x3 SATA drives, an M.2 drive as a cache drive, and still leave me say x2 SATA ports for the future?? I'd look at server hardware, but I'm trying to keep costs down, and I know zilch about server hardware anyway :(

    Edit - how does Unraid handle PCIe expansion cards for SATA drives??


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Example - ASRock B450M Pro4

    https://www.amazon.co.uk/gp/product/B07FVYKFXF/ref=ox_sc_act_title_1?smid=A3P5ROKL5A1OLE&psc=1

    The specs say:
    4 x SATA3 6.0 Gb/s Connectors, support RAID (RAID 0, RAID 1 and RAID 10), NCQ, AHCI and Hot Plug. M2_2 and SATA3_3 share lanes; if either one of them is in use, the other one will be disabled.

    1 x Ultra M.2 Socket (M2_1), supports M Key type 2242/2260/2280 M.2 PCI Express module up to Gen3 x4 (32 Gb/s) (with Summit Ridge, Raven Ridge and Pinnacle Ridge)

    1 x M.2 Socket (M2_2), supports M Key type 2230/2242/2260/2280 M.2 SATA3 6.0 Gb/s module

    ^^ So if I understand the above correctly, the motherboard has x4 SATA ports and x2 M.2 slots. M.2 slot 1 uses PCIe lanes, and M.2 slot 2 uses SATA lanes. If you use M.2 slot 1, you'll have all four SATA ports available, and if you use M.2 slot 2, then SATA port #3 will be disabled....correct?


  • Registered Users Posts: 13,280 ✭✭✭✭kowloon


    Inviere wrote: »
    ^^ So if I understand the above correctly, the motherboard has x4 SATA ports and x2 M.2 slots. M.2 slot 1 uses PCIe lanes, and M.2 slot 2 uses SATA lanes. If you use M.2 slot 1, you'll have all four SATA ports available, and if you use M.2 slot 2, then SATA port #3 will be disabled....correct?

    Yeah, looks like one is SATA and the other is for NVMe. Some NAS enclosures let you use an SSD as a cache drive; that might be an option for you if you were to fill that extra M.2 slot.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    K found a more suitable board - https://www.amazon.co.uk/gp/product/B06X9F3FKP/ref=ox_sc_act_title_1?smid=A2S99BUYURIQ27&psc=1

    It's an X370 though, so kind of mismatched with the Ryzen 3 1200 CPU, but it's the only iteration of AM4 boards that seems to have enough SATA for my needs.


  • Registered Users Posts: 7,179 ✭✭✭Serephucus


    Something worth mentioning:
    You can always add extra SATA ports very easily. Something like an LSI HBA will give you 8x SATA ports (and that one even comes with the cables). You'll often see the 9211 being recommended on forums, but I find it a pain in the ass: you have to go flashing firmware on it, the utilities usually don't play nice with UEFI, and it's generally annoying. The 9207 is a faster card, and is just plug-and-go.

    (I'm using two of these in my server, for what it's worth)

    There are SATA cards that are much cheaper, but typically they're very low performance if you're hitting all the drives at once (as you would do during a parity check/rebuild).

    Also, if you're just transferring over the network, a SATA SSD will be completely fine. You're only going to be transferring at ~120MB/s, and a SATA SSD will do 4x that with no problems. The only reason you'd really use NVMe would be for VM vdisks or something. Though you mentioned an SQL DB, so maybe it would be useful there, I don't know.


  • Registered Users Posts: 7,796 ✭✭✭Calibos


    That LSI HBA - have you bought them from that eBay vendor in your link, Serephucus? Should I click 'Buy Now'? Exactly what I need. Flashing firmware always scared me. Any experience with SATA couplers though? I'd be using semi hot-swap bays with integrated SATA data and power cables, so I need to connect the SAS/SATA cables to those.

    I was actually looking at the Syba 8-port, but then read that the Marvell chipset on them can be flaky or interfere with mobo SATA chipsets.

    Yeah, still haven't built the media server I started planning 5 years ago and started buying parts for back in 2014 (case and hot-swap bays); life, health and finances got in the way. Thankfully my little D-Link 323 NASs, which I must have bought in 2007/2008 and were already ancient then, went on working 24/7 for another 5 years after 2014. LOL. I am absolutely playing with fire now though!! I have to get this done soon and shunt all the content that was archived off the NASs onto family PCs back into one place accessible 24/7, along with the stuff still on the NASs.

    While I did get hold of a cast-off 4670K and mITX mobo from a family member after they upgraded, I'm now at the point where I'm thinking about upgrading my main VR/gaming rig's 6700K and ATX mobo to a 9700K, using the former in the media server build, and then, when I eventually upgrade my GTX 1080 GPU, moving that over to the server too and turning the media server into a Steam In-Home Streaming game server as well. That's why I'll be sticking with Windows on the media server, and will probably go with Stablebit DrivePool and SnapRAID. The cast-off 4670K mITX I'll now turn into a CCTV NVR.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Serephucus wrote: »
    You can always add extra SATA ports very easily. Something like an LSI HBA will give you 8x SATA ports (and that one even comes with the cables).

    Also, if you're just transferring over the network, a SATA SSD will be completely fine. You're only going to be transferring at ~120MB/s, and a SATA SSD will do 4x that with no problems. The only reason you'd really use NVMe would be for VM vdisks or something. Though you mentioned an SQL DB, so maybe it would be useful there, I don't know.

    Serephucus you're an absolute gent, I owe you a pint sometime for all the help with this. Again, thank you for the effort & guidance!

    This build has been doing laps around my brain for the whole weekend :o Here's where I'm at currently, and below, why I've chosen these parts...

    Motherboard - Asus PRIME X370-PRO
    CPU - AMD Ryzen 7 1700
    CPU Cooler - Cooler Master Hyper 212 EVO
    RAM - Corsair Vengeance LPX 16 GB (2 x 8 GB) DDR4 2400MHz
    PSU - EVGA SuperNOVA 550 G3 80 Plus Gold
    Case - Phanteks Enthoo Pro
    Cache Drives - WD Blue SSD's x2
    Data Drives - WD 8TB Red's x3 (x2 Data, x1 Parity)

    Ok, so it's a far cry from the Synology DS918+ I started out with here in this thread. I've gone with the motherboard because it offers x8 on-board SATA ports. This will offer me medium-term expandability without having to buy any additional cards. It's a desktop component, and ECC RAM support seems to be very contradictory. I don't think I can afford to take a chance buying ECC RAM for a board/BIOS version that may/may not support it, so I feel that with the above build I have to use non-ECC RAM.

    The CPU is a far cry from the R3 1200 I had in mind, but for the extra price, I feel the jump to the 1700 is worth it. I'll obviously need it to run media server duties, but I'm also looking at running a MySQL docker container, AND possibly a Pi-hole one separately. Who knows what else I might need it to do down the line, so going from a 4/4 to an 8/16 now, at the build/planning stage, seems to save the headache down the road.

    The PSU is probably overkill, but I want something gold rated, and something proven that won't give trouble. I've used G2's in the past and found them very nice PSU's, so feel the G3 here should more than comfortably provide the power draw for the build, while not restricting me to future additions.

    The cache drives, I think Unraid basically raid-1's these, so x2 500GB drives yields 500GB storage. I'd happily go with 250GB variants of these, but these were the same price on Amazon so listed them here. I'll obviously do proper price checking when I've settled on the components & am ready to buy, so these are subject to change to 250GB models later on.

    The RAM is 2400, but I'm not at all fussed about fast RAM for a NAS; I don't think there's any need? It's desktop RAM, and not using ECC RAM doesn't sit entirely well with me, but I'm limited by the mobo. If I go with a board that supports ECC RAM, I seem to be sacrificing SATA ports, so that means an additional SAS card, which is more cost. Do I 100% need ECC RAM? I know Unraid lives in RAM when booted, and ECC RAM can save some serious headaches, but are those cases rare? Would appreciate thoughts on this point (my biggest concern is running this 'server' 24/7 - are desktop components up to that??)

    CPU cooler - the Hyper 212 EVO seems to review really highly. Seems perfect for the use case?

    Case - The Phanteks Enthoo Pro seems to fit the bill very well: excellent cooling options, plenty of storage for drives, excellent for cable management & thus airflow...and again, it reviews really well.

    So guys & gals, thoughts on the above build candidate? Is ECC ram a must? Should I consider server grade hardware (is anyone willing to hold my hand through that, & is it considerably more expensive)?


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Just realised I'll need a cheap as chips GPU too for bios configuration etc, but I'll pick something very very basic up for that.


  • Registered Users Posts: 36,164 ✭✭✭✭ED E


    What, no IPMI? Serephs is in love with it and he's right.

    1. Install power and LAN
    2. Never need to touch it again.

    No VGA, no power buttons, nada.


  • Registered Users Posts: 7,179 ✭✭✭Serephucus


    Build looks good to me. There's 3200MHz RAM you can get for £2 more though, which I'd go for. The cores on the CPU talk to each other through Infinity Fabric, and the speed of this is coupled to the speed of the RAM.

    No IPMI I noticed. :P That one's up to you though, and it would drive the cost up quite a bit to jump to a server mobo. To be honest, they're not really any different than desktop boards for your use-case. They're just a lot less flashy, and go through a lot more testing and validation.

    For GPU, yeah, you'll need something for initial config. I use some piece of **** I got on Adverts 10 years ago for that. Quick Google shows me this: https://ie.webuy.com/product-detail?id=sgragef2101gb&categoryName=graphics-cards-pci-e&superCatName=computing&title=nvidia-geforce-210-1gb-dx10.1, though if I'm being picky, I'd probably look for a passive one. No need to have some little fan in there making extra noise if you don't need it.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    ED E wrote: »
    What, no IPMI? Serephs is in love with it and he's right.

    1. Install power and LAN
    2. Never need to touch it again.

    No VGA, no power buttons, nada.

    My plan is, rather than have the server running 24/7 (hugely wasteful given how much I'd use it), I'll have the BIOS wake the machine up every day at a given time, I'll set Unraid to start the array automatically, and then I'll set Unraid to stop the array and shutdown at a given time. So once everything is set up and configured, I shouldn't need remote management tools like IPMI (as nice as it is!).
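That schedule is straightforward to sketch (the times here are arbitrary choices, and this assumes cron or the User Scripts plugin is available; the wake-up itself is a BIOS feature, not software):

```
# Power-on: set in the BIOS, e.g. "Power On By RTC Alarm" at 16:00 daily.
# Unraid's array auto-start setting then brings the array up on boot.
#
# Power-off: a root crontab entry (or the User Scripts plugin). A normal
# shutdown stops the array cleanly first. Example line, 00:30 nightly:
30 0 * * * /sbin/shutdown -h now
```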
    Serephucus wrote: »
    Build looks good to me. There's 3200MHz RAM you can get for £2 more though, which I'd go for. The cores on the CPU talk to each other through Infinity Fabric, and the speed of this is coupled to the speed of the RAM.

    No IPMI I noticed. :P That one's up to you though, and it would drive the cost up quite a bit to jump to a server mobo. To be honest, they're not really any different than desktop boards for your use-case. They're just a lot less flashy, and go through a lot more testing and validation.

    For GPU, yeah, you'll need something for initial config. I use some piece of **** I got on Adverts 10 years ago for that. Quick Google shows me this: https://ie.webuy.com/product-detail?id=sgragef2101gb&categoryName=graphics-cards-pci-e&superCatName=computing&title=nvidia-geforce-210-1gb-dx10.1, though if I'm being picky, I'd probably look for a passive one. No need to have some little fan in there making extra noise if you don't need it.

    Good spot on the RAM, I've changed it there in my Amazon basket. No IPMI, mainly for cost reasons - I'll prob only have this server running ~8 hours a day, fully automated, and accessible through the Unraid web interface. Any BIOS changes and I'll just connect a screen to it temporarily. I have a passive GPU in the basket (Asus GT 710, passively cooled). Ideally, the mobo will POST without a GPU in it, in which case I'll leave it removed once I'm finished with it.

    I've also dropped from a 1700 to a 2600; Passmark scores are very, very close anyway. The reason is I'm still reading about people with gen 1 Ryzens having freezing/locking-up issues in Unraid, with the countermeasures seemingly not working for everyone. Knowing me, that'd include me, so I've gone to a 2600. The issue is apparently resolved for the 2000 series, and I'd still be running with 6/12, so it should hopefully be plenty.

    DYING to get started on this, and in the meantime I'm looking more & more into Unraid, Docker, and MySQL for Kodi. Thank you all so much again!


  • Registered Users Posts: 7,179 ✭✭✭Serephucus


    I could be mistaken, but I'm pretty sure the stuff with Ryzen was BIOS/chipset related, and was resolved with Linux kernel patches (unRAID releases), and AGESA updates. Anyway, I'd probably go with the 2600 regardless. Didn't realise they were quite that close in overall perf., and the clockspeed bump would be nice.

    To mitigate not having IPMI somewhat, you can also set unRAID to be its own syslog server, so in the event of an unexpected crash at least you don't lose the log files.
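In case it helps picture it: Unraid exposes this as a toggle in the web UI (Settings, Syslog Server), and conceptually it's just classic syslog forwarding pointed back at itself. An illustrative rsyslog-style rule, with a placeholder IP:

```
# Forward all log messages to a syslog server -- here, the NAS itself.
# "@@" means TCP, a single "@" means UDP; 514 is the standard syslog port.
*.*    @@192.168.1.50:514
```

The point is that the log lines land on persistent storage instead of only in RAM, so they survive a hard crash.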

    To give you even more to look at, here's what's currently on my server Docker-wise:

    [screenshot of my current Docker containers]


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Serephucus wrote: »
    I could be mistaken, but I'm pretty sure the stuff with Ryzen was BIOS/chipset related, and was resolved with Linux kernel patches (unRAID releases), and AGESA updates.

    Aye, there were several BIOS updates. One fix involves disabling C-states, another involves adding a line to an Unraid config file, another involves changing a power setting in the BIOS from "auto" to "normal" or somesuch. Works for some; others report no change and their servers still locking up. I think I'll just save the headache at this stage; if I ever need more cores I'll just pony up for a 2700 (it might even drop in price when Zen 3 lands).
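For anyone reading along later: the "line added to an Unraid config file" was typically a C6-disable call at boot. A sketch only; the zenstates script is a community tool you have to put on the flash drive yourself, so the path below is an assumption, not stock Unraid:

```
#!/bin/bash
# /boot/config/go -- Unraid executes this script at every boot.
# Community workaround for first-gen Ryzen hard-locks: disable the C6
# idle state before starting the web UI. "zenstates" is a third-party
# script added by the user, not part of Unraid itself.
/usr/local/sbin/zenstates --c6-disable

# Stock line: start the Unraid management utility.
/usr/local/sbin/emhttp &
```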
    To mitigate not having IPMI somewhat, you can also set unRAID to be its own syslog server, so in the event of an unexpected crash at least you don't lose the log files.

    Excellent to know! :cool:
    To give you even more to look at, here's what's currently on my server Docker-wise:

    I'm not even going to pretend I know what half of that stuff is! :o I'll get there tho!

    Can I ask you one final question, just going back to transcoding? Without ever having sat in front of an Unraid interface, how is transcoding managed/configured? Ideally, I don't want to transcode anything; I just want to serve the raw media files to x3 clients, all of whom can natively handle the files/codecs anyway...


  • Registered Users Posts: 7,179 ✭✭✭Serephucus


    Inviere wrote: »
    I'm not even going to pretend I know what half of that stuff is! :o I'll get there tho!

    Didn't expect you to, but it gives you some stuff to start Googling when you ask the inevitable: "What else can I do with this?". ;)
    Inviere wrote: »
    Can I ask you one final question, just going back to transcoding? Without ever having sat in front of an Unraid interface, how is transcoding managed/configured? Ideally, I don't want to transcode anything; I just want to serve the raw media files to x3 clients, all of whom can natively handle the files/codecs anyway...

    So unRAID on its own has nothing to do with the transcoding. This will depend entirely on what you use to host and play your media. What are your clients?

    If you know all of your clients can play the media fine, then you can just export the media as a normal share over SMB and browse the folders from your client's file explorer.
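For a rough idea of what that export boils down to (Unraid generates this from the web UI; the share name and path here are made up):

```
# Samba share definition, smb.conf style -- illustrative only. In Unraid
# you would simply set "Export: Yes" on the share in the web UI.
[media]
    path = /mnt/user/media
    browseable = yes
    read only = yes
    guest ok = yes
```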

    Otherwise, you've got a bunch of options. The two main ones are Plex and Emby. Some people prefer Emby because it's open-source, though I think that's been changing in recent times, because a bunch of ex-Emby people forked it to create Jellyfin.

    I've always just used Plex myself. They've revamped their client and server apps a couple of times since I've used it, and personally I feel like they've gotten worse, so I might look into changing in the future, but for the moment it's joined the longer-every-day list of "I'll get around to it".


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Serephucus wrote: »
    So unRAID on its own has nothing to do with the transcoding. This will depend entirely on what you use to host and play your media. What are your clients?

    If you know all of your clients can play the media fine, then you can just export the media as a normal share over SMB and browse the folders from your client's file explorer.

    Ahah, I see. The clients are x2 Nvidia Shields running Kodi, and potentially an Xbox One running Kodi too (the Xbox One isn't high on my priorities, so basically two Shields). As I understand it, I'll share the media as NFS shares, and Kodi will play nicely with NFS (maybe SMB will work the same, I'm not 100% sure just yet). I'll tie in a MySQL setup on Unraid, and configure both Kodi installs to look up the MySQL database (done via advancedsettings.xml in Kodi).

    The Shields play anything I have with zero trouble, so I don’t want to transcode really and lose quality. I know I’d have to if using tablets/phones to stream media, but I 100% won’t be. Two shields, maybe an XBO, and that’s it.
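For reference, the shared-database setup mentioned above is a small advancedsettings.xml dropped into each Kodi client's userdata folder. A minimal sketch; the IP and credentials are placeholders for whatever the MySQL container is actually configured with, and Kodi creates the databases itself on first library scan:

```shell
# Write a minimal advancedsettings.xml for a shared Kodi library.
# Host, user and password below are placeholders, not real defaults.
cat > advancedsettings.xml <<'EOF'
<advancedsettings>
  <videodatabase>
    <type>mysql</type>
    <host>192.168.1.50</host>
    <port>3306</port>
    <user>kodi</user>
    <pass>kodi</pass>
  </videodatabase>
  <musicdatabase>
    <type>mysql</type>
    <host>192.168.1.50</host>
    <port>3306</port>
    <user>kodi</user>
    <pass>kodi</pass>
  </musicdatabase>
</advancedsettings>
EOF
```

With the same file on every client, watched status and resume points stay in sync, because each Kodi install reads and writes the one database.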


  • Registered Users Posts: 7,179 ✭✭✭Serephucus


    Yeah, Shields should play anything fine, in my experience. Export via NFS/SMB should be all you need.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Serephucus wrote: »
    Yeah, Shields should play anything fine, in my experience. Export via NFS/SMB should be all you need.

    No doubt there’ll be a problem in every step of the way....but sure that’s half the fun of the hobby :o

    Oh, have gone from x3 8TB Reds to x4 6TBs instead - more space for very little extra. Ok, there's an extra point of failure with more drives, but it's all a balancing act. Anyway, you've done more than enough man, many many many thanks for the advice, info, and patience :) I will be sure to come back to annoy you when this build gets started!


  • Registered Users Posts: 17,416 ✭✭✭✭Blazer


    Just out of curiosity...you're going to a lot of work for a server/NAS which won't be running 24/7.
    Why won't it be running 24/7?
    Also, constant startup/shutdown is not recommended for servers, as it shortens the lifetime of components.
    Once a week would be fine; once a month would be better.
    I had looked at building one of these before, but went for a Synology (on my second one in 12 years) because it was quieter and, while pricey at the start, costs far less to run over time, plus it has a great UI.


  • Registered Users Posts: 3,739 ✭✭✭scamalert


    Blazer wrote: »
    Just out of curiosity...you're going to a lot of work for a server/NAS which won't be running 24/7.
    Why won't it be running 24/7?
    Also, constant startup/shutdown is not recommended for servers, as it shortens the lifetime of components.
    Once a week would be fine; once a month would be better.
    I had looked at building one of these before, but went for a Synology (on my second one in 12 years) because it was quieter and, while pricey at the start, costs far less to run over time, plus it has a great UI.
    I think you answered that yourself: power consumption on PC parts (as I've asked before, what would the running costs and total draw be?). I didn't follow every step, but the OP is essentially using PC parts, and if it's configured correctly to start up and shut down there shouldn't really be any issues; PC components these days are well able to handle that for years. The hardest step will be the config part, making it happen cleanly without corruption in the process.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Blazer wrote: »
    Just out of curiosity...you're going to a lot of work for a server/NAS which won't be running 24/7.
    Why won't it be running 24/7?
    Also, constant startup/shutdown is not recommended for servers, as it shortens the lifetime of components.
    Once a week would be fine; once a month would be better.
    I had looked at building one of these before, but went for a Synology (on my second one in 12 years) because it was quieter and, while pricey at the start, costs far less to run over time, plus it has a great UI.

    As for why it won't be running 24/7: well, there'll be nobody using it until the afternoons/evenings, so my primary reason is power usage. It doesn't make sense to leave it sitting there powered on (albeit with drives spun down) for two thirds of the day, every day.

    Is startup/shutdown that intensive? One startup and one shutdown per day, on what is essentially a desktop PC running a NAS OS...it didn't strike me as particularly damaging...but perhaps with NAS HDDs it is?

    I started here looking for advice on a Synology unit, it seemed to fit the bill well, and still does on paper. The advice I went with though was to build a pc for the same price, which would be way more capable, which for me made sense given I'd really love to have MySQL looking after my Kodi database. Who knows what else I might put on server duty too down the line, Pi Hole, Emby, etc.

    The build I have specced, is coming in at a little bit over €100 dearer than a Synology DS918+, but far more capable, offers far cheaper room for expansion in the future, will run docker images, Unraid offers a very friendly interface...ok it may be more expensive to run/power than the Synology, but for the extra capabilities, I consider it a worthy tradeoff.


  • Registered Users Posts: 5,108 ✭✭✭John mac


    Inviere wrote: »
    The build I have specced, is coming in at a little bit over €100 dearer than a Synology DS918+, but far more capable, offers far cheaper room for expansion in the future, will run docker images, Unraid offers a very friendly interface...ok it may be more expensive to run/power than the Synology, but for the extra capabilities, I consider it a worthy tradeoff.

    How much do all the parts come to?
    (I'm looking at something to update my 12-year-old FreeNAS setup.)
    Thanks.
    Following this with interest too...


  • Registered Users Posts: 7,179 ✭✭✭Serephucus


    Inviere wrote: »
    Is startup/shutdown that intensive? One startup and one shutdown per day, on what is essentially a desktop PC running a NAS OS...it didn't strike me as particularly damaging...but perhaps with NAS HDDs it is?

    It's fine, don't worry about it.
    Inviere wrote: »
    I started here looking for advice on a Synology unit, it seemed to fit the bill well, and still does on paper. The advice I went with though was to build a pc for the same price, which would be way more capable, which for me made sense given I'd really love to have MySQL looking after my Kodi database. Who knows what else I might put on server duty too down the line, Pi Hole, Emby, etc.

    You'd be amazed at how convenient having a box on 24/7 is. I'd be interested to hear how much this thing pulls when idling, but I'd be surprised if it's more than an old-fashioned lightbulb.


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    John mac wrote: »
    How much do all the parts come to?
    (I'm looking at something to update my 12-year-old FreeNAS setup.)
    Thanks.
    Following this with interest too...

    So I started by looking at the Synology DS918+. That can be got for €530, so that's my baseline. That's obviously without any HDDs too.

    The build I've chosen is more capable than the Synology, so you could always scale my build back to match, or perhaps even beat, the Synology's price point, while still being more capable. Here goes:

    Asus Prime X370 Mobo - €100
    Phanteks Enthoo Pro Case - €95
    Corsair Vengeance LPX 16GB DDR4 3000 - €85
    Ryzen 2600 - €155
    Asus GT710 GPU - €45
    EVGA G3 550 - €90
    Unraid OS (Basic Package for up to 6 drives) - ~€50
    Amazon.de delivery charges ~€10

    Total - €630. It's exactly €100 dearer than the Synology, but offers way, way more potential. Now, I'm adding a few extras to my build, but as they're extras, and not part of the Synology unit either, I've not included them (cache drives, additional SSD bracket, some fans, etc).


  • Registered Users Posts: 386 ✭✭Coyler


    For what it's worth, I looked at the Synology/QNAP route after running my XBMC (now Kodi) / Plex NAS box for years. Thing is, the cost just seemed quite high for the plug-and-play benefit. Not knocking it, each to their own.

    If you are at all technical, the flexibility of rolling your own gives you so many options. For example, I'm now looking at building a 3- or 4-GPU unRAID box so the kids can all have their own separate VMs for playing Minecraft or whatnot, as opposed to a single machine. It makes my job of parenting their time on the computer a whole lot easier, and they'll get to run their own LANs! Might be joining them in the future :) I'll run Plex/NAS duties and whatnot on it as well. Zen 2 looks to be the job for this project.

    I'll also say it's a good little project for anyone looking to learn more about the Unix world. If you work in IT, especially somewhere more Microsoft-focused, it's an excellent string to add to your bow, and this is a good place to start learning very applicable skills.

    As for power questions, just leave it running 24/7 and get an SSD for the OS if you're really concerned. Most modern computers will sit sipping power for most of the day. A single run of your dryer or a roast chicken easily outweighs any concerns you should have about the box.


  • Registered Users Posts: 17,416 ✭✭✭✭Blazer


    If you want to have a look around a Synology, I can give you a share and let you take a look over it.
    Like I said, I had a look at building a NAS before and decided in the end to go with the Synology. I get about 5-6 years out of each model before I upgrade it, and generally sell on the old model for 100-150 euro.
    I've built my own PCs for years, so a NAS would be no hassle whatsoever, but the convenience of the Synology won me over.


    It all depends on what you want from it.
    If you fancy doing some of the stuff that Coyler mentioned above then you can't beat a homebuilt one.


  • Registered Users Posts: 5,108 ✭✭✭John mac


    Would I be able to just move my existing 2 drives (FreeNAS) into a new build,
    and add a parity drive or 2, plus extra drives for storage?


  • Registered Users Posts: 7,466 ✭✭✭Inviere


    Just an off-topic but slightly related question re overall network/LAN configuration. I can think of a few possible configurations, but am not sure if one offers any advantages/disadvantages over the others in terms of network speed (which will be relevant when sending data to the server to be written).

    Option 1: Router -> 4-port switch in the same room -> x3 outputs from switch (x2 outputs terminating in a bedroom each) -> final output terminating where the server will be in a utility cupboard

    Option 2: Router -> 4-port switch in the same room -> x1 output from switch feeding into bedroom 1 -> 4-port switch in bedroom -> x2 outputs from bedroom switch (one terminating in bedroom 2, the other terminating at the server)

    Option 3: Router moved across the room (requires cabling, but eliminates the need for a switch in this room) -> x3 outputs from router following the same path as option 1

    Basically, none of the cable runs are hugely long, and all are well within Cat 5e spec. Am I better having shorter runs with a switch in the middle, or longer cable runs without a switch in the middle? I tried to illustrate the room configuration in a diagram but lost patience :o


  • Registered Users Posts: 386 ✭✭Coyler


    John mac wrote: »
    Would I be able to just move my existing 2 drives (FreeNAS) into a new build,
    and add a parity drive or 2, plus extra drives for storage?

    Old post, but I didn't see it. Short answer: if it's from FreeNAS to FreeNAS, that's no problem. If you are crossing over to another OS, for example unRAID, you are going to have to do some work, which probably involves copying the data over to the new array and then adding your old disks to it.


  • Registered Users Posts: 5,108 ✭✭✭John mac


    Coyler wrote: »
    Old post, but I didn't see it. Short answer: if it's from FreeNAS to FreeNAS, that's no problem. If you are crossing over to another OS, for example unRAID, you are going to have to do some work, which probably involves copying the data over to the new array and then adding your old disks to it.

    Thanks,
    not what I wanted to hear, but I'll give it a go anyway. A chance to get rid of stuff I don't need but keep 'just in case'. :)

