Stage XVI: High Performance NAS

After completing Stage XV: Time Machine Backups using FibreChannel a few months ago, it was time to update my HomeLab once again…

I built my own Synology NAS system a year ago using 4 x SATA HDDs and an old Intel Atom dual-core system. I used this NAS for my Disaster Recovery backups with VMware vDP and for storing all my ISO images. I replicated all the data to another Synology NAS in my second datacenter at home.

The performance of the NAS system left something to be desired and I was running out of space, so it was time to build a new one…

I used XPEnology again as the operating system, but I wanted more performance 🙂

I chose the Intel Avoton platform because of its performance and low power consumption.

I searched for a mainboard with passive cooling and at least 8 SATA ports, and found the ASRock C2550D4I…

The C2550D4I has an Intel Avoton quad-core CPU, 12 SATA ports, 2 Intel 1GbE ports, a dedicated IPMI port and four memory banks with ECC support.

I used the following components for my High Performance NAS:

ASRock C2550D4I

16GB Corsair memory (Non-ECC)

12 x WD 1TB HDDs

1 Verbatim Micro-USB 2.0 Stick

3U rack chassis from Chenbro

I tweaked the BIOS/UEFI settings to use all three SATA controllers on the mainboard.

I disabled the serial ports within UEFI:

– Advanced > Super IO Configuration > Serial Port 1 Configuration > Serial Port (Disabled)
– Advanced > Super IO Configuration > SOL Configuration > Serial port (Disabled)

The last trick was to disable AHCI and enable IDE mode for the third SATA controller.

– Intel RC Setup > South Bridge Chipset Configuration > SATA 3 Configuration > SATA Mode Selection = IDE

My plan was to use the latest XPEnology DSM 5.2 release for my NAS, but there were several bugs in that release, so I used DSM 5.1-5055 instead.

After installing the DSM 5.1-5055 release, four of my HDDs were marked as external SATA drives and could not be used for volume creation. What happened here? After searching for the issue, I found the solution: tweaking the DSM configuration in these two files:

/etc/synoinfo.cfg

/etc.defaults/synoinfo.cfg

I changed the following lines:

esataportcfg from “0x0ff000” to “0x0”

internalportcfg from “0x0fff” to “0x0fffff”
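Both values are bitmasks in which each bit represents one SATA port: the default internalportcfg of "0x0fff" marks ports 0–11 as internal, while esataportcfg "0x0ff000" maps ports 12–19 to eSATA. Setting the masks to "0x0fffff" and "0x0" declares all 20 enumerable ports internal. Here is a minimal sketch of the edit over SSH, assuming the stock key=value layout of synoinfo.cfg:

# Edit both copies – DSM restores /etc/synoinfo.cfg from
# /etc.defaults/synoinfo.cfg, so changing only one does not stick.
for f in /etc/synoinfo.cfg /etc.defaults/synoinfo.cfg; do
  sed -i 's/^esataportcfg=.*/esataportcfg="0x0"/' "$f"
  sed -i 's/^internalportcfg=.*/internalportcfg="0x0fffff"/' "$f"
done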

After a reboot, all drives now showed up as internal. Bingo 🙂

I created a RAID 10 volume including all 12 drives. I used the 1GbE ports because all my 10GbE LAN ports are currently occupied. I will upgrade my LAN environment in the near future to have more 10GbE ports available.
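Just for illustration: DSM builds its volumes on Linux md under the hood, so a rough hand-built counterpart on a plain Linux box would look like the sketch below (the device names and md number are made up for the example; with 12 x 1TB drives, RAID 10 leaves about 6TB usable, half the raw capacity):

# Hypothetical equivalent of the DSM RAID 10 volume on plain Linux md.
# /dev/sd[a-l] expands to the 12 member disks on this example system.
mdadm --create /dev/md2 --level=10 --raid-devices=12 /dev/sd[a-l]
mkfs.ext4 /dev/md2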

My plan is to use the official Synology 10GbE network card (E10G15) in my custom High Performance NAS.

The performance is really nice: I reached 120MB/s over a single 1GbE connection, and the system can handle 600MB/s internally within the volume. Amazing. I will use my new NAS for vDP and ISO images again. The system is fully VAAI compatible after installing the official VAAI NFS plugin on all ESXi hosts.
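For reference, installing such a NAS VAAI plugin boils down to one esxcli call per host. The .vib path below is a placeholder for whatever file Synology ships, and a host reboot is typically needed before the plugin becomes active:

# Sketch: install the NFS VAAI plugin VIB on an ESXi host.
# Replace the path with the actual plugin file from Synology.
esxcli software vib install -v /tmp/esx-nfsplugin.vib
# Reboot the host afterwards so the plugin gets loaded.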

You can find the next upgrade step here: Stage XVII: 10GbE for all