Table of Contents

Overall Hardware Recommendations

Minimum, Recommended, Ultimate Configurations

ggRock 20

Notes on the Configuration

Ready-Made Configurations

ggRock 40

Notes on the Configuration

Ready-Made Configurations

ggRock 100

Notes on the Configuration

Ready-Made Configurations

Recommended Configurations 20\40\100

Ready-made Configurations

Appendix I: Redundancy

Appendix II: ggRock Writeback Handling and Recommended Drive Sizes

Appendix III: Modifying Your Existing Network Boot Solution Server

How to Use the Modified Server

In Case there are Not Enough SATA Ports on the Motherboard

Appendix IV: Using U.2 drives

Setup (4TB example)

Comparison with SATA RAID10 (4TB Example)

Comparison with SATA RAID10 (8TB Example)

Appendix V: 1 Gigabit vs 10 Gigabit with ggRock

Comparison

What Does it Mean for Your Center

Topology and Networking Gear

$400 Networking Solution


Overall Hardware Recommendations

  1. Hardware RAID controllers are incompatible with ggRock.

    Your RAID controller should be configured in HBA mode.

    In most cases the built-in SATA Controller on the Motherboard is enough.

    RAID mode in BIOS must be disabled.

  2. Cores/threads count is more important than CPU frequency.

    CPUs such as Intel Core i3/i5/i7/i9, Intel Xeon E, AMD Ryzen or AMD Threadripper are not optimal, but may still be utilized in a ggRock server.

  3. RAM with ECC (error correction) is highly preferable (there might be situations where you can damage all the data on your disk due to RAM errors).

  4. SATA SSDs with PLP (power loss protection) are better suited for PXE/iSCSI workloads than consumer SSDs, because they prevent data loss on power failure. Many SSDs with PLP cost the same as consumer SSDs, which makes them the preferable choice. Do not use QLC drives or HDDs.

  5. Your NICs for the client Machines should be compatible with PXE boot and the NIC for the server should be compatible with Linux.

  6. Redundant RAID levels, such as RAID1 and RAID10, are highly recommended for data protection. RAID5/RAID6 and their variations are generally not recommended because they have worse random IO performance than RAID10, unless you are using NVMe drives.

  7. For better performance and longer lifetime data drives should always have at least 20% free space. Please take that into consideration when you choose the size for your data drives.


Minimum, Recommended, Ultimate Configurations

With ggRock you have the freedom to choose the hardware you desire for your center and clients. However, we provide recommendations for center setups of various sizes in three tiers:

  1. Minimum - minimum specs that were tested and are known to provide good experience for the gamers. This doesn’t mean that ggRock won’t run on the specs below these, but we don’t guarantee correct or optimal operation. For this reason we don’t recommend going below these specs.

  2. Recommended - a recommended, cost-effective setup based on our experience and benchmarks just before hitting the point of diminishing returns. This recommendation will also be redundant but you can always make the choice to go for no-failover config.

  3. Ultimate - theoretical maximum performance for all operations, way past the point of diminishing returns. Redundant, future-proof and overall best experience for the gamer at any cost.


ggRock 20

This configuration covers operating 20 Machines or fewer from a single server running ggRock.

Notes on the Configuration

Platform: Choosing between ATX (mATX) and 2U comes down to the availability of server racks and your desire to install them.

We recommend going for a rack-mounted 2U case (blade).

You can easily build the ggRock server in a mATX or ATX case with mATX support.

Going 2U with the motherboard we recommend gives you great upgrade paths for the future and a 10Gb NIC.

Recommended SKU: iStarUSA D-214-MATX 2U with SuperMicro X11SPM-TPF inside.

CPU: ggRock scales best with more cores/threads and less with higher frequency.

CPU is generally not very important for ggRock.

Getting any CPU with 4 threads and 2 GHz base clock is guaranteed to power your system sufficiently.

Recommended SKU: Intel Xeon Bronze 3204 to go with our recommended motherboard.

Getting Xeon scalable allows you to replace it with the Ultimate configuration SKU.

Both passively cooled with Dynatron B8 2U Passive.

RAM: The most important part of the server, and the easiest one to scale. The reason we recommend a server motherboard and CPU is that they allow you to install up to 1.5TB of RAM, as opposed to the 128GB limit of consumer motherboards and CPUs. There is a point of diminishing returns with RAM, but the more RAM you have, the more diverse a game library can be booted on your PCs at the same time. See the FAQ answer for more details. Registered ECC RAM (as recommended) helps prevent failures due to partially corrupted RAM and handles significant load better.

Recommended SKU: 2 x Crucial 16GB DDR4-2933 RDIMM ECC, Dual Rank.

Note:

For minimum configuration we list 1 x 16GB RAM module (instead of 2 x 8GB in dual-channel) simply for the sake of upgradeability and urge you to get a 2nd RDIMM to take advantage of dual-channel performance and 32GB RAM.

OS Drive: We recommend a PLP drive; 240GB drives nowadays cost just as much as 120GB ones.

Recommended SKU: 2 x Samsung 883 DCT Series SSD 240GB - SATA 2.5"

Game Drive: We recommend PLP drives in a RAID10 configuration. The size of each drive should be chosen according to your game library. Bear in mind that you will, at best, have 75% of the addressable capacity available for actual games and software. From 4 x 0.96TB drives in RAID10 you get 1.92TB of mirrored capacity, and 1.92TB * 0.75 ≈ 1.45TB of usable storage. We think that 1.45TB for the game library is generally sufficient for a 20 Machines center.

Recommended SKU: 4 x Micron 5200 ECO 960GB, 2.5" SATA 6Gb/s, TLC
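The capacity math used throughout these sections can be sketched in a few lines of Python. The 75% figure and the drive sizes come from this guide; the helper name is ours:

```python
def usable_tb(drive_tb: float, count: int, usable_fraction: float = 0.75) -> float:
    """RAID10 mirrors drive pairs, so only half the raw capacity is
    addressable; keeping ~20-25% free for SSD health leaves ~75% of that."""
    raid10_tb = drive_tb * count / 2
    return raid10_tb * usable_fraction

print(round(usable_tb(0.96, 4), 2))  # ggRock 20:  4 x 960GB  -> 1.44 TB
print(round(usable_tb(1.92, 4), 2))  # ggRock 40:  4 x 1.92TB -> 2.88 TB
print(round(usable_tb(1.92, 6), 2))  # ggRock 100: 6 x 1.92TB -> 4.32 TB
```

The same helper also shows why doubling the drive size, not the drive count, is the cheaper way to grow usable capacity in a RAID10.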

Server NIC: If you can get a 10Gb NIC alongside your motherboard then it’s a great deal and should be used. ggRock is faster than other network boot solutions currently on the market and can fully take advantage of a 10Gb NIC installed on the server. If you have ended up with a different motherboard, you can get an Intel X540-T2 (the T2 has two ports for failover; the single-port T1 is a cheaper option). You can read more about NICs in the 1 Gigabit vs 10 Gigabit with ggRock appendix below.

Recommended SKU: NIC is installed on the SuperMicro X11SPM-TPF

Client NIC: We recommend sticking with the regular 1Gb NICs embedded on the board in most cases. We have not determined the point of diminishing returns for higher speed client interfaces. You can get faster NICs, and that will result in faster game download/installation/loading, provided your Internet connection can back it up.

Recommended SKU: Any integrated NIC with PXE boot support

Other: 600W 80 PLUS Bronze PSU, SATA cables, IcyDock ExpressCage MB326SP-B (6-bay 2.5" SATA cage) for easy drive access, mounting rails for your cabinet, and fans for the case.


ggRock 20 Ready-Made Configurations


ggRock 40

This configuration covers operating 40 Machines from a single server running ggRock.

Notes on the Configuration

Platform: For 40 PCs we recommend going for a rack-mounted 2U case (blade). Assembling the server in an ATX (mATX) case is still an option but is not advised. Going 2U with the motherboard we recommend gives you great upgrade paths for the future and a 10Gb NIC out of the box.

Recommended SKU: iStarUSA D-214-MATX 2U with SuperMicro X11SPM-TPF inside.

CPU: ggRock scales best with more cores/threads and less with higher frequency. CPU is generally not very important for ggRock. Getting any CPU with 4 threads and a 2 GHz base clock is guaranteed to power your system sufficiently.

Recommended SKU: Intel Xeon Bronze 3204 to go with our recommended motherboard. Getting a Xeon Scalable CPU allows you to replace it with the Ultimate configuration SKU later. Both are passively cooled with the Dynatron B8 2U Passive.

RAM: The most important part of the server, and the easiest one to scale. The reason we recommend a server motherboard and CPU is that they allow you to install up to 1.5TB of RAM, as opposed to the 128GB limit of consumer motherboards and CPUs. There is a point of diminishing returns with RAM, but the more RAM you have, the more diverse a game library can be booted on your PCs at the same time. See the FAQ answer for more details. Registered ECC RAM (as recommended) helps prevent failures due to partially corrupted RAM and handles significant load better.

Recommended SKU: 4 x Crucial 16GB DDR4-2933 RDIMM ECC, Dual Rank.

OS Drive: We recommend a PLP drive; 240GB drives nowadays cost just as much as 120GB ones.

Recommended SKU: 2 x Samsung 883 DCT Series SSD 240GB - SATA 2.5".

Game Drive: We recommend PLP drives in a RAID10 configuration. The size of each drive should be chosen according to your game library. Bear in mind that you will, at best, have 75% of the addressable capacity available for actual games and software. From 4 x 1.92TB drives in RAID10 you get 3.84TB of mirrored capacity, and 3.84TB * 0.75 ≈ 2.9TB of usable storage. We think that 2.9TB for the game library is generally sufficient for a 40 Machines center.

Recommended SKU: 4 x Micron 5200 ECO 1.92TB, 2.5" SATA 6Gb/s, TLC. You can also use NVMe drives; read about them in the Using U.2 drives appendix below.

Server NIC: If you can get a 10Gb NIC alongside your motherboard then it’s a great deal and should be used. ggRock is faster than other network boot solutions currently on the market and can fully take advantage of a 10Gb NIC installed on the server. If you have ended up with a different motherboard, you can get an Intel X540-T2 (the T2 has two ports for failover; the single-port T1 is a cheaper option). You can read more about NICs in the 1 Gigabit vs 10 Gigabit with ggRock appendix below.

Recommended SKU: NIC is installed on the SuperMicro X11SPM-TPF

Client NIC: We recommend sticking with the regular 1Gb NICs embedded on the board in most cases. We have not determined the point of diminishing returns for higher speed client interfaces. You can get faster NICs, and that will result in faster game download/installation/loading, provided your Internet connection can back it up.

Recommended SKU: Any integrated NIC with PXE boot support

Other: 600W 80 PLUS Bronze PSU, IcyDock ExpressCage MB326SP-B (6-bay 2.5" SATA cage) for easy drive access, mounting rails for your cabinet, and fans for the case.


ggRock 40 Ready-made configurations


ggRock 100

This configuration covers operating 100 Machines from a single server running ggRock.

Notes on the Configuration

Platform: For 100 PCs we strongly recommend going for a rack-mounted 2U case (blade). Assembling a server in an ATX (mATX) case is almost out of the question, simply due to the amount of networking equipment required to power such a center. Going 2U with the motherboard we recommend gives you great upgrade paths for the future and a 10Gb NIC out of the box.

Recommended SKU: iStarUSA D-214-MATX 2U with SuperMicro X11SPM-TPF inside.

CPU: ggRock scales best with more cores/threads and less with higher frequency. CPU is generally not very important for ggRock. Getting any CPU with 4 threads and a 2 GHz base clock is guaranteed to power your system sufficiently.

Recommended SKU: Intel Xeon Silver 4210 to go with our recommended motherboard. This CPU is passively cooled with Dynatron B8 2U Passive.

RAM: The most important part of the server, and the easiest one to scale. The reason we recommend a server motherboard and CPU is that they allow you to install up to 1.5TB of RAM, as opposed to the 128GB limit of consumer motherboards and CPUs. There is a point of diminishing returns with RAM, but the more RAM you have, the more diverse a game library can be booted on your PCs at the same time. See the FAQ answer for more details. Registered ECC RAM (as recommended) helps prevent failures due to partially corrupted RAM and handles significant load better.

Recommended SKU: 4 x Crucial 32GB DDR4-2933 RDIMM ECC, Dual Rank.

OS Drive: We recommend a PLP drive; 240GB drives nowadays cost just as much as 120GB ones.

Recommended SKU: 2 x Samsung 883 DCT Series SSD 240GB - SATA 2.5".

Game Drive: We recommend PLP drives in a RAID10 configuration. The size of each drive should be chosen according to your game library. Bear in mind that you will, at best, have 75% of the addressable capacity available for actual games and software. From 6 x 1.92TB drives in RAID10 you get 5.76TB of mirrored capacity, and 5.76TB * 0.75 ≈ 4.3TB of usable storage. We think that 4.3TB for the game library is generally sufficient for a 100 Machines center.

Recommended SKU: 6 x Micron 5200 ECO 1.92TB, 2.5" SATA 6Gb/s, TLC. You can also use NVMe drives; read about them in the Using U.2 drives appendix below.

Server NIC: At this size we recommend getting a 40Gb (or at least 25Gb) NIC for optimal operation under heavy load from all 100 Machines.

Recommended SKU: Intel Ethernet Converged Network Adapter XL710-QDA2

Client NIC: We recommend sticking with the regular 1Gb NICs embedded on the board in most cases. We have not determined the point of diminishing returns for higher speed client interfaces. You can get faster NICs, and that will result in faster game download/installation/loading, provided your Internet connection can back it up.

Recommended SKU: Any integrated NIC with PXE boot support

Other: 600W 80 PLUS Bronze PSU, IcyDock ExpressCage MB326SP-B (6-bay 2.5" SATA cage) for easy drive access, mounting rails for your cabinet, and fans for the case.


ggRock 100 Ready-Made Configurations


Recommended Configurations 20\40\100


Appendix I: Redundancy

Redundancy, usually in the form of an additional set of drives, allows us to guarantee uninterrupted operation of your ESports Center in the event of a hardware failure. For the sake of redundancy we urge you to use Registered ECC RAM, PLP SSDs, and RAID1 (RAID10) arrays for both the Linux OS and the ggRock Disks. We would also recommend getting a UPS that allows you to gracefully shut down your server in case of a power failure.

To understand whether you need redundancy, simply imagine a busy Friday evening with one of your two ggRock Disk RAID0 drives failing. This results in your entire game library and OS image being gone, alongside all PCs getting disconnected. To repair this you would need to:

  1. Diagnose what has failed, and then which drive had failed.

  2. Urgently order a drive of matching capacity.

  3. Wait for it to be delivered and install it.

  4. Install Windows, get through all of the updates, install and update all of your games.

All in all, we estimate this at around 2 full days of the center being offline, and many working hours spent. For a 40 Machine center at $1 US per hour, that translates into at least a $1000 US loss. With this in mind, we stand by our decision to include redundancy in our Recommended configurations.
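The downtime estimate above is easy to reproduce. A rough sketch follows; the 13 open hours per day is our assumption, the rest of the numbers come from the text:

```python
def downtime_loss(machines: int, rate_per_hour: float, days_offline: float,
                  open_hours_per_day: float) -> float:
    """Revenue lost while the center is offline, assuming full occupancy."""
    return machines * rate_per_hour * days_offline * open_hours_per_day

# 40 Machines at $1/hour, 2 days offline, ~13 open hours per day (assumed):
print(downtime_loss(40, 1.0, 2, 13))  # -> 1040.0, i.e. at least ~$1000
```

Plug in your own rate and opening hours; for most centers the loss from a single failed RAID0 drive exceeds the cost of the redundant drives many times over.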


Appendix II: ggRock Writeback Handling and Recommended Drive Sizes

ggRock uses a special, centralized caching system that depends on the drives where the data resides. For this reason, there are no dedicated writeback drives with ggRock. Instead, writebacks are handled on the same drives where the games / client OS reside.

Moreover, SSDs require at least 20% of their space free for optimal operation and longevity. This means you can only use about 75% of your raw storage. From a 2 x 1.92TB RAID0 you will get around 2.9TB of usable storage, leaving 20% headroom.

We recommend provisioning at least 10GB of writeback space per Machine on average for comfortable operation. For a 40 Machines center that means keeping 400GB of the usable storage free; 1TB for 100 Machines.

These facts should be considered when choosing drive sizes. If your game library is currently 1.3TB and you have 40 Machines with 2 x 0.96TB drives (~1.45TB of usable storage), you are at risk of running out of space with further game updates or an especially busy day.
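A quick fit check for the example above; the helper name and the GB-to-TB conversion are ours, while the 10GB per-Machine writeback budget comes from this appendix:

```python
def headroom_tb(usable_tb: float, library_tb: float, machines: int,
                writeback_gb_per_machine: float = 10.0) -> float:
    """Space left after the game library and the per-Machine writeback budget."""
    return usable_tb - library_tb - machines * writeback_gb_per_machine / 1000

# 40 Machines, a 1.3TB library, ~1.45TB of usable storage:
print(round(headroom_tb(1.45, 1.3, 40), 2))  # -> -0.25, i.e. ~250GB over budget
```

A negative result means the configuration is already over budget before any game updates land, which is exactly the risk described above.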

Please also consider that games tend to take up increasingly more drive space, and it is always a good idea to provision a couple of years ahead. Depending on your game library and PC count, we recommend a “next tier” policy. If you currently fit in 1.45TB with all writebacks considered, get 2 x 1.92TB drives. If you are content with 2.9TB of storage, consider going for 2 x 3.84TB drives for roughly 5.8TB of usable storage that will serve you for years.


Appendix III: Modifying Your Existing Network Boot Solution Server

For modifying your existing server used with another network boot solution you will need:

  1. 1 x 240GB PLP SSD for ggRock (Linux OS). We recommend Samsung 883 240GB.

  2. 2 x 1TB or 2TB PLP SSDs for the Machines’ game library and Windows OS. We recommend the Micron 5200 ECO 960GB for 1TB and the Micron 5200 ECO 1.92TB for 2TB.
    Bear in mind that 2 x 1TB will give you ~1.45TB of available storage and 2 x 2TB will give ~2.9TB.

  3. At least 3 spare SATA headers for the new drives.

How to Use the Modified Server

During boot you will need to enter your BIOS’s Boot Menu (usually F11) and select the drive that has ggRock or the other network boot solution installed. In case of a failure, reboot the server and choose the other drive to boot. This is called a dual boot installation.

In Case there are Not Enough SATA Ports on the Motherboard

If you don’t have any spare SATA ports on your board but have PCIe slots available, you can extend your SATA ports with a PCIe SAS/SATA extension card.

There is only one requirement: it should have an HBA controller.

Examples of RAID cards that support HBA mode:

The difference is in the throughput that these cards will provide you with.

Make your choice so as to allow 6Gb/s speeds to each of your drives.

Bear in mind that you will need a SAS-SATA cable splitter ($12) to use these cards.

An article detailing guidelines for choice of an SKU: https://www.servethehome.com/buyers-guides/top-hardware-components-freenas-nas-servers/top-picks-freenas-hbas/


Appendix IV: Using U.2 drives

Disclaimer: Please check prices at the time of purchase.

These figures are as of December 2019.

When your center has a game library exceeding 2TB and more than 40 Machines, we would suggest you consider U.2 drives, depending on their availability.

Key benefits:

  1. Guaranteed hot-pluggability

  2. NVMe speeds (3.5GB/s read, 3.1GB/s write)

  3. 2.5” form factor for bays

  4. No need to create a striped RAID (RAID0) for performance gains

  5. Can be cheaper than 2 x SATA drives and faster than their RAID0

  6. A better upgrade path than a 4-drive SATA RAID10

Setup (4TB example)

Disclaimer: Please check your PCIe lane configuration on your motherboard or contact us to help you with that.

Having several PCIe x16 ports does not guarantee that all of them have all lanes connected:

  1. 2 x StarTech.com x4 PCI Express to SFF-8643 Adapter + 2 x Cable Matters Internal 12G U.2 Cable (Mini SAS HD to U.2, SFF-8643 to SFF-8639 Cable) total ~$130 if your motherboard supports PCIe bifurcation (our recommended pick does)

  2. 2 x Micron 9300 Pro 3.84TB NVMe U.2 total ~$1400

If your motherboard supports bifurcation and you want to connect 2 U.2 drives to one PCIe slot, you can use a Supermicro AOC-SLG3-2M2 PCIe add-on card + 2 x DiLinKer M.2 to U.2 (SFF-8639) PCIe NVMe SSD cables, total ~$100, to get two drives connected.

You can also get just 2 x DiLinKer M.2 to U.2 (SFF-8639) PCIe NVMe SSD cables, total ~$50, if you have 2 spare M.2 slots on your motherboard. This is not a recommended approach, since most M.2 slots are not connected to the CPU directly and you will not get full NVMe drive speeds.

There is an option to use 1 M.2 adapter and 1 PCIe adapter, but this is not a recommended approach and we urge you to contact us prior to getting such a setup.

Comparison with SATA RAID10 (4TB example)

For this example, we will utilize 4TB of U.2 storage with redundancy.

We’ll compare 4 x 2TB SATA 2.5” drives against 2 x 4TB NVMe 2.5” drives:

  1. 4 x Micron 5200 ECO 1.92TB, total ~$1700
    Speeds: ~1GB/s reads, 0.5GB/s writes

  2. 2 x Micron 9300 Pro 3.84TB + peripherals total ~$1530
    Speeds: ~3.5GB/s reads, 3.2GB/s writes

In this case we get roughly 3 times the read performance of the drives themselves for the same money.
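In Python, the comparison above reduces to a couple of divisions; the prices and speeds are the December 2019 figures from this appendix:

```python
# 4TB-with-redundancy options, December 2019 prices from this appendix:
sata_raid10 = {"cost_usd": 1700, "seq_read_gbs": 1.0}  # 4 x Micron 5200 ECO 1.92TB
u2_mirror   = {"cost_usd": 1530, "seq_read_gbs": 3.5}  # 2 x Micron 9300 Pro 3.84TB + peripherals

speedup = u2_mirror["seq_read_gbs"] / sata_raid10["seq_read_gbs"]
savings = sata_raid10["cost_usd"] - u2_mirror["cost_usd"]
print(speedup, savings)  # -> 3.5 170
```

At this capacity the U.2 pair is both faster and slightly cheaper; as the 8TB comparison below shows, the price relationship flips at larger sizes.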

It’s important to remember that ggRock uses RAM heavily to avoid direct drive requests, but when a request misses the cache, it is served from the drive.

The more PCs and the more variety in games, the more misses there will be.

Comparison with SATA RAID10 (8TB example)

Now let’s move up to 8TB example with redundancy.

We’ll get 4 x 4TB SATA 2.5” drives and 2 x 8TB NVMe 2.5” drives

  1. 4 x Micron 5200 ECO 3.84TB, total ~$2400
    Speeds: ~1GB/s reads, 0.5GB/s writes

  2. 2 x Micron 9300 Pro 7.6TB + peripherals total ~$3000
    Speeds: ~3.5GB/s reads, 3.2GB/s writes

  3. 4 x Micron 9300 Pro 3.84TB + peripherals total ~$3200
    Speeds: ~7.0GB/s reads, 3.2GB/s writes

The 8TB case is less clear-cut: at this capacity SATA claws back some of its price advantage and becomes the less expensive option. However, for clients demanding 8TB of storage for their images and writebacks, the speed of direct drive reads becomes more relevant, and going for the 2 x 7.6TB option still seems feasible.

There is also an option to put 4 x 4TB U.2 drives into a RAID0 for exceptional read speeds at a relatively low price premium. However, this requires research, as some motherboard and CPU designs have PCIe Gen 3.0 bottlenecks that prevent NVMe RAID0 from reaching its full potential.


Appendix V: 1 Gigabit vs 10 Gigabit with ggRock

Comparison

We want to be clear that ggRock will work perfectly well on current generation 1Gb NICs on both server and client side.

Even in our current version ggRock is capable of much higher speeds than other network boot solutions currently on the market.

[Screenshots: CrystalDiskMark results for a competitor network boot solution over 1Gb LAN (left) and for ggRock over 1Gb LAN (right)]

As you can see from the two examples above, ggRock (on the right) completely maxes out the 1Gb LAN interface (110-120MB/s, roughly 1000Mb per second). All of that is real data passed through to the client machine, not extra metadata or data the client did not request. This naturally results in much faster game loading times and game download/installation times.
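The 110-120MB/s figure follows directly from the line rate; a quick conversion sketch (the function name is ours):

```python
def gbps_to_mb_per_s(gbps: float, binary: bool = False) -> float:
    """Convert a line rate in Gb/s to MB/s (decimal) or MiB/s (binary)."""
    bytes_per_s = gbps * 1e9 / 8
    return bytes_per_s / (2**20 if binary else 1e6)

print(gbps_to_mb_per_s(1.0))               # -> 125.0 MB/s theoretical maximum
print(round(gbps_to_mb_per_s(1.0, True)))  # -> 119 MiB/s; 110-120 observed after protocol overhead
```

The same function scales to the 10Gb case: a saturated 10Gb link tops out at ~1.25GB/s, which is why 10Gb server NICs pay off for sequential workloads.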

The 1Gb LAN example is not representative of the current maximum performance of ggRock.

In a test with a virtualized Unlimited LAN interface, ggRock reaches:

  • Max 10Gb interface speeds for sequential reads (game/level loading, downloads)

  • 2.5Gb interface speeds for random 4KB Q8T8 operations (gaming)

All benchmarks were done under the same conditions over 1Gb and Unlimited LAN interfaces using CrystalDiskMark 6.0.2 x64, with 5 runs at a 50MiB test size.

What Does it Mean for Your Center

We are currently in the process of conducting more sophisticated, real-world based benchmarking.

However, based on the synthetic tests we can tell you that ggRock will:

Make Game Installs and Maintenance Faster

Current feedback from a Pioneer center suggests that being able to utilize full 1Gb interface speeds lets them spend less time installing and updating games. Seeing as sequential operations with ggRock max out at around 10Gb, having such a NIC on the server can give you up to 10x the speed for certain workloads.

Testimony: “I'm almost done installing all my games in < 1 hour. This was not possible with the other network boot solution” (quote redacted to exclude competitor’s name)

Game/Level Loading

With faster image I/O we are expecting to have noticeably better loading times for games (initial loading screens) and game level loading.

At some point, the cost of the solution will overshadow the benefits - i.e. the ROI will no longer be sufficient to make it economical.

Initial tests have yielded the following results:

Rainbow Six Siege loading times (ggRock is 37% and 42% faster):

Competitor network boot solution, 1Gb LAN: first boot - 2:09.16, second boot - 2:12.51

ggRock, 1Gb LAN: first boot - 1:21.04, second boot - 1:16.54
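The quoted percentages check out against the timings above (the helper name is ours; times converted to seconds):

```python
def faster_pct(slow_s: float, fast_s: float) -> float:
    """How much faster the second measurement is, as a percentage of the first."""
    return (slow_s - fast_s) / slow_s * 100

print(round(faster_pct(129.16, 81.04)))  # first boot:  2:09.16 vs 1:21.04 -> 37
print(round(faster_pct(132.51, 76.54)))  # second boot: 2:12.51 vs 1:16.54 -> 42
```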

Our goal now is to get three questions answered:

  1. Where is the point of diminishing returns with installing 10Gb NICs, 40Gb NICs and so on into the server depending on the number of Machines in the ESports center?

  2. Where is the point of diminishing returns with installing 1/2.5/5/10Gb NICs into the client Machines?

  3. Is there any tangible effect on FPS compared to competitor network boot solutions?

Gaming

We don’t have any confirmation right now that higher speed LAN interfaces measurably impact actual gaming experience and FPS in the majority of modern titles. One exception might be Anthem, due to its abundance of loading screens between locations. This topic is on the team’s benchmarking roadmap.

Topology and Networking Gear

Lastly, we want to touch on the topology question, since costs for networking equipment should 100% be included in the equation when considering an upgrade from 1Gb LAN to 10Gb LAN. One point to be made:

Given you are running Cat 5e or better in your center, the initial migration from 1Gb LAN to 10Gb LAN shouldn’t cost more than $300 US: $160 for the NIC, $130 for one switch and $10 for some SFP+ cables. Here’s how.

$400 Networking Solution

Get a MikroTik CSS326-24G-2S+RM, an Intel X710-DA2 (two SFP+ ports for failover; a single-port model is a cheaper option) and an SFP+ cable.

You only need to attach this switch to your ggRock server and your current networking setup, configure the forwarding and you should be done.

Now you can leverage full 10Gb LAN interface speeds from your server to your combined ESports installation.

Naturally, you will not be able to leverage 10Gb on any single machine due to a set of bottlenecks, but in aggregate you will no longer be bottlenecked by a 1Gb NIC on the server.

This means you can reliably boot Windows and load games on all of your connected machines much faster.

There may still be bottlenecks in the switching capacity further down the line, and all of that can be discussed with you on a case-by-case basis.

Networking gear models that we recommend you look at:

  1. MikroTik CRS305-1G-4S+IN - great PoE switch; use one of the SFP+ ports for ggRock and the rest for a set of MikroTik CSS326-24G-2S+RM switches for full-center 10Gb coverage

  2. MikroTik CRS326-24S+2Q+RM - QSFP+ to SFP+ switch. It will allow you to connect all of your switches down the line to the SFP+ ports, eliminating bottlenecks on that part. With this switch, if you are running a bigger center (100+ machines) you can also consider getting a 40Gb NIC for ultimate performance

  3. MikroTik S+RJ10 rev 2.16 - to convert some of the SFP+ ports of the aforementioned CRS326 to RJ45 10GBASE-T copper.

  4. MikroTik CRS312-4C+8XG-RM - interesting option for 4SFP+ and a versatile setup for small center needs
