Today, we are reviewing the Supermicro ARS-111GL-NHR. This is a 1U air-cooled server built around the NVIDIA Grace Hopper GH200. For those who need a bit of a catch-up, the NVIDIA GH200 is less a single offering and more a family of packages combining a 72-core Arm Neoverse V2 CPU with an NVIDIA Hopper GPU and onboard memory. The advantage of a server like this is the fast NVLink-C2C connection between the CPU and GPU. We are going to get to the server review, but if you want to learn more about the package, see our NVIDIA GH200 guide.
Along with this review, we are going to geek out a bit on this picture, and for good reason. While it may look mundane today, it is part of what could be another multi-billion dollar market for NVIDIA that we are going to cover in a piece on our Substack here.
This was a bit of an opportunistic review. While stopping by Supermicro in April, we got word that the company had one of its first GH200 systems available. Supermicro let us pull it out for photos, and we also got to run it for a bit before it had to go off to other duties.
Supermicro ARS-111GL-NHR External Hardware Overview
Looking at the front of the server, we have a lot of vents and eight drive bays, but those are a bit more involved than you might think.
Since the NVIDIA GH200 in this is air-cooled, we have massive vents on the front.
Even the Supermicro logo, status LEDs, and power button are surrounded by vents due to the cooling needs of the 900W package inside.
The drive bays are all EDSFF. Specifically, they are E1.S NVMe drive bays. While there are eight shown on the front of this server, our unit does not have all bays connected.
The rear of the server is for I/O and power supplies.
First, we have two PCIe Gen5 x16 slots. These have an NVIDIA BlueField-3 DPU for Ethernet and storage along with an NVIDIA ConnectX-7 for InfiniBand.
Here is what that top NVIDIA BlueField-3 DPU looks like:
In the center, we see something quite different from other Supermicro servers: an out-of-band management port, a single USB 3 Type-A port, and a mini DisplayPort output.
On the end, we see two power supplies for redundancy.
Each of the power supplies is a 2kW unit. These are rated at 96% efficiency.
Next, let us get to the more exciting part: getting inside the system.
“You might notice these two pieces of metal” but you’ll have to click through to substack to discover what they are!
Maybe the clickbait shouldn’t come at the cost of making the article incomplete?
@A Concerned Listener – They are not being used in this system, and it is an entirely different topic that was almost as big as this entire review.
I know folks may not love the fact that we have multiple publications now, but a lot of work is happening to decide what goes on the main YouTube channel versus short-form and STH versus Substack. We will not always get it right, but that is a very different topic since it is not in this server.
I can’t make sense of the spider graph for this system. How does this rank higher for networking density than GPU density? Also, adding to the above regarding the clickbait: that’s not why we come to STH.
I’m subbed to the Substack via my company. It’s like another STH, less about the hardware and more about the business of what’s going on through the lens of hardware. I’m enjoying it so far. I’ve been reading STH for more than a decade, so I’m happy to spend some of my subscription allocation on supporting them. I’d rather do that and get exclusive content than see STH turn into Tomshardware with spammy auto-play video ads.
I’ll just tell you, the paywalled article this links to is worth the monthly sub alone to me. It’s a great PK article that I didn’t know about.
I don’t understand why people are complaining. They’re promoting another pub they have, and this is longer than their average server review, so it’s not like they took half of a review away.
I’m getting approval to sub at work.
Accolades belong to the STH team for this review. Between this and last week’s GH200 piece, I’m learning so much here, so thanks.