QuantaGrid D74A-7U
Accelerated Parallel Computing Performance for the Most Extreme AI-HPC Workloads
- Multi-GPU Server for HPC / AI Training (e.g., LLMs / NLP).
- Powered by 2x 4th Gen AMD EPYC™ 9004 Series Processors and Compatible with Next-Gen AMD EPYC™ Processors.
- Features the NVIDIA HGX Architecture with Flexible Support for 8x NVIDIA H100/H200 GPUs or 8x AMD MI300X GPUs.
- 18x SFF All-NVMe Drive Bays for GPUDirect Storage and Boot Drives.
- 10x OCP NIC 3.0 TSFF Slots for GPUDirect RDMA.
- Modularized Design for Easy Serviceability.
Multi-GPU Server for HPC / AI Training (e.g., LLMs / NLP)
Modularized Design for Easy Serviceability
The architecture is designed around a toolless, modular philosophy. Major components can be removed from the system chassis without unmounting it from the rack, improving serviceability and increasing system uptime. This modularity also provides forward support for next-generation CPUs and GPUs.
Processor | |
---|---|
Processor Family | AMD EPYC™
Processor Type | AMD EPYC™ 9004 Series Server Processors
Max. TDP Support | cTDP up to 400W
Number of Processors | 2 Processors
Internal Interconnect | 16 GT/s
L3 Cache | Up to 384 MB
Form Factor | |
---|---|
Form Factor | 7U
Dimensions | |
---|---|
W x H x D (inch) | 17.63" x 12.12" x 34.88"
W x H x D (mm) | 447.8 x 307.85 x 886 mm
Chipset | |
---|---|
Chipset | SoC
Storage | |
---|---|
Default Configuration | (18) 2.5" hot-plug NVMe SSD drives
Memory | |
---|---|
Total Slots | 24
Capacity | Up to 6TB (24x 256GB)
Memory Type | 4800 MHz DDR5 RDIMM
Memory Size | 16GB, 32GB, 64GB, 128GB, 256GB RDIMM/3DS DIMM
Expansion Slot | |
---|---|
Default Configuration | (2) PCIe 5.0 x16 OCP 3.0 SFF slots (motherboard); (10) PCIe 5.0 x16 OCP 3.0 TSFF slots (I/O board)
GPGPU Baseboard | SKU #1: (8) NVIDIA H100/H200 SXM GPU modules with HGX baseboard; SKU #2: (8) AMD MI300X GPU modules with industry-standard Universal Baseboard (UBB 2.0)
Network Controller | |
---|---|
LOM | Dedicated (1) GbE management port
Optional NIC | Please refer to our Compatible Component List for more information
Front I/O | |
---|---|
Front I/O | (1) Power button/LED; (1) Reset button; (1) ID button/LED; (1) System status LED; (2) USB 3.0 ports; (1) VGA port
Power Supply | |
---|---|
Power Supply | (6) 3+3 high-efficiency redundant hot-plug 4000W 80 PLUS Titanium PSUs
Onboard Storage | |
---|---|
Onboard Storage | (2) 2280 M.2 (optional, for boot drive)
Fan | |
---|---|
Fan | (8) hot-swap 9276 dual-rotor fans (N+1 redundant)
Video | |
---|---|
Video | Integrated AST2600; maximum display resolution up to 1920x1080 32bpp @ 60Hz
System Management | |
---|---|
System Management | Redfish v1.11; IPMI v2.0 compliant; onboard KVM-over-IP support
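The Redfish v1.11 interface allows the system to be monitored out of band with standard DMTF Redfish endpoints. The sketch below queries the `/redfish/v1/Systems` collection to read model, power state, and health; the BMC address `10.0.0.10` and the credentials are hypothetical placeholders, not values from this datasheet.

```python
import requests

# Hypothetical BMC address and credentials -- replace with your own.
BMC = "https://10.0.0.10"
AUTH = ("admin", "password")

# Query the standard Redfish Systems collection on the BMC.
# verify=False only because many BMCs ship with self-signed certificates.
resp = requests.get(f"{BMC}/redfish/v1/Systems", auth=AUTH, verify=False, timeout=10)
resp.raise_for_status()

# Walk each system member and print basic inventory and health fields.
for member in resp.json().get("Members", []):
    system = requests.get(f"{BMC}{member['@odata.id']}",
                          auth=AUTH, verify=False, timeout=10).json()
    print(system.get("Model"),
          system.get("PowerState"),
          system.get("Status", {}).get("Health"))
```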
Rear I/O | |
---|---|
Rear I/O | (1) Power button; (1) ID button/LED; (1) USB 3.0 port; (1) Mini DisplayPort; (1) COM port (micro USB Type-B); (1) RJ45 dedicated management port
Operating Environment | |
---|---|
Operating temperature | 5°C to 35°C (41°F to 95°F)
Non-operating temperature | -40°C to 70°C (-40°F to 158°F)
Operating relative humidity | 20% to 85% RH
Non-operating relative humidity | 10% to 95% RH
TPM | |
---|---|
TPM | TPM 2.0 SPI module (optional)
Weight (Max. Configuration) | |
---|---|
Weight (Max. Configuration) | 116.58 kg (257.01 lbs)