10 GPU 2 XEON DEEP LEARNING AI SERVER
- GPU: Up to 10x NVIDIA A100, A40, A30, V100, RTXA6000, RTXA5000
- NVLink: 4x NVLink
- CPU: 80 cores (2x Intel Xeon Scalable), single/dual root
- System Memory: 8 TB (32 DIMM slots)
- Storage: 12x 3.5" SATA SSD/HDD or NVMe PCIe U.2
About
The XINMATRIX® 10 GPU Deep Learning System is a scalable, configurable 4U rackmount system with up to 10 double-width NVIDIA GPUs and dual latest-generation Intel Xeon Scalable processors. The number of GPUs, CPU cores, system memory and storage can be configured to requirements.
Deep learning is one of the fastest-growing segments of the machine learning and artificial intelligence field. It uses algorithms to model high-level abstractions of data in order to gain meaningful insight for practical applications. These techniques are used in fields such as computer vision, speech recognition, language processing and audio recognition.
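As a rough illustration of how a multi-GPU system like this is typically used, the sketch below checks which CUDA devices are visible and replicates a toy model across them with PyTorch data parallelism. This is a minimal, hypothetical example assuming PyTorch with CUDA support and the NVIDIA driver are already installed; it is not vendor-supplied software.

```python
# Minimal sketch (assumption: PyTorch with CUDA support is installed on the server).
# Lists the visible GPUs and runs one forward pass of a toy model split across them.
import torch
import torch.nn as nn

def main():
    n_gpus = torch.cuda.device_count()
    print(f"Visible CUDA devices: {n_gpus}")
    for i in range(n_gpus):
        print(f"  [{i}] {torch.cuda.get_device_name(i)}")

    # Toy model; nn.DataParallel replicates it on every visible GPU and
    # splits each input batch across the replicas.
    model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 10))
    x = torch.randn(256, 1024)
    if n_gpus > 0:
        model = nn.DataParallel(model).cuda()
        x = x.cuda()

    out = model(x)
    print("Output shape:", tuple(out.shape))

if __name__ == "__main__":
    main()
```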
Specification
- GPU: A100, A30, A40, V100, RTXA6000, RTXA5000, T4
- CPU: 80 cores (dual 40-core Intel Xeon Scalable)
- System Memory: 32 DIMM slots, up to 8 TB DDR4-3200 ECC RDIMM
- Storage: 12x hot-swappable 3.5" drive bays, SATA/NVMe SSD/HDD
- Network: 2x 10GbE ports + 1x IPMI; dual-port InfiniBand or other high-speed PCIe card (optional)
- System Weight: ~48 kg
- System Dimensions: 4U rackmount, 830 mm (D) x 4U (H) x 19" (W)
- Maximum Power Requirements: 4,800 W (200-240 V AC input), redundant (3+1) 80 PLUS Platinum PSUs with PFC
- Operating Temperature Range: 10°C to 35°C (50°F to 95°F)
- Support & Warranty: Three years on-site parts and service, NBD 8x5
Related products
2 GPU 2 EPYC DEEP LEARNING AI SERVER
SKU: SMXB8252T75
- GPU: 2 NVIDIA RTXA6000, A40, RTX8000, T4
- CPU: 128 cores (2 AMD EPYC Rome)
- PCIe Gen 4.0 support
- System Memory: 4 TB (32 DIMM)
- Storage: 26x 2.5" SATA/NVMe U.2 SSD hot-swap bays, 2x NVMe M.2 SSD
8 GPU 2 EPYC DEEP LEARNING AI SERVER
SKU: SMX-GS4845
- GPU: 8 NVIDIA A100, V100, RTXA6000, RTX8000, A40
- NVLink: 4x NVLink
- CPU: 128 cores (2 AMD EPYC Rome)
- PCIe Gen 4.0 support
- System Memory: 4 TB (32 DIMM)
- Storage Type A: 12x 3.5" SATA/NVMe U.2 hot-swap bays
- Storage Type B: 24x 2.5" SATA/SAS/NVMe U.2 hot-swap bays
4 GPU 2 EPYC DEEP LEARNING AI SERVER
SKU: SMX-B8251
- GPU: 4 NVIDIA A100, V100, RTXA6000, A40, RTX8000, T4
- NVLink: 2 to 6 NVLink
- CPU: 128 cores (2 AMD EPYC Rome)
- PCIe Gen 4.0 support
- System Memory: 2 TB (16 DIMM)
- Storage: 8x 3.5" SATA/NVMe U.2 hot-swap bays