10 GPU 2 XEON DEEP LEARNING AI SERVER
- GPU: 10 NVIDIA A100, A40, A30, V100, RTXA6000, RTXA5000
- NVLINK: 4 NVLINK
- CPU: 80 CORES (2 Intel Xeon Scalable), Single/Dual Root
- System Memory: 8 TB (32 DIMM)
- STORAGE: 12x 3.5″ SATA SSD/HDD or NVMe PCIe U.2
About
The XINMATRIX® 10 GPU Deep Learning System is a scalable, configurable 4U rackmount system with up to 10 double-width NVIDIA GPUs and dual latest-generation Intel Xeon Scalable processors. The GPU count, CPUs, system memory, and storage can all be configured to requirements.
Deep learning is one of the fastest-growing segments of the machine learning/artificial intelligence field. It uses algorithms to model high-level abstractions of data in order to extract meaningful insights for practical applications, with uses in areas such as computer vision, speech recognition, natural language processing, and audio recognition.
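After deployment, a short script can confirm that all installed GPUs are visible to the deep learning framework. The following is a minimal sketch only, assuming PyTorch and the NVIDIA driver/CUDA stack are already installed; the expected count of 10 reflects this system's maximum configuration and should be adjusted to the configuration actually ordered.

# Minimal sketch: verify GPU visibility on a multi-GPU server.
# Assumes PyTorch and the NVIDIA driver/CUDA stack are installed.
import torch

EXPECTED_GPUS = 10  # adjust to the configuration actually ordered

def check_gpus(expected: int = EXPECTED_GPUS) -> None:
    if not torch.cuda.is_available():
        raise RuntimeError("CUDA is not available - check the driver installation")
    count = torch.cuda.device_count()
    for i in range(count):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")
    if count != expected:
        print(f"Warning: expected {expected} GPUs, found {count}")

if __name__ == "__main__":
    check_gpus()
    # NVLink topology can be inspected separately with `nvidia-smi topo -m`.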
Specification
GPU
Options:
A100, A30, A40, V100, RTXA6000, RTXA5000, T4
CPU
80 CORES
(Dual 40-Core Intel Xeon Scalable)
System Memory
32 DIMM slots
8TB DDR4-3200 ECC RDIMM
Storage
12X Hot-swappable 3.5" drive bays
SATA, NVMe SSD/HDD
Network
2X 10GbE ports + 1X IPMI
Dual-Port InfiniBand or other high speed PCIe card (Optional)
System Weight
~ 48 KG
System Dimension
4U Rackmount
830mm(D) x 4U(H) x 19”(W)
Maximum Power Requirements
4,800 Watts (200-240 Vac input)
Redundant (3+1) PFC 80-PLUS Platinum
Operating Temperature Range
10° C ~ 35° C (50° F ~ 95° F)
Support & Warranty
Three years on-site parts and service, NBD 8x5
Related products
10 GPU 2 XEON DEEP LEARNING AI SERVER
SKU: SMX-R4051
- GPU: 10 NVIDIA H100, A100, L40, A40, RTX6000, 2-slot GPU
- CPU: 2 4th Generation Intel Xeon Scalable Processors
- System Memory: 8 TB (32 DIMM)
- STORAGE: NVMe
4 GPU 2 XEON DEEP LEARNING AI SERVER
SKU: SMXESC4000G4
- GPU: 4 NVIDIA A100, V100, RTXA6000, A40, RTX8000, T4
- NVLINK: 2 NVLINK
- CPU: 56 CORES (2 Intel Xeon Scalable), Single/Dual Root
- System Memory: 2 TB (16 DIMM)
- STORAGE: 8 3.5" SATA SSD/HDD OR NVMe U.2
8 GPU 2 EPYC DEEP LEARNING AI SERVER
SKU: SMX-GS4845
- GPU: 8 NVIDIA A100, V100, RTXA6000, RTX8000, A40
- NVLINK: 4 NVLINK
- CPU: 128 CORES (2 AMD EPYC ROME)
- PCIe Gen 4.0 support
- System Memory: 4 TB (32 DIMM)
- Type A: 12 3.5" SATA/NVMe U.2 Hotswap bays
- Type B: 24 2.5" SATA/SAS NVMe U.2 Hotswap bays