
Dihuni OptiReady AI-V100-5-1 5 NVIDIA Tesla V100 32GB GPU 2S Xeon 6226 256GB Ubuntu Deep Learning Server

$62,763.00

(Add to cart to Buy / Request Quote)

Out of stock


Request Formal Quote, Volume Pricing, Stock or Product Information

  • Competitor Match/Beat on Custom Servers and Select Products (send competitor quote)
  • Leasing Options Available (requires 5 years of business operations)
  • Purchase Orders Accepted / Net Terms subject to approval
  • Custom Servers - Configure Below, Add to Cart and Request Quote for formal pricing

The Dihuni OptiReady AI-V100-5-1 is a GPU-optimized server based on Supermicro’s leading GPU server platform. It is designed for Machine Learning (ML), Deep Learning and Artificial Intelligence (AI), and comes with Ubuntu Linux and NVIDIA NGC Docker software installed.

NVIDIA NGC Pre-Loaded

This Deep Learning server is available with NVIDIA NGC containers preloaded. NGC empowers researchers, data scientists and developers with performance-engineered containers featuring AI software such as TensorFlow, Keras, PyTorch, MXNet, NVIDIA TensorRT™, RAPIDS and more. These pre-integrated containers feature the NVIDIA AI stack, including the NVIDIA® CUDA® Toolkit and NVIDIA deep learning libraries, and are easy to upgrade using Docker commands.
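Because the NGC containers are managed through standard Docker commands, keeping the software stack current is straightforward. As an illustration only (not vendor-supplied tooling), the Python sketch below shells out to Docker to pull an NGC image and run it with GPU access; the image tag shown is a placeholder, so check the NGC catalog for current tags.

```python
# Illustrative sketch: pull and run an NGC deep learning container from Python.
# Assumes Docker and the NVIDIA container toolkit are installed on the server;
# the image tag below is a hypothetical example, not a guaranteed release.
import subprocess

NGC_IMAGE = "nvcr.io/nvidia/tensorflow:24.03-tf2-py3"  # placeholder tag

def pull_ngc_image(image: str) -> None:
    """Pull (or upgrade to) the requested NGC container image."""
    subprocess.run(["docker", "pull", image], check=True)

def run_gpu_container(image: str) -> None:
    """Start the container with all GPUs visible and print the GPU inventory."""
    subprocess.run(
        ["docker", "run", "--rm", "--gpus", "all", image, "nvidia-smi"],
        check=True,
    )

if __name__ == "__main__":
    pull_ngc_image(NGC_IMAGE)
    run_gpu_container(NGC_IMAGE)
```

Re-running the pull command later fetches a newer tag of the same container, which is how the preloaded software stack is kept up to date.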

Key Features

  • 5 x NVIDIA Tesla V100 32GB PCIe3.0 GPU Installed
  • 2 x Intel Xeon 6226 12 Cores/24 Threads 2.7GHz CPU Installed
  • 256 GB DDR4-2666MHz (32GB x 8) Installed
  • 2 x Intel 960GB SATA SSD Installed; 24 Hot-swap 2.5″ drive bays
  • 11 PCI-E 3.0 x16 (FH, FL) slots; 1 PCI-E 3.0 x8 (FH, FL, in x16 slot); support for up to 10 double-width GPUs
  • 2 x 10GBase-T LAN ports via Intel C622
  • 8 Hot-swap 92mm cooling fans
  • 2000W (2 + 2) Redundant Power Supplies, Titanium Level (96% efficiency)
  • Ubuntu Linux (latest compatible version) installed
  • NGC Docker Container for Deep Learning Pre-loaded (Optional, please select below)

10 Double Width GPUs for Deep Learning (5 Installed, 5 open for expansion)

The Dihuni OptiReady AI-V100-5-1 takes full advantage of the Intel Xeon Scalable Processor Family’s PCIe lanes to support up to 10 double-width GPUs, delivering a very high performance Artificial Intelligence and Deep Learning system suitable for autonomous vehicles, molecular dynamics, computational biology, fluid simulation, advanced physics, Internet of Things (IoT) and Big Data analytics. With NVIDIA Tesla cards, this server delivers unparalleled acceleration for compute-intensive applications. The model is based on a leading Supermicro server platform.
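As a quick sanity check of the multi-GPU configuration, the hedged PyTorch sketch below (illustrative only; PyTorch ships inside the NGC containers mentioned above) enumerates the visible Tesla V100 cards and spreads a toy model across them.

```python
# Illustrative sketch: enumerate the installed GPUs and run a toy model on all
# of them with DataParallel. Assumes a CUDA-enabled PyTorch install; the model
# here is a placeholder, not part of the product.
import torch
import torch.nn as nn

def describe_gpus() -> None:
    """Print each visible CUDA device with its memory capacity."""
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GB")

if __name__ == "__main__":
    describe_gpus()  # the stock AI-V100-5-1 should list five 32 GB V100s
    model = nn.Linear(1024, 1024)
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)  # replicate the model across all GPUs
    model = model.cuda()
    out = model(torch.randn(64, 1024, device="cuda"))
    print(out.shape)  # torch.Size([64, 1024])
```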

Server Systems Management

Supermicro Server Manager (SSM) provides capabilities to monitor the health of server components including memory, hard drives and RAID controllers. It enables the datacenter administrator to monitor and manage power usage across all Supermicro servers, allowing users to maximize their CPU payload while mitigating the risk of tripped circuits. Firmware upgrades on Supermicro servers now take just a couple of clicks: administrators can mount an ISO image on multiple servers and reboot the servers with those images. The tool also provides pre-defined reports and many other features that make managing Supermicro servers simpler. Download the SSM brochure for more information, or download the Supermicro SuperDoctor® device monitoring and management software.
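Beyond the Supermicro tools, the board’s IPMI 2.0 BMC can be queried directly over the dedicated management LAN port. The sketch below is a minimal example using the standard ipmitool utility from Python; the BMC address and credentials are placeholders, and any monitoring system you already run can be substituted.

```python
# Illustrative sketch: read temperature sensors from the server's BMC over
# IPMI 2.0 (lanplus). Assumes ipmitool is installed and the dedicated IPMI LAN
# port is reachable; host and credentials below are placeholders.
import subprocess

BMC_HOST = "192.0.2.10"   # placeholder BMC address
BMC_USER = "ADMIN"        # placeholder user
BMC_PASS = "changeme"     # placeholder password

def read_temperatures() -> str:
    """Return the BMC's temperature sensor readings as raw ipmitool output."""
    result = subprocess.run(
        ["ipmitool", "-I", "lanplus",
         "-H", BMC_HOST, "-U", BMC_USER, "-P", BMC_PASS,
         "sdr", "type", "Temperature"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(read_temperatures())
```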

Technical Specifications

Mfr Part # AI-V100-5-1
Motherboard Supermicro Super X11DPG-OT-CPU
CPU Dual Socket P (LGA 3647); Intel® Xeon® Scalable Processors; Dual UPI up to 10.4GT/s; Support CPU TDP 70-205W
Cores Up to 28 Cores with Intel® HT Technology
GPU / Coprocessor Support Please refer to: Compatible GPU list
Memory Capacity 24 DIMM slots; Up to 3TB ECC 3DS LRDIMM, 1TB ECC RDIMM, DDR4 up to 2666MHz
Memory Type 2666/2400/2133MHz ECC DDR4 SDRAM
Chipset Intel® C622 chipset
SATA SATA3 (6Gbps) with RAID 0, 1, 5, 10
Network Controllers Dual Port 10GbE from C622
IPMI Support for Intelligent Platform Management Interface v.2.0; IPMI 2.0 with virtual media over LAN and KVM-over-LAN support
Graphics ASPEED AST2500 BMC
SATA 10 SATA3 (6Gbps) ports
LAN 2 RJ45 10GBase-T LAN ports; 1 RJ45 Dedicated IPMI LAN port
USB 4 USB 3.0 ports (rear)
Video 1 VGA Connector
COM Port 1 COM port (rear)
BIOS Type AMI 32Mb SPI Flash ROM
Software Intel® Node Manager; IPMI 2.0; KVM with dedicated LAN; SSM, SPM, SUM; SuperDoctor® 5; Watchdog
CPU Monitors for CPU Cores, Chipset Voltages, Memory; 4+1 Phase-switching voltage regulator
FAN Fans with tachometer monitoring; Status monitor for speed control; Pulse Width Modulated (PWM) fan connectors
Temperature Monitoring for CPU and chassis environment; Thermal Control for fan connectors
Form Factor 4U Rackmountable; Rackmount Kit (MCP-290-00057-0N)
Model CSE-418GTS-R4000B
Height 7.0″ (178mm)
Width 17.2″ (437mm)
Depth 29″ (737mm)
Net Weight: 80 lbs (36.2 kg); Gross Weight: 135 lbs (61.2 kg)
Available Colors Black
Hot-swap Up to 24 Hot-swap 2.5″ SAS/SATA drive bays; 8x 2.5″ drives supported natively
PCI-Express 11 PCI-E 3.0 x16 (FH, FL) slots; 1 PCI-E 3.0 x8 (FH, FL, in x16) slot
Fans 8 Hot-swap 92mm cooling fans
Shrouds 1 Air Shroud (MCP-310-41808-0B)
Total Output Power 1000W/1800W/1980W/2000W
Dimension (W x H x L) 73.5 x 40 x 265 mm
Input 100-120Vac / 12.5-9.5A / 50-60Hz; 200-220Vac / 10-9.5A / 50-60Hz; 220-230Vac / 10-9.8A / 50-60Hz; 230-240Vac / 10-9.8A / 50-60Hz; 200-240Vac / 11.8-9.8A / 50-60Hz (UL/cUL only)
+12V Max: 83.3A / Min: 0A (100-120Vac); Max: 150A / Min: 0A (200-220Vac); Max: 165A / Min: 0A (220-230Vac); Max: 166.7A / Min: 0A (230-240Vac); Max: 166.7A / Min: 0A (200-240Vac) (UL/cUL only)
12Vsb Max: 2.1A / Min: 0A
Output Type 25 Pairs Gold Finger Connector
Certification Titanium Level
RoHS RoHS Compliant
Environmental Spec. Operating Temperature: 10°C ~ 35°C (50°F ~ 95°F); Non-operating Temperature: -40°C to 60°C (-40°F to 140°F); Operating Relative Humidity: 8% to 90% (non-condensing); Non-operating Relative Humidity: 5% to 95% (non-condensing)

Weight 149 lbs
Dimensions 29 × 17.2 × 7 in
