ABOUT
The Tensor TS1-1595769-DPN is a 1U rack-mountable Deep Learning NVIDIA GPU server supporting 2x Intel Xeon E5-2600 v3/v4 processors, a maximum of 1 TB DDR4 memory, and four Tesla V100 Volta GPUs (SXM2), with up to 150 GB/s NVLINK 2.0 GPU-GPU interconnect.
GPUs deliver groundbreaking performance for deep learning research, with thousands of computational cores and up to 100x the application throughput of CPUs alone. Exxact has developed the Deep Learning DevBox, combining NVIDIA GPU technology, state-of-the-art 150 GB/s NVLINK 2.0 GPU-GPU interconnect, and a full pre-installed suite of the leading deep learning software, so developers can get a jump-start on deep learning research with the best tools that money can buy.
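For a quick check of the GPU-to-GPU interconnect on a system like this, the topology matrix reported by NVIDIA's nvidia-smi utility shows which GPU pairs are linked over NVLink. The snippet below is a minimal sketch, assuming only that the NVIDIA driver (and therefore nvidia-smi) is installed:

    # Minimal sketch: print the GPU interconnect topology matrix.
    # Assumes the standard nvidia-smi utility from the NVIDIA driver is on the PATH;
    # NVLink-connected GPU pairs are reported as NV1/NV2/... entries in the matrix.
    import subprocess

    print(subprocess.check_output(["nvidia-smi", "topo", "-m"]).decode())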
Features:
- Supports 2x Intel Xeon E5-2600 v3/v4 processors (Socket R3)
- 4x Tesla V100 SXM2 32 GB GPUs (15.7 TFLOPS of single precision, 900 GB/s of memory bandwidth, and 32 GB of HBM2 memory per board) or Tesla V100 SXM2 16 GB GPUs
- NVIDIA DIGITS software providing powerful design, training, and visualization of deep neural networks for image classification
- Pre-installed standard Ubuntu 16.04 with a Deep Learning software stack
- Google TensorFlow software library (a quick GPU sanity check is sketched after this list)
- Automatic software update tool included
- A turn-key server with up to 150 GB/s NVLINK 2.0 GPU-GPU interconnect
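Since TensorFlow ships pre-installed, a short device-listing script is an easy way to confirm that all four V100 boards are visible to the framework. The sketch below assumes a TensorFlow 1.x-era installation (consistent with the Ubuntu 16.04 image described above) and is only an illustrative check, not part of the bundled tooling:

    # Minimal sketch (assumes TensorFlow 1.x): list visible GPUs and run a
    # small matrix multiply on the first one as a smoke test.
    import tensorflow as tf
    from tensorflow.python.client import device_lib

    # Each SXM2 V100 should appear as a separate "/device:GPU:N" entry.
    gpus = [d for d in device_lib.list_local_devices() if d.device_type == "GPU"]
    print("Visible GPUs: %d" % len(gpus))
    for d in gpus:
        print(d.name, d.physical_device_desc)

    # Pin a 1024x1024 matrix multiply to the first GPU; log_device_placement
    # makes the GPU assignment visible in the session output.
    with tf.device("/gpu:0"):
        c = tf.matmul(tf.random_normal([1024, 1024]), tf.random_normal([1024, 1024]))

    with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
        sess.run(c)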
SPECIFICATIONS
- CPU: Intel Xeon E5-2600 v3 / E5-2600 v4
- Expansion slots: 3x PCI-E 3.0 x16, 1x PCI-E 3.0 x8 (low-profile)
- Networking: 2x RJ45 Gigabit Ethernet LAN ports, 1x RJ45 dedicated IPMI port