The Tensor TS2-306052-DPN is a 2U rack-mountable deep learning NVIDIA GPU server supporting two IBM POWER8 with NVLink processors, up to 1 TB of DDR4 memory, and four Tesla P100 Pascal (SXM2) GPUs, with up to 80 GB/s of NVLink GPU-to-GPU interconnect bandwidth.
GPUs have delivered groundbreaking performance for deep learning research, with thousands of computational cores and up to 100x the application throughput of CPUs alone. Exxact has developed the Deep Learning DevBox, featuring NVIDIA GPU technology coupled with state-of-the-art NVLink GPU-to-GPU interconnect technology and a full pre-installed suite of leading deep learning software, so developers can get a jump-start on deep learning research with the best tools available.
- Supports 2x IBM POWER8 with NVLink processors
- 4x Tesla P100 SXM2 GPUs (10.6 TFLOPS single precision, 732 GB/s memory bandwidth, and 16 GB of memory per board)
- NVIDIA DIGITS™ 3.0 software providing powerful design, training, and visualization of deep neural networks for image classification
- Pre-installed standard Ubuntu 14 with Caffe, Torch, Theano, BIDMach, cuDNN, OpenCV, CUDA Toolkit, and MXNet
- Google TensorFlow software library
- Automatic software update tool included
- A turn-key server with up to 80 GB/s NVLink GPU-to-GPU interconnect
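Most of the pre-installed frameworks above are importable from Python under their conventional module names (e.g. `cv2` for OpenCV). As a quick sanity check on a freshly provisioned system, a short script along these lines can report which frameworks are present — a sketch only; the module names used here are the usual community import names, not Exxact-specific, and BIDMach (JVM-based) and the CUDA Toolkit are not Python modules, so they are not checked:

```python
import importlib.util

# Conventional Python module names for the pre-installed frameworks
# (assumed, not Exxact-specific): Caffe, Theano, OpenCV, MXNet, TensorFlow.
FRAMEWORKS = ["caffe", "theano", "cv2", "mxnet", "tensorflow"]

def available(module_name):
    """Return True if the named module can be located on this machine."""
    return importlib.util.find_spec(module_name) is not None

if __name__ == "__main__":
    for name in FRAMEWORKS:
        status = "found" if available(name) else "missing"
        print(f"{name}: {status}")
```

On the DevBox image described above, all five names should report as found; on any other machine the script simply shows which frameworks are missing.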
Docker Images Available for Exxact Solutions
Docker images available for the following software:
- NVIDIA Caffe
- Integrated onboard
- 2x PCI-E 3.0 x16 slots
- 3x PCI-E 3.0 x8 slots
- 4x RJ45 10GBASE-T Ethernet LAN Ports
- 1x RJ45 IPMI LAN Port