Highlights:
  • Rack Height: 1U
  • Processors Supported: 2x Intel Xeon Scalable family
  • Drive Bays: 2x 2.5" Hot-Swap, 2x 2.5" Internal
  • Supports up to 4x NVIDIA Tesla P100 SXM2
Contact sales for pricing

The TensorEX TS1-672696-DPN is a 1U rack-mountable Deep Learning & AI server supporting 2x Intel Xeon Scalable Family processors, up to 1.5 TB of DDR4 memory, and four NVIDIA Tesla P100 (Pascal, SXM2) GPUs with up to 80 GB/s NVLink GPU-GPU interconnect.

GPUs deliver groundbreaking acceleration for deep learning research, with thousands of computational cores and up to 100x the application throughput of CPUs alone. Exxact has developed the Deep Learning DevBox, which pairs NVIDIA GPU technology and state-of-the-art NVLink GPU-GPU interconnect with a fully pre-installed suite of leading deep learning software, so developers can get a jump-start on deep learning research with the best tools available.
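
On the pre-installed software stack described below, a quick sanity check can confirm that all four P100 boards are visible and that peer-to-peer access over NVLink is enabled between them. The following is a minimal sketch, assuming a Python environment with PyTorch available (PyTorch is listed among the EMLI frameworks further down); exact versions and environment names vary by build.

```python
# Minimal sanity check (assumes PyTorch is installed, as listed under the
# EMLI frameworks): confirm the four Tesla P100 SXM2 GPUs are visible and
# that peer-to-peer access, which NVLink provides on SXM2 boards, is enabled.
import torch

def report_gpus():
    n = torch.cuda.device_count()
    print(f"CUDA devices visible: {n}")
    for i in range(n):
        props = torch.cuda.get_device_properties(i)
        print(f"  GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")

    # A False here usually means GPU-GPU traffic would fall back to
    # PCIe/host staging instead of the NVLink fabric.
    for i in range(n):
        for j in range(n):
            if i != j:
                ok = torch.cuda.can_device_access_peer(i, j)
                print(f"  peer access GPU {i} -> GPU {j}: {ok}")

if __name__ == "__main__":
    if torch.cuda.is_available():
        report_gpus()
    else:
        print("No CUDA device detected; check the NVIDIA driver installation.")
```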

Features:

  • Supports 2x Intel Xeon Scalable Family processors (LGA 3647)
  • 4x NVIDIA Tesla P100 SXM2 GPUs (10.6 TFLOPS single precision, 732 GB/s memory bandwidth, and 16 GB of memory per board)
  • NVIDIA DIGITS software providing powerful design, training, and visualization of deep neural networks for image classification
  • Pre-installed standard Ubuntu 18.04 w/ Deep Learning software stack
  • Google TensorFlow software library
  • Automatic software update tool included
  • A turn-key server with up to 80 GB/s NVLink GPU-GPU interconnect (see the example below)
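
Because the feature list above calls out the Google TensorFlow library together with the NVLink interconnect, here is a minimal multi-GPU training sketch. It assumes the pre-installed TensorFlow 2.x stack; the toy model, random data, and batch size are illustrative placeholders rather than part of the product configuration. tf.distribute.MirroredStrategy performs NCCL all-reduce between replicas on Linux, which is where the NVLink GPU-GPU interconnect pays off.

```python
# Minimal multi-GPU sketch using tf.distribute.MirroredStrategy (assumes the
# pre-installed TensorFlow 2.x stack). The model and data below are toy
# placeholders; a real workload would use tf.data input pipelines.
import numpy as np
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", [g.name for g in gpus])

strategy = tf.distribute.MirroredStrategy()  # replicates the model across all GPUs
print("Number of replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1024,)),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# Toy data, sized so each replica receives a slice of every batch.
x = np.random.rand(4096, 1024).astype("float32")
y = np.random.randint(0, 10, size=(4096,))
model.fit(x, y, batch_size=256, epochs=1)
```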

EMLI (Exxact Machine Learning Images)


Conda EMLI: Separated Frameworks
For developers who want pre-installed deep learning frameworks and their dependencies in separate Python environments installed natively on the system.

Container EMLI: Flexible. Reconfigurable.
For developers who want pre-installed frameworks utilizing the latest NGC containers, GPU drivers, and libraries in ready-to-deploy DL environments with the flexibility of containerization.

DIY EMLI: Simple. Clean. Custom.
For experienced developers who want a minimalist install to set up their own private deep learning repositories or custom builds of deep learning frameworks.

Frameworks*: TensorFlow v1, TensorFlow v2, PyTorch, MXNet, Caffe, Caffe2, Chainer, Microsoft Cognitive Toolkit

Libraries*: NVIDIA cuDNN, NVIDIA RAPIDS, Keras, Theano, OpenCV

Software Environments: NVIDIA CUDA Toolkit, NVIDIA CUDA Dev Toolkit, NVIDIA DIGITS, Anaconda

Container Management: Docker

Drivers: NVIDIA-qualified driver

Orchestration: MicroK8s

*Additional NGC (NVIDIA GPU Cloud) containers can be added upon request.
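
For the Container EMLI, the note above says additional NGC (NVIDIA GPU Cloud) containers can be added upon request; pulling one manually follows the usual Docker-plus-NGC workflow. The sketch below is illustrative only: it assumes Docker with NVIDIA GPU support is configured (Docker is listed under Container Management), and the image tag shown is an example placeholder rather than a specific recommendation.

```python
# Illustrative only: pull an NGC framework container and run nvidia-smi
# inside it to confirm the GPUs are exposed to Docker. The image tag is an
# example placeholder; substitute the NGC release you actually want.
import subprocess

IMAGE = "nvcr.io/nvidia/tensorflow:23.03-tf2-py3"  # example tag, not a recommendation

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Pull the container image from the NGC registry.
run(["docker", "pull", IMAGE])

# Launch it with all GPUs attached and verify they are visible inside.
run(["docker", "run", "--rm", "--gpus", "all", IMAGE, "nvidia-smi"])
```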

Processor & Chipset
  • Number of Processors Supported: 2
  • Processor Socket: LGA 3647
  • Processor Type: Xeon
  • Processors Supported: Bronze 31XX/32XX, Silver 41XX/42XX, Gold 51XX/52XX/61XX/62XX, Platinum 81XX/82XX
  • Thermal Design Power (TDP): N/A
  • UPI Links: 3

GPU
  • 4x NVIDIA Tesla P100 SXM2 GPUs

Memory
  • Maximum Memory: 1.5 TB
  • Memory Technology: DDR4 SDRAM, DDR4 NVDIMM (Intel Optane DCPMM)
  • Memory Standard: DDR4-2933 (PC4-23400)
  • Number of Total Memory Slots: 12

Controllers
  • Controller Type: SATA3

Display & Graphics
  • Graphics Controller: ASPEED AST2500

Network & Communication
  • Ethernet Technology: 10GBASE-T

I/O Expansions
  • PCI Express: 2x PCI-E 3.0 x16 slots, 2x PCI-E 3.0 x16 slots (low-profile)

Drive Bays
  • Hot-swap: 4x 2.5" drive bays (supports 2x NVMe drives)

Interfaces/Ports
  • USB Ports: 2x USB 3.0 (rear)
  • SATA Interfaces: N/A
  • NVMe M.2 Slots: 2
  • LAN: 2x RJ45 10GBASE-T LAN ports, 1x RJ45 IPMI LAN port
  • Onboard Video: 1x VGA connector

Power Description
  • Number of Power Supplies: 2
  • Maximum Power Supply Wattage: 2000 W
  • Certification: 80 PLUS Titanium

Physical Characteristics
  • Color: Black
  • Form Factor: Rack-mountable
  • Rack Height: 1U
  • Height: 1.7"
  • Width: 17.2"
  • Depth: 35.2"