Docker is the world's leading software container platform. Developers use Docker to eliminate "works on my machine" problems when collaborating on code with co-workers. Operators use Docker to run and manage apps side-by-side in isolated containers to get better compute density. Enterprises use Docker to build agile software delivery pipelines to ship new features faster, more securely and with confidence for both Linux and Windows Server apps.

Docker containers wrap a piece of software in a complete filesystem that contains everything needed to run: code, runtime, system tools, system libraries - anything that can be installed on a server. This guarantees that the software will always run the same, regardless of its environment.

What is a Container?

Using containers, everything required to make a piece of software run is packaged into an isolated container. Containers wrap applications in an isolated virtual environment to simplify data center deployment. Unlike hypervisor-based virtual machines (VMs), containers do not bundle a full operating system: only the libraries and settings required to make the software work are included. This makes for efficient, lightweight, self-contained systems and guarantees that software will always run the same, regardless of where it's deployed.

Containers vs. Virtual Machines

VMs allow multiple copies of the operating system, or even multiple different operating systems, to share a machine. Each VM can host and run multiple applications. VM hypervisors, such as Hyper-V, KVM, and Xen, are essentially based on emulating virtual hardware, which translates into high system requirements. By comparison, a container is designed to virtualize a single application, and all containers deployed on a host share a single OS instance. Typically, containers spin up faster, run the application with bare-metal performance, and are simpler to manage since there is no additional overhead in making an OS kernel call.
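The kernel-sharing difference is easy to see in practice. A minimal sketch, assuming Docker is installed and the public alpine image can be pulled:

```shell
# A container reuses the host kernel rather than booting its own OS,
# so the kernel version inside the container matches the host's.
uname -r                          # kernel version on the host
docker run --rm alpine uname -r   # same version, reported from inside a container

# Because a container is just an isolated process, startup is near-instant:
time docker run --rm alpine true
```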

Docker Important Features

Docker provides both hardware and software encapsulation by allowing multiple containers to run on the same system at the same time, each with its own set of resources (CPU, memory, etc.) and its own dedicated set of dependencies (library versions, environment variables, etc.). Docker also provides portable Linux deployment: Docker containers can run on any Linux system with kernel 3.10 or later. All major Linux distros have supported Docker since 2014. Encapsulation and portable deployment are valuable both to the developers creating and testing applications and to the operations staff who run them in data centers.

  • Docker's powerful command-line tool, `docker build`, creates Docker images from source code and binaries, using the description provided in a "Dockerfile".
  • Docker's component architecture allows one container image to be used as a base for other containers.
  • Docker provides automatic versioning and labeling of containers, with optimized assembly and deployment. Docker images are assembled from versioned layers so that only the layers missing on a server need to be downloaded.
  • Docker Hub is a service that makes it easy to share Docker images publicly or privately.
  • Containers can be constrained to a limited set of resources on a system (e.g., one CPU core and 1 GB of memory).
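To make the features above concrete, here is a minimal sketch: a hypothetical Dockerfile (the image name myorg/myapp is illustrative) built with `docker build`, then run under resource constraints:

```shell
# Each Dockerfile instruction produces a cached, versioned layer, and the
# FROM line reuses an existing image as the base for this one.
cat > Dockerfile <<'EOF'
FROM python:3.11-slim
COPY app.py /app/app.py
CMD ["python", "/app/app.py"]
EOF

# Assemble the image from the Dockerfile in the current directory:
docker build -t myorg/myapp:1.0 .

# Constrain the container to one CPU core and 1 GB of memory:
docker run --cpus=1 --memory=1g myorg/myapp:1.0
```

Because only changed layers are rebuilt and transferred, editing app.py and rebuilding reuses the cached base layer rather than downloading it again.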

Docker automates the repetitive tasks of setting up and configuring development environments so that developers can focus on what matters: building great software. Developers using Docker don't have to install and configure complex databases nor worry about switching between incompatible language toolchain versions.
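For example, rather than installing a database on every developer machine, the database can run in a disposable container (a sketch using the official postgres image and its default port):

```shell
# Start a throwaway PostgreSQL instance for local development:
docker run -d --name dev-db -p 5432:5432 -e POSTGRES_PASSWORD=devpass postgres

# Remove it when finished; the host machine is left untouched:
docker rm -f dev-db
```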

Any App, Language, or Stack
Build, test, debug and deploy Linux and Windows Server container apps written in any programming language without risk of incompatibilities or version conflicts.

Awesome Developer Experience
Reduce onboarding time by 65%: Quickly build, test and run complex multi-container apps and stop wasting time installing and maintaining software on servers and developer machines. All dependencies run in containers, eliminating "works on my machine" problems.

Built-In Container Orchestration
Docker comes with built-in swarm clustering that's easy to configure. Test and debug apps in environments that mimic production with minimal setup.
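A minimal sketch of that built-in orchestration, assuming a Docker release with swarm mode (the service name web is illustrative):

```shell
# Turn this host into a single-node swarm, then run a replicated service:
docker swarm init
docker service create --name web --replicas 3 -p 80:80 nginx

# Show desired vs. running replica counts:
docker service ls
```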

Docker is the secret weapon of developers and IT ops teams everywhere, allowing them to build, ship, test, and deploy apps automatically, securely, and portably with no surprises. No more wikis, READMEs, long runbook documents and post-it notes with stale information.

Ship 13x More
Docker users ship software on average 13 times more frequently. Teams using Docker push software updates quickly and get fixes and new features to customers faster.

Quickly Scale
Built in orchestration scales to thousands of nodes and containers. Docker containers spin up and down in seconds, making it easy to scale application services to satisfy peak customer demand, and back down when demand ebbs.
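Scaling out for peak demand and back in again is a single command in swarm mode. A sketch, assuming a replicated service named web already exists:

```shell
docker service scale web=10   # scale out for peak demand
docker service scale web=2    # scale back in as demand ebbs
```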

Improve IT Efficiency
Save up to 10x in personnel hours spent on app maintenance and support. Docker makes it easy to deploy apps, identify and resolve issues, and reduce overall IT operational costs. Reduce downtime when deploying updates, or quickly roll back with minimal disruption.
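Rolling updates and rollbacks can be sketched with swarm-mode service commands (service and image names are illustrative):

```shell
# Push out a new image version across the service's replicas:
docker service update --image myorg/myapp:2.0 web

# If problems appear, return to the previously deployed version:
docker service update --rollback web
```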

Distribute & Share Content
Build, manage, and distribute Docker images in secure Docker Registries located on-premises or in the cloud. Image updates, configuration changes, and build history are automatically synchronized across the organization.
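Distribution works the same way against any registry. A sketch, where registry.example.com stands in for a private on-premises registry and myorg/myapp is an illustrative image name:

```shell
# Tag the image for the private registry, then push it:
docker tag myorg/myapp:1.0 registry.example.com/myorg/myapp:1.0
docker push registry.example.com/myorg/myapp:1.0

# Colleagues pull the exact same image, layer for layer:
docker pull registry.example.com/myorg/myapp:1.0
```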

Simply Share Applications
Docker guarantees that apps will work the same everywhere. With Docker images, the entire stack and configuration are part of the image, and there's no need to configure host environments beyond installing Docker.

Guarantee App Security
Securely collaborate on apps with authorized users and protect code as it moves to production. Docker Content Trust and built-in security ensures that the right code is available to the right people at the right time.
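Docker Content Trust is enabled with an environment variable; once set, image operations require valid signatures (the image name is illustrative):

```shell
# Refuse to pull or run any image tag that lacks a valid signature:
export DOCKER_CONTENT_TRUST=1
docker pull myorg/myapp:1.0   # fails unless the tag is signed
```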

Docker is at the heart of the modern app platform, bridging developer and IT, Linux and Windows. Docker works in the cloud just as well as on-premise; and supports both traditional and microservices architectures. Docker sets enterprises on the path to digital transformation by enabling all apps to be agile, cloud-ready and secure at optimal costs.

One Platform for All Apps
Docker provides a unified framework for all apps - monolith or microservices, Linux or Windows, on-premises or cloud - a standard container and workflow for secure, agile and portable apps.

Innovate Faster at Scale
Docker containers accelerate delivery of new apps with microservices architecture by automating deployment pipelines. New features can be released (and rolled back in case of problems) frequently to quickly address customer needs.

Break Down Silos
Open interfaces, APIs, and plugins make it easy to integrate Docker into an existing environment and to extend Docker to different systems. A common interface allows dev and ops to work together without conflict or disruption.


Docker containers are both platform-agnostic and hardware-agnostic. This presents a problem when using specialized hardware such as NVIDIA GPUs, which require kernel modules and user-level libraries to operate. As a result, Docker does not natively support NVIDIA GPUs within containers. The early workaround was to fully install the NVIDIA driver inside the container and map in the character devices corresponding to the NVIDIA GPUs (e.g. /dev/nvidia0). This method was not sufficient because the driver installed in the container must exactly match the host driver version. That requirement substantially reduced the portability of these containers, undermining one of Docker's most important features.

NVIDIA Docker was developed to enable portability in Docker images that leverage NVIDIA GPUs. NVIDIA Docker is an open-source project hosted on GitHub that provides the two critical components needed for portable GPU-based containers:

  • driver-agnostic CUDA images
  • a Docker command-line wrapper that mounts the user-mode components of the driver and the GPU character devices into the container at launch

nvidia-docker is essentially a wrapper around the docker command that transparently provisions a container with the components needed to execute code on the GPU. Strictly speaking, it is only required when using `nvidia-docker run` to execute a container that uses GPUs.
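A typical sanity check from the NVIDIA Docker project, assuming the NVIDIA driver and nvidia-docker are installed on the host:

```shell
# nvidia-docker mounts the host's driver libraries and GPU character
# devices into the container at launch:
nvidia-docker run --rm nvidia/cuda nvidia-smi
```

If the setup is correct, nvidia-smi reports the host's GPUs from inside the container, even though the image itself contains no driver.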

Here are some of the benefits of GPU containerization:

  • Reproducible builds
  • Ease of deployment
  • Isolation of individual devices
  • Ability to run across heterogeneous driver/toolkit environments
  • No host dependencies beyond the NVIDIA driver
  • Support for “fire and forget” GPU applications
  • Easier collaboration

Exxact Deep Learning GPU Solutions

Our deep learning GPU solutions are powered by the leading hardware, software, and systems engineering. Each system comes with our pre-installed deep learning software stack and is fully turnkey, ready to run right out of the box.