MLX-MHQH29-XTC-00
MPN: MHQH29-XTC

Mellanox MHQH29-XTC ConnectX IB InfiniBand Host Bus Adapter


Highlights
Media Type Supported: Optical Fiber
Network Technology: InfiniBand QDR (40Gb/s)
Product Type: 40Gb/s InfiniBand Host Bus Adapter
Condition: New

ConnectX adapter cards provide the highest performing and most flexible interconnect solution for Enterprise Data Centers, High-Performance Computing, and Embedded environments. Clustered databases, parallelized applications, transactional services, and high-performance embedded I/O applications will achieve significant performance improvements, resulting in reduced completion time and lower cost per operation.

ConnectX-based adapter cards simplify network deployment by consolidating cables and enhancing performance in virtualized server environments.

World-Class Performance Over InfiniBand

ConnectX delivers low latency and high bandwidth for performance-driven server and storage clustering applications. These applications benefit from the reliable transport connections and advanced multicast support offered by ConnectX. Network protocol processing and data movement overhead, such as InfiniBand RDMA and Send/Receive semantics, are handled in the adapter without CPU intervention. Servers supporting PCI Express 2.0 at 5GT/s can take full advantage of 40Gb/s InfiniBand, balancing the I/O requirements of these high-end servers.
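
To make the RDMA claim concrete, here is a minimal sketch using the OpenFabrics libibverbs API, which is how applications typically reach the adapter's transport engine on Linux. It simply opens the first HCA found and queries port 1; the device index, port number, and build command are assumptions rather than Mellanox-specific details.

/* Minimal libibverbs sketch: open an InfiniBand device and query port 1.
 * Build with: gcc query_hca.c -o query_hca -libverbs
 * Assumes the OFED/rdma-core stack is installed and at least one HCA is present. */
#include <stdio.h>
#include <stdlib.h>
#include <infiniband/verbs.h>

int main(void)
{
    int num_devices = 0;
    struct ibv_device **dev_list = ibv_get_device_list(&num_devices);
    if (!dev_list || num_devices == 0) {
        fprintf(stderr, "no RDMA devices found\n");
        return EXIT_FAILURE;
    }

    /* Open the first device (assumption: the ConnectX HCA is device 0). */
    struct ibv_context *ctx = ibv_open_device(dev_list[0]);
    if (!ctx) {
        fprintf(stderr, "failed to open %s\n", ibv_get_device_name(dev_list[0]));
        return EXIT_FAILURE;
    }

    struct ibv_port_attr port;
    if (ibv_query_port(ctx, 1, &port) == 0) {
        printf("device %s, port 1: state=%d, active_mtu=%d\n",
               ibv_get_device_name(dev_list[0]), port.state, port.active_mtu);
    }

    ibv_close_device(ctx);
    ibv_free_device_list(dev_list);
    return EXIT_SUCCESS;
}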

TCP/UDP/IP Acceleration

Applications utilizing TCP/UDP/IP transport can achieve industry-leading throughput over InfiniBand. The hardware-based stateless offload engines in ConnectX reduce the CPU overhead of IP packet transport, freeing more processor cycles to work on the application.
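
On Linux these stateless offloads (checksum, segmentation, and similar) are visible through the standard ethtool interface, including for IPoIB interfaces. The sketch below is a hedged illustration that queries a few offload flags; the interface name ib0 is an assumption, and which offloads are exposed depends on driver and firmware.

/* Query stateless offload flags on an IPoIB interface via the ethtool ioctl.
 * The interface name "ib0" is an assumption; adjust it to your setup.
 * Build with: gcc query_offloads.c -o query_offloads */
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <net/if.h>
#include <linux/ethtool.h>
#include <linux/sockios.h>
#include <unistd.h>

static int query_flag(int fd, const char *ifname, __u32 cmd, const char *label)
{
    struct ethtool_value val = { .cmd = cmd };
    struct ifreq ifr;

    memset(&ifr, 0, sizeof(ifr));
    strncpy(ifr.ifr_name, ifname, IFNAMSIZ - 1);
    ifr.ifr_data = (char *)&val;

    if (ioctl(fd, SIOCETHTOOL, &ifr) < 0) {
        perror(label);
        return -1;
    }
    printf("%s: %s\n", label, val.data ? "on" : "off");
    return 0;
}

int main(void)
{
    const char *ifname = "ib0";              /* assumed IPoIB interface name */
    int fd = socket(AF_INET, SOCK_DGRAM, 0); /* any socket works for SIOCETHTOOL */
    if (fd < 0) {
        perror("socket");
        return 1;
    }

    query_flag(fd, ifname, ETHTOOL_GRXCSUM, "rx-checksumming");
    query_flag(fd, ifname, ETHTOOL_GTXCSUM, "tx-checksumming");
    query_flag(fd, ifname, ETHTOOL_GTSO,    "tcp-segmentation-offload");

    close(fd);
    return 0;
}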

I/O Virtualization

ConnectX support for hardware-based I/O virtualization provides dedicated adapter resources and guaranteed isolation and protection for virtual machines (VMs) within the server. I/O virtualization with ConnectX gives data center managers better server utilization and LAN and SAN unification while reducing cost, power, and cabling complexity.
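
On recent Linux kernels, hardware I/O virtualization of this kind is commonly surfaced as SR-IOV virtual functions. The following sketch requests virtual functions through the generic sriov_numvfs sysfs attribute; the path, interface name, and VF count are assumptions, and the feature additionally requires firmware, driver, and BIOS support.

/* Hedged sketch: request SR-IOV virtual functions through the generic
 * sriov_numvfs sysfs attribute. The interface name "ib0" and the VF count
 * are assumptions; SR-IOV must be supported by firmware, driver, and BIOS. */
#include <stdio.h>

int main(void)
{
    const char *path = "/sys/class/net/ib0/device/sriov_numvfs"; /* assumed path */
    int num_vfs = 4;                                             /* assumed VF count */

    FILE *f = fopen(path, "w");
    if (!f) {
        perror("open sriov_numvfs");
        return 1;
    }
    if (fprintf(f, "%d\n", num_vfs) < 0) {
        perror("write sriov_numvfs");
        fclose(f);
        return 1;
    }
    fclose(f);
    printf("requested %d virtual functions via %s\n", num_vfs, path);
    return 0;
}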

Storage Accelerated

A consolidated compute and storage network achieves significant cost-performance advantages over multi-fabric networks. Standard block and file access protocols leveraging InfiniBand RDMA result in high-performance storage access. Fibre Channel frame encapsulation (FCoIB or FCoE) and hardware offloads enable simple connectivity to Fibre Channel SANs.

Software Support

All Mellanox adapter cards are compatible with TCP/IP and OpenFabrics-based RDMA protocols and software. They also work with InfiniBand and cluster management software available from OEMs and are supported by major operating system distributions.
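
As a small illustration of the OpenFabrics software stack mentioned above, the sketch below uses librdmacm, the OpenFabrics connection manager library, to resolve a peer address to an RDMA device. The peer address and port are placeholders, not values tied to this adapter.

/* Hedged sketch of the OpenFabrics connection manager (librdmacm):
 * resolve a remote address over the RDMA fabric. The address 192.0.2.1 and
 * port 7471 are placeholders, not real targets.
 * Build with: gcc resolve_addr.c -o resolve_addr -lrdmacm */
#include <stdio.h>
#include <stdlib.h>
#include <netdb.h>
#include <rdma/rdma_cma.h>

int main(void)
{
    struct rdma_event_channel *ec = rdma_create_event_channel();
    struct rdma_cm_id *id = NULL;
    struct addrinfo *res = NULL;
    struct rdma_cm_event *event = NULL;

    if (!ec || rdma_create_id(ec, &id, NULL, RDMA_PS_TCP)) {
        fprintf(stderr, "failed to create RDMA CM id\n");
        return EXIT_FAILURE;
    }

    /* Placeholder peer address; replace with a host reachable over the fabric. */
    if (getaddrinfo("192.0.2.1", "7471", NULL, &res)) {
        fprintf(stderr, "getaddrinfo failed\n");
        return EXIT_FAILURE;
    }

    /* Ask the CM to resolve the destination to a local RDMA device and route. */
    if (rdma_resolve_addr(id, NULL, res->ai_addr, 2000 /* ms timeout */)) {
        perror("rdma_resolve_addr");
        return EXIT_FAILURE;
    }

    if (rdma_get_cm_event(ec, &event) == 0) {
        printf("CM event: %s\n", rdma_event_str(event->event));
        rdma_ack_cm_event(event);
    }

    freeaddrinfo(res);
    rdma_destroy_id(id);
    rdma_destroy_event_channel(ec);
    return EXIT_SUCCESS;
}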