NVIDIA® DGX POD
NVIDIA DGX POD™ offers a proven design approach for building your GPU-accelerated AI data center with NVIDIA DGX, leveraging NVIDIA’s best practices and insights gained from real-world deployments.
The DGX POD™ is an optimised data centre rack containing up to nine DGX servers, twelve storage servers, and three networking switches to support single- and multi-node AI model training and inference using NVIDIA AI software.
The DGX POD™ is also designed to be compatible with leading storage and networking technology providers. XENON offers a portfolio of NVIDIA® DGX POD™ reference architecture solutions built on NetApp, IBM Spectrum, DDN and Pure Storage. All incorporate the best of NVIDIA® DGX POD™ and are delivered as fully integrated, ready-to-deploy solutions to make your data centre AI deployments simpler and faster.
Proven reference architectures are available with storage from NetApp, IBM, DDN or Pure Storage. A reference configuration can include the following (a rough capacity sketch follows the list):
- Nine DGX servers
- Twelve storage servers
- 10 GbE (minimum) storage and management switch
- Mellanox 100 Gbps intra-rack high-speed network switches
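
As a rough illustration only, the sketch below tallies the rack-level capacity this reference configuration implies. The GPUs-per-server figure is an assumption (DGX-1 class nodes with eight GPUs each), not part of the reference specification; the actual count depends on the DGX model and switch configuration specified for your deployment.

```python
# Illustrative capacity sketch for one DGX POD rack, based on the
# reference configuration listed above.
DGX_SERVERS = 9
GPUS_PER_SERVER = 8          # assumption: DGX-1 class nodes; adjust per DGX model
STORAGE_SERVERS = 12
INTRA_RACK_LINK_GBPS = 100   # Mellanox intra-rack fabric, per link

total_gpus = DGX_SERVERS * GPUS_PER_SERVER

print(f"DGX servers:       {DGX_SERVERS}")
print(f"GPUs per rack:     {total_gpus} (assuming {GPUS_PER_SERVER} per server)")
print(f"Storage servers:   {STORAGE_SERVERS}")
print(f"Intra-rack fabric: {INTRA_RACK_LINK_GBPS} Gbps per link")
```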
Contact XENON for more information on a DGX POD™ to suit your needs.