Graphcore IPUs


Graphcore’s state-of-the-art IPU cloud and server solutions have been built to support new breakthroughs in machine intelligence.

The Intelligence Processing Unit is a highly flexible, easy-to-use, parallel processor designed from the ground up for machine intelligence workloads.

Graphcore IPUs deliver the best compute performance per dollar compared to GPUs, and excel at large datasets. Each Graphcore Bow system is built from clusters of 4x Bow-2000 units (1RU) paired with a single server (1RU) for host controller functions. XENON supplies suitable host servers, from a range of manufacturers as well as XENON’s own, with the Graphcore clusters.

model no. Bow-2000 «NEW»
description

Graphcore’s Bow-2000 IPU-Machine is designed to support scale-up and scale-out machine intelligence compute.

key features
  • Each 1U blade features 4 Bow IPU processors
  • 1.4 petaFLOPS FP16.16 AI compute
  • 3.6GB In-Processor-Memory™ and up to 256GB Streaming Memory™
  • IPU-Link™ – 512Gbps for intra Bow Pod communication

Learn more
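The Pod configurations that follow scale linearly from the Bow-2000's per-machine figures (1.4 petaFLOPS FP16.16 and 3.6GB In-Processor-Memory). A quick sanity check of that arithmetic, as a sketch (constant and function names are illustrative, not Graphcore APIs):

```python
# Per-machine figures for one Bow-2000 (from the spec bullets above).
BOW_2000_PFLOPS_FP16 = 1.4   # petaFLOPS FP16.16 AI compute
BOW_2000_MEMORY_GB = 3.6     # GB In-Processor-Memory

def pod_specs(num_machines: int) -> dict:
    """Aggregate compute and memory for a Pod built from Bow-2000 machines."""
    return {
        "pflops_fp16": round(num_machines * BOW_2000_PFLOPS_FP16, 1),
        "memory_gb": round(num_machines * BOW_2000_MEMORY_GB, 1),
    }

# Pod16 = 4 machines, Pod64 = 16 machines, Pod256 = 64 machines
print(pod_specs(4))   # {'pflops_fp16': 5.6, 'memory_gb': 14.4}
print(pod_specs(16))  # {'pflops_fp16': 22.4, 'memory_gb': 57.6}
print(pod_specs(64))  # {'pflops_fp16': 89.6, 'memory_gb': 230.4}
```

These results match the Pod16, Pod64, and Pod256 figures listed below.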

model no. Bow Pod16 «NEW»
description

Bow Pod16 is your easy-to-use starting point for building better, more innovative AI solutions with IPUs.

key features
  • Features 4 Bow-2000 machines, each containing 4 of our pioneering new Bow IPU processors
  • Delivers up to 5.6 petaFLOPS of AI compute
  • 5.6 petaFLOPS FP16.16, 1.4 petaFLOPS FP32
  • 14.4GB In-Processor-Memory™ and up to 1TB Streaming Memory™

Learn more

model no. Bow Pod64 «NEW»
description

Bow Pod64 is a powerful, flexible building block for the enterprise datacenter, private or public cloud.

key features
  • Features 16 Bow-2000 machines, each containing 4 of our pioneering Bow IPU processors
  • Delivers up to 22.4 petaFLOPS of AI compute
  • 22.4 petaFLOPS FP16.16, 5.6 petaFLOPS FP32
  • 57.6GB In-Processor-Memory™ and up to 4.1TB Streaming Memory™

Learn more

model no. Bow Pod256 «NEW»
description

Bow Pod256 is a system designed for production deployment in your enterprise datacenter, private or public cloud.

key features
  • Features 64 Bow-2000 machines, each containing 4 of our pioneering Bow IPU processors
  • Delivers up to 89.6 petaFLOPS of AI compute
  • 89.6 petaFLOPS FP16.16, 22.4 petaFLOPS FP32
  • 230.4GB In-Processor-Memory™ and up to 16.3TB Streaming Memory™

Learn more