Graphcore

Graphcore’s state-of-the-art IPU cloud and server solutions are built to support new breakthroughs in machine intelligence.

The Intelligence Processing Unit (IPU) is a highly flexible, easy-to-use parallel processor designed from the ground up for machine intelligence workloads.

IPU-POD16 «NEW»

Ideal for exploration, the IPU-POD16 gives you the power, performance and flexibility you need to fast-track your IPU prototypes and move quickly from pilot to production.

Key features
  • Four IPU-M2000s
  • Delivers 4 petaFLOPS of AI compute
  • Includes 14.4GB In-Processor Memory and 1TB Streaming Memory
  • Can be expanded later into a larger IPU-POD system


IPU-POD64 «NEW»

IPU-POD™ systems are designed to accelerate large and demanding machine learning models with flexible, efficient scale-out.

Key features
  • Features 16 IPU-M2000™ compute blades, based on the innovative GC200 Intelligence Processing Unit (IPU)
  • Can deliver up to 16 petaFLOPS of AI compute
  • 94,208 IPU cores
  • Up to 4.15TB memory (57.6GB In-Processor Memory and 4.1TB Streaming Memory)


IPU-POD128 «NEW»

IPU-POD128 is the first system to utilise the new IPU-Gateway Links, the horizontal, rack-to-rack connection that extends IPU connectivity across multiple PODs.

Key features
  • Features 32 IPU-M2000™ compute blades, based on the innovative GC200 Intelligence Processing Unit (IPU)
  • Can deliver up to 32 petaFLOPS of AI compute
  • Up to 8.3TB memory (115.2GB In-Processor Memory and 8.2TB Streaming Memory)
  • Supports multiple server configurations


IPU-POD256 «NEW»

IPU-POD256 supports standard frameworks and protocols to enable smooth integration into existing data centre environments.

Key features
  • Features 64 IPU-M2000™ compute blades, based on the innovative GC200 Intelligence Processing Unit (IPU)
  • Can deliver up to 64 petaFLOPS of AI compute
  • Up to 16.6TB memory (230.4GB In-Processor Memory and 16.4TB Streaming Memory)
  • Supports multiple server configurations

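
The headline figures above scale linearly with the number of IPU-M2000 blades: each blade contributes 1 petaFLOPS of AI compute, four GC200 IPUs of 1,472 cores each, 3.6GB of In-Processor Memory and 256GB of Streaming Memory. The arithmetic can be sketched as follows (the per-blade constants are derived from the system totals listed above, and the function name is illustrative):

```python
# Per-blade figures, derived from the IPU-POD system totals listed above
# (assumption: specs scale linearly with IPU-M2000 blade count).
CORES_PER_GC200 = 1472          # cores per GC200 IPU
IPUS_PER_M2000 = 4              # GC200 IPUs per IPU-M2000 blade
PFLOPS_PER_M2000 = 1            # petaFLOPS of AI compute per blade
IN_PROC_GB_PER_M2000 = 3.6      # In-Processor Memory per blade (GB)
STREAMING_GB_PER_M2000 = 256    # Streaming Memory per blade (GB)

def pod_specs(blades: int) -> dict:
    """Scale the per-blade figures up to an IPU-POD system."""
    return {
        "petaflops": blades * PFLOPS_PER_M2000,
        "ipu_cores": blades * IPUS_PER_M2000 * CORES_PER_GC200,
        "in_processor_gb": round(blades * IN_PROC_GB_PER_M2000, 1),
        "streaming_gb": blades * STREAMING_GB_PER_M2000,
    }

# Example: IPU-POD64 has 16 blades -> 16 petaFLOPS, 94,208 IPU cores,
# 57.6GB In-Processor Memory and 4,096GB (~4.1TB) Streaming Memory.
print(pod_specs(16))
```

The same function reproduces the IPU-POD16, IPU-POD128 and IPU-POD256 figures with `pod_specs(4)`, `pod_specs(32)` and `pod_specs(64)`.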