NetApp gives AI the FlexPod treatment


Teams with NVIDIA for on-prem brainbox reference architecture.

One of NetApp’s biggest successes is its FlexPod reference architectures for on-premises infrastructure comprising NetApp arrays and data fabric software, plus Cisco servers and networking kit.

Analyst firm IDC says the product accounts for a third of the converged systems market and over US$2 billion of annual revenue. FlexPods are so good at what they do that Microsoft will deploy them for its VMware-on-Azure service.

And now NetApp’s trying to re-use the FlexPod recipe for on-premises AI workloads.

This time around it’s teamed with NVIDIA, which makes AI-centric servers called “DGX” that pack a pair of Xeons and up to 16 Tesla V100 GPUs.

NetApp believes that users want to start testing and/or using AI, but are held back by on-premises infrastructure that’s not up to the job and a fear that building the right hardware stack will be complex and costly.

The new “ONTAP AI proven architecture” attempts to change that, by explaining how to build rigs based on NetApp’s new high-end AFF A800 array, Cisco networking and NVIDIA’s DGX servers.

As FlexPods are both proven and widely admired, NetApp has credibility in its attempt to extend its expertise into AI workloads.

But NetApp’s ANZ Director for Solutions Engineering Dhruv Dhumatkar said the “ONTAP AI” brand is “a little bit of marketing” rather than an indicator of changes that make the new offering especially potent for AI workloads.

Instead, the company believes the speed of the AFF A800 is just what DGX servers need if they’re to work at their best. Dhumatkar also advanced the same arguments NetApp uses for everything (ONTAP’s data fabric is very good and will make you agile) to explain why the new architectures matter.

It has a point this time around, as the company recognizes that the data to feed into AI models may well reside in a cloud. ONTAP can bridge on-prem and cloud data stores, which should make it easier to feed the DGX boxen with data.

NVIDIA, for its part, told us that NetApp is not its only partner in this field.

“We’re working with a growing ecosystem of vendors building GPU accelerated data center solutions for AI and deep learning,” the company said in a statement. “In addition to our work with NetApp, we teamed with Pure Storage on their DGX-based solution called AIRI.”

