August 07, 2018 | Technical | NextGen Data Centre | Author: Ruairi McBride

Gain Some IQ on AI

On 1st August 2018, NetApp announced a new partnership with NVIDIA and launched the NetApp ONTAP AI proven architecture.

This strengthens NetApp's already growing foothold in this new and exciting branch of the IT industry, and after what has been announced, ONTAP AI is surely going to have everyone talking.

This meet-in-the-channel play gives data scientists a proven architecture to use in their data pipeline for deep learning, avoiding design guesswork and allowing fast, efficient deployments of AI environments.


Machine learning (ML) and artificial intelligence (AI) place some unique demands on IT infrastructure.

Firstly, they both demand huge amounts of information, a capacity requirement that is constantly growing.

Second, they require that storage respond with ultra-low latency. Unlike big data, you need to keep all the data generated rather than burn the hay to find the needle, so expandability over time is a must.

And finally, the type of computation they undertake is better suited to a GPU than a CPU.




Now, whether you would class this as a modernisation of your infrastructure or a next-generation Data Centre play, one thing is certain: this is cutting-edge equipment.

For example, one NVIDIA DGX-1 can replace the equivalent of 400 traditional servers.

If you look at Gartner's top 10 picks for 2018 and beyond, the majority have an aspect of AI/ML to them, so it's only natural that we are seeing IT vendors moving into this space.

NetApp is announcing the ability to combine an AFF A800 (its flagship all-flash array) with five NVIDIA DGX-1 servers with Tesla V100s, tied together over 100GbE with a pair of Cisco Nexus 3232C switches, which equates to 5,000 TFLOPS.
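That 5,000 TFLOPS figure is easy to sanity-check as a back-of-envelope sketch, assuming NVIDIA's quoted numbers of 8 Tesla V100 GPUs per DGX-1 and roughly 125 TFLOPS of mixed-precision Tensor Core performance per GPU:

```python
# Rough sanity check of the quoted aggregate TFLOPS figure.
# Assumptions (NVIDIA's published specs, not from the announcement itself):
#   - 8 Tesla V100 GPUs per DGX-1
#   - ~125 TFLOPS peak mixed-precision (Tensor Core) per V100
GPUS_PER_DGX1 = 8
TFLOPS_PER_V100 = 125

def total_tflops(num_dgx1: int) -> int:
    """Aggregate peak deep-learning TFLOPS across num_dgx1 servers."""
    return num_dgx1 * GPUS_PER_DGX1 * TFLOPS_PER_V100

print(total_tflops(5))   # the launch configuration: 5 x 8 x 125 = 5000 TFLOPS
print(total_tflops(60))  # a full 60-DGX-1 build-out: 60000 TFLOPS (60 PFLOPS)
```

The same arithmetic scales linearly to the larger configurations discussed below.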


NetApp NVIDIA Architecture


Whilst the messaging around this offering highlights it as a future-proof play, you don't need to buy everything in one go; instead, you can build on NetApp's key messages of flexibility and scaling.

But if you were to plan ahead, or really did need to start big, there is no reason you could not have twelve high-availability pairs with sixty DGX-1 servers and close to 75PB of capacity.

There is also no reason you couldn't implement a data pipeline with an A700s, or even an A300 or A220; it all depends on the performance and scalability you require.

Tie this together with edge devices running ONTAP Select for data ingest, then add Cloud Volumes ONTAP in AWS or Azure, or possibly FabricPool as an archival tier, and you can truly see why integrating the Data Fabric into this story is such a nice fit.

Just imagine adding MAX Data into the mix: it would be like strapping two F9 first-stage boosters onto this already Full Thrust rocket.

Now you may be thinking this is a niche supercomputer corner case, but in reality it is being utilised in pretty much every industry vertical, affecting almost every aspect of our daily lives.

Industries from finance to health, automotive, retail, agriculture, oil & gas and even legal, to name a few, are already seeing a surge in software and companies dedicated to AI as a way of doing business.

We have the horror stories of Facebook, and no doubt you have invested in one of the big three home-automation voice assistants featuring Alexa, Siri or Google Assistant.

Maybe you have travelled using Uber or Tesla’s autopilot or even Waze on your phone.

Maybe you have a hobby like flying drones from DJI or utilise 3DR’s software, or you can’t work out without your Fitbit or Fenix.

The point is, you are providing data back to some central point where it is analysed to help the company make better decisions about what to bring to market as a next-generation product, or where to improve something already in the field.

Whilst the Luddites worry that AI will lead to Skynet and the doom of humanity, it is probably better to think of it as an advancement in human intelligence and another milestone along the path of evolution, and I look forward to seeing how this architecture develops.
