Dragons, DevOps, Multicloud - NetApp Insight 2018
NetApp's hybrid multicloud concept was previously presented at NetApp Insight 2018 Las Vegas and was highlighted again in Barcelona.
"We want to help customers benefit from an on-premise cloud-like experience"
explained Martin Cooper, Senior Director Solutions Engineering EMEA at NetApp.
"By this I mean that our customers' databases should function like a cloud - quickly, uniformly and autonomously."
This guiding theme accompanied visitors through the keynotes, presentations and demonstrations at NetApp Insight - and is also reflected in the newly presented products.
What's new?
NetApp wants to realize the on-premise cloud-like experience with its Hyper-Converged Infrastructure (HCI). This makes it possible to define workloads via public cloud orchestration, but then deploy them in the customer's own data center, i.e. on the HCI system.
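NetApp did not go into the orchestration layer in technical detail. Purely as an illustration of the idea - defining a workload once and deploying it to an on-premise cluster simply by choosing a different target - here is a minimal Python sketch using the Kubernetes client library; the cluster context name, namespace and container image are hypothetical placeholders, not NetApp HCI specifics.

```python
# Illustrative sketch only: define a workload once and deploy it to an
# on-premise Kubernetes cluster (e.g. running on an HCI system) simply by
# selecting the matching kubeconfig context. Context name, namespace and
# image are hypothetical placeholders.
from kubernetes import client, config


def build_deployment(name: str, image: str, replicas: int = 2) -> client.V1Deployment:
    """Cloud-agnostic workload definition: the same spec can be applied
    to a public cloud cluster or to the on-prem cluster."""
    container = client.V1Container(name=name, image=image)
    pod_template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=pod_template,
    )
    return client.V1Deployment(metadata=client.V1ObjectMeta(name=name), spec=spec)


if __name__ == "__main__":
    # Point at the on-prem cluster; a different context would target a public cloud.
    config.load_kube_config(context="onprem-hci")  # hypothetical context name
    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(
        namespace="default",
        body=build_deployment("demo-app", "nginx:1.25"),
    )
```

The point of the sketch is only that the workload definition itself stays the same; which infrastructure it lands on is a deployment-time decision.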
To support this approach, NetApp announced a number of innovations in its cloud services portfolio at the conference in Barcelona, including Azure NetApp Files, a data management service for Microsoft's cloud, and NetApp SaaS Backup for Salesforce for simpler, legally compliant backups.
In addition, Cloud Volumes Service for Google Cloud Platform is now also available in Germany. One offering that was highlighted in particular was Cloud Insights, a SaaS service for infrastructure monitoring.
The software scans IP addresses in the customer's data center and reports back within a day which machines and assets are running in the on-premise environment.
The tool collects all available data and presents it to users in a simple format. Cloud Insights can also be used in the public cloud to obtain a complete overview of all systems or to identify cost-saving potential - for example, by moving a virtualized workload from one public cloud to another and comparing the prices.
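Cloud Insights itself is a closed SaaS offering and its scan mechanism was not described in detail. Purely to illustrate the underlying idea - enumerating an address range and recording which assets respond - the following Python sketch probes a small subnet over TCP; the subnet and port list are hypothetical examples, not what Cloud Insights actually does.

```python
# Conceptual sketch only - not Cloud Insights itself. It illustrates the idea
# of scanning a local address range and recording which hosts respond, so an
# inventory of on-premise assets can be built up. Subnet and ports are
# hypothetical examples.
import ipaddress
import socket

SUBNET = "192.168.10.0/28"        # hypothetical on-prem management subnet
PROBE_PORTS = (22, 443, 3260)     # SSH, HTTPS, iSCSI as example services


def probe(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if the host accepts a TCP connection on the given port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def discover(subnet: str) -> dict[str, list[int]]:
    """Map each responding address to the list of open probe ports."""
    inventory: dict[str, list[int]] = {}
    for ip in ipaddress.ip_network(subnet).hosts():
        open_ports = [p for p in PROBE_PORTS if probe(str(ip), p)]
        if open_ports:
            inventory[str(ip)] = open_ports
    return inventory


if __name__ == "__main__":
    for address, ports in discover(SUBNET).items():
        print(f"{address}: responds on {ports}")
```

A real monitoring service correlates far more sources - hypervisor APIs, cloud billing data, storage telemetry - but the discovery principle is the same.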
"You can really see the multicloud approach that NetApp is pursuing"
said Peter Wüst, Senior Director Cloud Infrastructure and Cloud Data Services EMEA at NetApp.
"All cloud providers sound similar at first, but they all have very different strengths. Customers want to be able to take advantage of these different benefits and not be reliant on just one provider.
That's why we decided many years ago to help our customers not just move from one IT silo to the next, but to take the opportunity to build an open structure that supports the use of different clouds as well as their own data center. We call this structure Data Fabric."
NetApp also wants to support existing SAP customers who want to migrate to the cloud and, in doing so, make their environments future-proof.
"We want to take the risk out of the projects by integrating standardized workflows and tools into the SAP environment"
explained Bernd Herth, Senior Technical Marketing Engineer, Solutions and Integrations SAP.
"For us, this means decoupling the data, and once SAP data is decoupled, it is easier to move it between on-premise and the cloud. Many of NetApp's services can also be used by existing SAP customers. However, the customer's own performance requirements must be taken into account, as cloud instances also have limits."
Another helpful function for existing SAP customers: in SAP Landscape Management (LaMa), an SAP database can be moved to the cloud at the touch of a button.
There are also new features in the area of AI. With ONTAP 9.5, NetApp wants to support companies in modernizing their data services. The software offers cloud integration, all-flash performance, increased efficiency and ease of use.
Critical workloads are accelerated, data management is standardized and the tiering of data sets is automated. In concrete terms, this means high performance with consistently low latency in all storage environments.
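ONTAP's tiering runs inside the storage operating system and its policy engine was not detailed in the session. As a rough concept sketch of age-based tiering - identifying data that has not been touched recently as a candidate for a cheaper capacity tier - the following Python example classifies files by last access time; the directory path and the 30-day threshold are hypothetical and this is not ONTAP's actual mechanism.

```python
# Concept illustration only - not ONTAP's internal tiering mechanism. It shows
# the general idea behind automated tiering: data that has not been accessed
# recently is identified as a candidate for a cheaper capacity/object tier.
# The path and the 30-day threshold are hypothetical.
import os
import time
from pathlib import Path

COLD_AFTER_DAYS = 30                 # hypothetical policy threshold
DATA_DIR = Path("/data/projects")    # hypothetical hot-tier directory


def tier_candidates(root: Path, cold_after_days: int = COLD_AFTER_DAYS):
    """Yield (path, days_idle) for files idle longer than the threshold."""
    cutoff = time.time() - cold_after_days * 86400
    for path in root.rglob("*"):
        if path.is_file():
            last_access = os.stat(path).st_atime
            if last_access < cutoff:
                yield path, int((time.time() - last_access) // 86400)


if __name__ == "__main__":
    for path, idle_days in tier_candidates(DATA_DIR):
        print(f"cold ({idle_days} days idle): {path} -> move to capacity tier")
```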
The Flash Performance Guarantee, which NetApp bills as the industry's first latency guarantee, is also designed to let customers run AI applications at predictably low latency. This is intended to help companies minimize the costs and risks of running AI workloads on flash.
These new data services extend NetApp's Data Fabric concept for managing data between its point of origin, the central data platform and the cloud, and are intended to help companies fully exploit the potential of artificial intelligence.
Data instead of movies
NetApp's collaboration with DreamWorks to build a customized data fabric for the film studio had already been announced at NetApp Insight 2018 in Las Vegas.
In Barcelona, Kate Swanborg, Technology Communications and Strategic Alliances Executive, and Senior Technologist Scott Miller, both from DreamWorks, were on hand to talk in detail about the partnership.
"Half a billion digital files are created for one movie"
explained Scott Miller.
"We normally work on a film for two to four years, and we always produce several at the same time. Sometimes there are up to ten films being actively worked on - that makes five billion digital files that we have to store and manage securely."
DreamWorks has relied on NetApp since 2006 for data management services and to avoid downtime during planned maintenance such as upgrades. The company runs two primary clusters, one for feature films and one for TV series, as well as a secondary cluster used for backups.
DreamWorks chose NetApp primarily because its offerings significantly reduce latency and thus improve the production process and the quality of the films. Detailed scenes and dynamic action sequences demand ever-higher performance while maintaining data stability.
The film "Dragon Taming Made Easy 3 - The Secret World", planned for spring 2019, was used as an illustrative example of this. In the first film in this trilogy, it was almost impossible to fit three or four dragons in the same frame - in this film, there are often several dozen.
"What we see here is our data"
said Kate Swanborg about the movie trailer that was shown during the keynote on December 4.
"Dreamworks is known for its films, but what we really produce is data. This makes it all the more important to have a reliable partner to help us manage it."
Roadmap
NetApp Insight 2018 in Barcelona was strongly focused on multicloud, and that is where NetApp sees the trend continuing: the company will keep helping customers orchestrate their own data center with a "cloud first" strategy while benefiting from multicloud at the same time. Peter Wüst commented:
"Our goal is to give customers the freedom to choose which cloud they want to use and to help them find the most efficient and cost-effective offering for their requirements."
NetApp also endeavors to integrate new tools into its offering as soon as they are offered by hyperscalers such as Azure or AWS. This also applies to SAP environments.
"We will continue to ensure that as soon as we launch new systems on the market, we also make them usable for the SAP ecosystem"
assures Bernd Herth.
"But unfortunately this is a lengthy process that doesn't just depend on us. Nevertheless, we always try to develop joint solutions with our partners. However, we don't know what else will happen at SAP in the container and cloud environment - for my part, I'm excited."
NetApp continues to be committed to DevOps. The old world of client/server is no longer relevant, and the modern world is very much driven by developers - as can also be seen in the public cloud.
IT is becoming increasingly important for businesses, as internal development teams play a key role in a company's ability to generate new revenue streams. With data services and DevOps techniques and methods, NetApp aims to help customers become more agile.
"DevOps brings the old world together with the new world"
explained Peter Wüst.
"NetApp would be making a fatal mistake if it had no answer to DevOps, but only served this classic world of applications and storage."