Category Archives: NetApp


10 Good Reasons to Choose NetApp for Machine Learning

Category : NetApp

Artificial intelligence (AI) can help your team get greater insight from enterprise data and enhance digital services to increase customer engagement. But it’s such a new field that the right infrastructure choices for AI/machine learning (ML) aren’t always clear.

Whether you do AI work on-premises or in the cloud, as you ramp up processes and move them into production, bottlenecks inevitably occur. Lack of I/O performance stalls your AI pipeline. Moving, copying, and managing rapidly growing data sets eats up valuable staff time. The methods that worked during proof of concept become impractical if not impossible at scale.

This is where NetApp can help. NetApp Data Fabric solutions and services accelerate and simplify your AI/ML efforts—from the edge of your network, to the core of your data center, to the cloud. Here are ten reasons to partner with NetApp for your AI/ML needs.

Source: https://blog.netapp.com/infographic-10-good-reasons-to-choose-netapp-for-machine-learning/

Author: Matt Watts

 



Building an Economic Moat: Strategy Nerd Stuff for Service Providers

Category : NetApp

The investment firm Morningstar coined the term “economic moat,” referring to how likely a company is to keep competitors at bay for an extended period. Warren Buffett said, “In business, I look for economic castles protected by unbreachable ‘moats’.”

From the Berkshire Hathaway 2000 Annual Meeting: “So we think in terms of that moat and the ability to keep its width and its impossibility of being crossed as the primary criterion of a great business. And we tell our managers we want the moat widened every year. That doesn’t necessarily mean the profit will be more this year than it was last year because it won’t be sometimes. However, if the moat is widened every year, the business will do very well. When we see a moat that’s tenuous in any way—it’s just too risky. We don’t know how to evaluate that. And, therefore, we leave it alone. We think that all of our businesses—or virtually all of our businesses—have pretty darned good moats.” (Read more at https://www.businessinsider.com/buffett-on-moats-2016-4#q52x0HDAvguCCfxS.99.)

There are many ways to build an economic moat, but the one I suggest service providers look to is service. Perhaps you’re thinking: Well, great. AWS is a service provider, too (as well as Google, Azure and IBM), and they are the breachers. How can I compete directly with them? And to that I say, unless you are really big, don’t try to compete with them. As a reference point, I understand that AWS has about 5,000 releases per year (including all small fixes and whatnot). How many releases per year are you doing?

So, …

if you aren’t huge, be specific.

“I fear not the man who has practiced 10,000 kicks once, but I fear the man who has practiced 1 kick 10,000 times.” – Bruce Lee. 

From my aggregated view of the worldwide cloud and hosting industry, this approach requires a change of direction. How so? Many service providers are so focused on winning the deal (almost any deal) that they are trying to be something to everyone, and end up being too little to most. Said differently, an ocean that is a meter deep may have great visual appeal, but in reality it is useless. A river, although not as wide and all-encompassing, offers specific direction, power and simplicity.

What does that mean to you as a service provider? Focus, focus, focus. Find an area where you can be the best. That route may be vertical-specific, application-specific, route-to-market-specific, or specific in some other way. Rackspace did it with its Fanatical Support, and they’ve done it again with their Managed Cloud services. Some service providers end up with a focus accidentally, and some do it on purpose. If you aren’t headed toward a specific goal, make a plan to get there.

“It gets extra tough when a fanatical small competitor … sets their sights on your particular marketplace,” Buffett said. “How do you compete against a true fanatic? You can only try to build the best possible moat and continuously attempt to widen it.” (Read more.)

We can help you find your focus through our Fueled by NetApp program. Building out an area of expertise enables you to create a market and be the master of it, making it difficult for others to compete. You can change the conversation with your customers from a commodity, price-based discussion to a business outcome conversation, with proof to back it up.

This is an economic moat. Once you have built a moat, expanding it with additional areas of focus strengthens your position and business outcomes.

You may be saying, OK, but how do I get started? The following overview describes how to get moving toward your economic moat of specificity.

  1. Identify the focus – two paths
    1. You need three to five successful customers. Remember my last blog, “Customer Focused Go-to-Market Strategies for Growing Your Service Provider Business”: This isn’t about you (inside out); it’s about your customer’s success (outside in).
      1. Success equals your customer achieving desired business outcomes enabled by your services.
      2. This may require some consideration and discussion with your customers.
    2. Repeated motions (sales, operations, support, etc.) due to success. This is wide open to many areas of focus, but, in the end, you have a high level of confidence in this particular repeated motion because it is so frequently successful!
  2. Define the success – customer first, then you
  3. Plan for scaling success
  4. Go to market with success
  5. Monitor success
  6. Optimize success
  7. Expand into additional areas of focus by building parallel processes

You can claim your unfair share of the market by building an economic moat based on a specific focus on what you deliver best. We support service providers all over the world who are doing this today. Start partnering with our Fueled by NetApp consultants today to create a business strategy that is specific to your areas of focus. We’ll help you productize, develop, and promote your cloud-based services for a better hosting experience for your customers. For more insights, ask for a Fueled by NetApp engagement.

Source: https://blog.netapp.com/building-an-economic-moat-strategy-nerd-stuff-for-service-providers/

Author:  Mara McMahon



Blending the Power of ONTAP Cloud on AWS with VMware Cloud on AWS

Category : NetApp

You may wonder what NetApp is up to with our latest efforts with VMware and Amazon Web Services. Most of NetApp’s customers are well aware of our long partnership with VMware. We have many mutual customers who are using NetApp® storage behind their on-premises VMware environments. But we also know that our customers are looking to build hybrid cloud solutions that allow even more portability for their virtual applications and their data.

Watching our customers moving to the hybrid cloud model, we recognized an opportunity to improve the connection between the datacenter and cloud storage.  Since customers look to AWS as one of the target platforms for their data, we created NetApp ONTAP® Cloud on AWS to address those needs.

 

 

VMware is also strategically aligned with AWS on hybrid cloud. We recognized how important it is for NetApp to build strategic partnerships with VMware and AWS to help customers accelerate their digital transformation through the delivery of enterprise-grade, service-oriented IT capabilities for hybrid cloud environments. Our goal is to help customers build hybrid cloud solutions using the tools and partners they use today with their on-premises operations as they expand to the cloud. Let’s look at how NetApp and VMware are delivering solutions on the AWS platform and how these can help organizations achieve their cloud operational goals.

So, what is VMware doing with AWS? VMware Cloud on AWS is an on-demand, elastically-scalable cloud service that is delivered, sold, and supported by VMware. VMware and AWS offer a seamlessly integrated hybrid cloud that extends on-premises vSphere environments to a VMware SDDC running on AWS elastic, bare-metal infrastructure. Customers can run applications across operationally consistent VMware vSphere®-based private, public, and hybrid cloud environments, with optimized access to AWS services.  Customers can use their existing tools and skillsets within a common operating environment based on familiar VMware software.

On the NetApp side, a number of customers are already using ONTAP Cloud for AWS, a software-only storage service for dev/test, disaster recovery, and production applications. This means that for your AWS cloud storage platform, ONTAP Cloud delivers the same set of enterprise-class storage features you’re used to from the FAS or All Flash FAS systems in your data center. With this service, you can take snapshots of your data without requiring additional storage or impacting your application’s performance, and you can tie your cloud storage to your data center by using industry-leading NetApp SnapMirror® replication technology. This delivers the efficiencies found in your enterprise storage array to your AWS cloud storage. Plus, if you’re concerned about security, ONTAP Cloud offers NetApp-managed encryption of your at-rest data, while you retain the encryption keys.

How does ONTAP Cloud on AWS fit with VMware Cloud on AWS?  Once you have your VMware Cloud virtual machines and your ONTAP Cloud instance running, your VMs can then consume and manage storage from ONTAP Cloud.  With the data stored in ONTAP Cloud, the benefits of ONTAP can be applied to the data just like in the on-premises environment.  Let’s take a look at a couple of use cases to illustrate how these work together.

Let’s start with a dev/test use case. Customers would use VMware Cloud on AWS to mirror applications to an EC2 instance to stretch out to use cloud-based resources. For each dev or test environment, the customer will want to make clones for each of the VMware-based virtual machines. In some cases, the source data is very large and the copies need to be writable and re-created quickly. A benefit of using ONTAP Cloud storage for these copies comes from FlexClone® thin cloning technology, whereby any volume can be cloned instantly and in a transactionally consistent manner for the point in time when it’s taken, regardless of source data size. A redirect-on-write mechanism is used to allocate and write blocks only when data actually changes in the clone, which makes ONTAP FlexClone volumes extremely storage efficient. Although Amazon EBS offers the ability to create snapshots, the first snapshot is essentially a complete copy of the source data. In contrast, ONTAP Cloud reduces consumption starting with the first snapshot, without needing to make a complete copy of the source data. ONTAP Cloud also offers additional storage efficiencies, such as deduplication and compression, that EBS doesn’t offer.
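
To make this concrete, here is a minimal, hedged sketch of requesting a FlexClone copy through the ONTAP REST API. The endpoint and field names follow ONTAP 9 REST conventions but are assumptions here, as are the host, credentials, and volume names; check the API reference for your ONTAP release before relying on them.

```python
# Illustrative sketch: create a FlexClone of a dev/test volume via the ONTAP
# REST API. Endpoint and field names are assumptions based on ONTAP 9 REST
# conventions; verify against the API reference for your release.
import requests

ONTAP_HOST = "cluster-mgmt.example.com"   # hypothetical management LIF
AUTH = ("admin", "password")              # use a least-privilege account in practice

def create_flexclone(parent_volume: str, clone_name: str, svm: str) -> dict:
    """Request a space-efficient clone of parent_volume for a dev/test copy."""
    body = {
        "name": clone_name,
        "svm": {"name": svm},
        "clone": {
            "is_flexclone": True,
            "parent_volume": {"name": parent_volume},
        },
    }
    resp = requests.post(
        f"https://{ONTAP_HOST}/api/storage/volumes",
        json=body,
        auth=AUTH,
        verify=False,  # demo only; use proper TLS verification in production
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Clone the source data volume for a new test environment.
    print(create_flexclone("vmc_datastore_src", "vmc_datastore_test01", "svm_dev"))
```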

Another way customers could use VMware Cloud on AWS and ONTAP Cloud is for user data or home directories for virtual desktop environments. Customers with on-premises VDI environments could use VMware Cloud on AWS as a disaster recovery site. Because ONTAP Cloud uses the same data services as the on-premises storage system, it’s easy to manage the replication of user data to the cloud. To replicate user data or home directories to the cloud, just use SnapMirror to incrementally synchronize on-premises appliances with ONTAP Cloud. Or, if you just want a backup, you can use SnapVault® software with ONTAP Cloud to create a cloud-based backup of your data.
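
For the replication step, a sketch along these lines shows how a SnapMirror relationship could be created programmatically. It assumes the ONTAP REST API’s SnapMirror endpoint from recent ONTAP releases, plus hypothetical cluster addresses and volume paths; it is illustrative only, not the documented procedure for this use case.

```python
# Hedged sketch: create a SnapMirror relationship from an on-premises volume to
# an ONTAP Cloud volume through the ONTAP REST API, so user data or home
# directories are incrementally synchronized to the cloud DR site. Names,
# paths, and credentials are placeholders.
import requests

DEST_CLUSTER = "ontap-cloud.example.com"   # hypothetical ONTAP Cloud management address
AUTH = ("admin", "password")

def create_snapmirror(source_path: str, destination_path: str) -> dict:
    """Register a replication relationship, e.g. 'svm_onprem:vdi_home' -> 'svm_cloud:vdi_home_dr'."""
    body = {"source": {"path": source_path}, "destination": {"path": destination_path}}
    resp = requests.post(
        f"https://{DEST_CLUSTER}/api/snapmirror/relationships",
        json=body,
        auth=AUTH,
        verify=False,  # demo only
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(create_snapmirror("svm_onprem:vdi_home", "svm_cloud:vdi_home_dr"))
```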

We’re not finished thinking about how we might do even more together. Customers are asking us to provide unique and powerful benefits by enabling easier sharing of data across multiple virtual machines as-is, since this sharing is often required for companies that are leveraging the cloud to perform analytics or when executing DevOps. We know that our customers want a simplified way to introduce secure hybrid cloud storage options with minimal change to existing operations, and this is a first step in creating that bridge.

Source: https://blog.netapp.com/blending-the-power-of-ontap-cloud-on-aws-with-vmware-cloud-on-aws

Author: Doug Chamberlain



Is Your Infrastructure Integrated Into Your Software Delivery Processes?

Category : NetApp

Any company, including NetApp, in the business of delivering software knows that writing and committing code are just the beginning. Serious work begins at code commit, because the work must be checked, validated, and tested to ensure quality and compatibility before being delivered. This work needs to be done quickly, safely, and in a manner that can be supported long term. This process is referred to as continuous delivery, and it allows businesses to accelerate the delivery of competitive and differentiating software.

Manual processes in the delivery and deployment of applications reduce an organization’s ability to deliver valuable software at will. Even the most talented and well-meaning team members increase latency and risk in the process. Inconsistencies in the environment as a result of handoffs and manual steps can lead to missed deliveries, reduced developer productivity, and longer time to value.

One significant change that we’ve seen over the last few years is the rapid increase in infrastructure extensibility, or the ability for infrastructure to be managed through software or code. This capability has helped accelerate and mature the capabilities of continuous integration and continuous delivery (CI/CD) and eliminate time-consuming and error-prone handoffs. It’s no surprise that organizations that implement automation of infrastructure in their CI/CD process perform better than those that do not.* It is with this fact in mind that we see the implementation of CI/CD as an important milestone in DevOps maturity.
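
As a rough illustration of what “infrastructure managed through code” can look like inside a pipeline stage, the sketch below provisions a throwaway test volume, runs the test suite against it, and tears it down. The storage-automation endpoint and helper functions are hypothetical placeholders, not a specific NetApp or third-party API.

```python
# Illustrative sketch of infrastructure-as-code inside a CI/CD stage: provision
# a fresh test volume, run tests against it, then tear it down, with no manual
# handoffs. The provision/delete helpers and their endpoint are hypothetical.
import os
import subprocess
import requests

STORAGE_API = "https://storage-automation.example.com/api"  # hypothetical endpoint

def provision_volume(name: str, size_gb: int) -> str:
    """Create a throwaway volume for this pipeline run and return its ID."""
    resp = requests.post(f"{STORAGE_API}/volumes", json={"name": name, "size_gb": size_gb})
    resp.raise_for_status()
    return resp.json()["id"]

def delete_volume(volume_id: str) -> None:
    """Tear the volume down so nothing is left behind after the run."""
    requests.delete(f"{STORAGE_API}/volumes/{volume_id}").raise_for_status()

def ci_test_stage() -> int:
    volume_id = provision_volume("ci-test-data", 50)
    try:
        # Hand the volume ID to the test suite through the environment and run it.
        env = dict(os.environ, TEST_VOLUME_ID=volume_id)
        return subprocess.call(["pytest", "tests/integration"], env=env)
    finally:
        delete_volume(volume_id)  # no leftover environments, no inconsistent handoffs

if __name__ == "__main__":
    raise SystemExit(ci_test_stage())
```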

Supporting this initiative is a growing set of tools built to automate the environment builds and processes that validate and deploy software on demand. Tools such as Jenkins, CloudBees, and Apprenda are just a few of the top choices in this space. It is no surprise that NetApp has actively developed and released integrations that take advantage of our portfolio’s capabilities within these tools, allowing development and operations teams to work more closely together to deliver on the promises of CI/CD.

In fact, we recently launched a new page on NetApp.com with content focused on understanding how NetApp accelerates CI/CD success. Learn more about how NetApp leverages its APIs, software development kits, and partner integrations to help our customers deliver better software more quickly by delivering infrastructure that becomes a seamlessly integrated part of the software delivery process. Want to find out more? Download the white paper Why Code Is Driving Infrastructure Investment to learn the four steps you can follow to reach maturity in DevOps methods and infrastructure practices to achieve the next-generation data center.

 

 

*Findings from 2017 State of DevOps Report

Source: https://blog.netapp.com/is-your-infrastructure-integrated-into-your-software-delivery-processes

Author: Josh Atwell



Are You Ready to Fast Track Your Transformation to a Data Thriver?

Category : NetApp

Over the next three years, digital transformation (DX) will reshape the entire macroeconomy as the majority of global business revenue centers around digital or digitally enhanced products and services. Structured or unstructured, generated by humans or by machines, and stored in the datacenter or the cloud, data is the new basis for competitive advantage. By leveraging the vast quantity and diversity of data to uncover patterns and pursue breakthrough ideas, an enterprise can win in the increasingly competitive business landscape.

Is your company already being disrupted by new, creative business models? Have you assessed your maturity in data-driven DX? Are you a Data Survivor or a Data Thriver? Data Survivors are organizations that have identified the need to develop a DX-enhanced, DX-driven business strategy, but still execute it on a project basis; as a result, progress is neither predictable nor repeatable. Data Thrivers are enterprises that aggressively use digital technologies to disrupt and create new markets; ecosystem feedback is a constant input to business innovation.

An interesting example is a leading financial organization, Citibank. A couple of years ago, the company was confronted with the challenge of customer retention and revenue growth in light of changing customer demographics, growing online interactions and a global customer base. Customers were looking for flexible online banking to support their needs, without the hassle of traveling to banking locations. In response, Citibank launched its CitiExpress Bank in a Box, enabling prospects to perform a number of activities — such as apply for a loan, open a bank account and even start a transaction — via the Citi Velocity mobile app and then complete those activities at a bank ATM. The idea behind this initiative was to be customer centric, support a globally common platform, have digital partnerships and create new distribution channels. The initiative has been quite a success for the bank, contributing millions of dollars in incremental revenue.

IDC recently conducted a worldwide study of line-of-business (LOB) executives, IT leaders, and technology-savvy workers from large and medium-sized companies. Their personas included chief data officers, analytics professionals, and DevOps/cloud architects. The study revealed that Data Thrivers exist across industries and organizations, and that they are attracting new customers and enjoying new revenue streams faster than organizations that are not Data Thrivers.

Here’s a sneak peek into how Data Thrivers are embracing and accelerating data-driven DX:

  • Their most important business objectives for investing in DX initiatives are balanced between tactical and strategic priorities, including acquiring new customers and launching new digital revenue streams.
  • Their IT investment and integration strategies for DX vary from modernization of infrastructure to leveraging cloud services (public and private); from adding new DevOps skills to implementing containers and NoSQL databases.
  • They are placing more importance on varied data formats (including semistructured and unstructured) and working with data dispersed across on/off-premises; and they are aggressively using hybrid cloud.
  • Their data-related challenges range from security and compliance to data access, quality, and analysis. New data roles and technologies are being used to manage these challenges.

Transformation to become a Data Thriver is an ongoing process. Success, relevance and innovation start with a leadership culture that is willing to evolve by investing in new technology, processes, and business models that drive more value to both employees and customers. IDC advises businesses to undertake a holistic transformation involving people, processes and technology to achieve targeted business outcomes. To learn more about leveraging a data-driven DX framework to fast track your transformation to Data Thriver, download the IDC White Paper, “Become a Data Thriver: Realize Data-Driven Digital Transformation (DX),” sponsored by NetApp Inc.

Source: https://blog.netapp.com/are-you-ready-to-fast-track-your-transformation-to-data-thriver/

Author: Ritu Jyoti



Using AI for early skin cancer detection

Category : NetApp

The skin we’re in is the only skin we’ve got. With skin cancer ranking as the most commonly diagnosed type of cancer, detection solutions are highly sought after in the medical field. Now researchers and scientists are turning to artificial intelligence (AI) and machine learning to help. This technology has already shown promise in early detection of Alzheimer’s disease and breast cancer.

Alexander Wong, Professor of Systems Design Engineering at the University of Waterloo in Ontario, Canada, is working with a team using AI to detect melanoma skin cancer through the analysis of skin lesions. While melanoma is less common than some other types of skin cancer, it’s the deadliest because it’s more likely to grow and spread. But if caught early, Wong says, skin cancer is also one of the most treatable forms of cancer.

“The other important thing to realize is that skin cancer is one of the more readily analyzable forms of cancer given that it exists on the skin, and not internally,” shares Wong. “That’s why we’re inspired to come up with AI technology that allows you to analyze and help clinicians diagnose cancer at an early stage. This helps the odds of it be readily treated before it is too late.”

What can AI see that doctors can’t?

As it stands now, dermatologists rely on visual examinations of skin abnormalities, which are subjective to the human eye.

Wong and his team have built an AI-powered imaging system that provides more comprehensive information to doctors, including changes in the concentration and distribution of eumelanin, a chemical that gives skin its color, and hemoglobin, a protein in red blood cells. These are both indicators of melanoma.

Elucid Labs commercialized the system and dermatologists have already begun testing it. Wong says the system should be available to more doctors within the year.

Meanwhile, Stanford University computer scientists claim they’ve created an AI algorithm that can identify skin cancer as well as a professional doctor. They did this using a database of nearly 130,000 images of moles, rashes and lesions.

Brett Kuprel, one of the computer scientists working on the project, explained how the algorithm works. In short, it all comes down to pixels.

An image is an array of pixels. Each pixel is represented by three numbers: red, green, and blue intensity.  The classifier of the system takes all of these raw numbers as input, and spits out a malignant probability.

Kuprel says the classifier has millions of parameters that were tuned from images in the team’s training data.
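
A toy example helps make this concrete: treat the image as a three-channel array of intensities, flatten it into one feature vector, and map it to a probability. The sketch below uses random weights purely to show the shape of the problem; the Stanford model is a deep convolutional network whose millions of parameters were learned from the training images.

```python
# Toy illustration of the described pipeline: an RGB image is just an array of
# numbers, and a classifier maps those numbers to a malignancy probability.
# This is a stand-in with random weights, not the Stanford network.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# A 224x224 RGB lesion image: three intensity values (red, green, blue) per pixel.
image = rng.integers(0, 256, size=(224, 224, 3)).astype(np.float32) / 255.0

# Flatten the raw pixel values into one long feature vector.
features = image.reshape(-1)                  # shape: (224 * 224 * 3,)

# In the real system these weights come from training on labeled images;
# here they are random, purely to show the input/output of the problem.
weights = rng.normal(scale=0.01, size=features.shape)
bias = 0.0

malignant_probability = float(sigmoid(features @ weights + bias))
print(f"P(malignant) = {malignant_probability:.3f}")
```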

As for whether or not this algorithm could be more exact than dermatologists, Kuprel said: “We don’t claim superiority, but I believe it is possible given a large enough dataset to train on. Open health records would help tremendously.”

So what’s next?

Wong says AI and machine learning have progressed exponentially within the last few years.

“It [AI] has allowed us to obtain a new level of diagnostic knowledge for aiding clinicians make better decisions beyond our wildest dreams, and I firmly believe that AI will continue to benefit all levels of clinical care,” he declares.

Source: https://newsroom.cisco.com/feature-content?type=webcontent&articleId=1887275&CAMPAIGN=Corporate%20Communications&Country_Site=GL&POSITION=Social+Media&REFERRING_SITE=LinkedIn&CREATIVE=Cisco++

Author: Melissa Jun Rowley



ONTAP Select Is Now on the IBM Cloud Marketplace

Category : NetApp

ONTAP Select is now live on the IBM Cloud Marketplace!

 

Here are some of the extensive benefits that you get with ONTAP Select:

  • Market-leading data management and mobility across on-premises, cloud, and hybrid cloud environments with ONTAP software
  • Flexibility to quickly spin up or spin down resources as you need for seasonality, data temperature, and optimal storage economics
  • Storage efficiency, performance, and high security from a dedicated, hosted private cloud
  • Seamless connection between on-premises and cloud-based environments
  • An optimal system for DevOps, so you can accelerate time to value while lowering costs
  • Advanced management, high availability, and data protection functions when you combine ONTAP Select with NetApp Snapshot™, FlexClone®, SnapMirror®, SnapVault®, and SnapRestore® technologies
  • Full access to the native VMware stack and management tools

A high degree of collaboration and integration across NetApp, IBM, and VMware went into creating this solution. So, you benefit from a very robust, highly available, and highly efficient software-defined storage environment. For example:

  • Your teams can quickly clone virtual machines and deliver them in seconds by using the robust set of APIs.
  • You get superior inline deduplication, compression, compaction, and encryption, with capabilities that outshine other products on the market.
  • With extensive validation, you can rely on high performance with low latency and at-rest encryption.
  • When you integrate with VMware vSphere, implementation is easy.
  • You can be confident knowing that this solution is based on the proven, reliable ONTAP data management software, which is available in 40+ IBM cloud data centers around the globe.

Test-drive this solution today and see how easy it is to get started. To learn more about how to manage and protect your data—one of the most valuable assets for your business—visit www.netapp.com/select.

Source: https://blog.netapp.com/ontap-select-is-now-on-the-ibm-cloud-marketplace/

Author: Jay Subramanian



NetApp HCI Officially Arrives

Category : NetApp

Today we are excited to announce that NetApp HCI is finally here and available!

Back in June when we announced NetApp HCI at our analyst day in Boulder, CO, the hyperconverged market was projected to grow from $371.5 million in 2014 to nearly $5 billion by 2019*. Now, four months later, the hyperconverged market is expected to double within two years, reaching over $10 billion in 2021**. This clearly demonstrates that NetApp joined this rapidly growing and evolving market at the right time.

While NetApp HCI is a new product offering, it combines trusted and best-of-breed components from NetApp and VMware to deliver a true enterprise-scale hyperconverged infrastructure that addresses the evolving market. Architected on SolidFire Element OS and VMware vSphere, and fully managed by VMware vCenter, NetApp HCI allows us to provide unique capabilities such as:

Predictable Performance: Any time you have multiple applications sharing the same infrastructure, the potential exists for one application to interfere with the performance of another. NetApp HCI delivers unique Quality of Service (QoS) capabilities that bind storage performance on three dimensions: minimum, maximum, and burst. This unique capability means hundreds to thousands of applications can be consolidated with predictable performance.
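
For readers curious what setting those three dimensions looks like in practice, here is a hedged sketch using the JSON-RPC style of the Element software API that underpins NetApp HCI storage. The method and parameter names follow Element API conventions, but the host, credentials, volume ID, and API version are assumptions; verify them against the Element API reference for your release.

```python
# Hedged sketch: set per-volume QoS (min/max/burst IOPS) through the Element
# software JSON-RPC API used by NetApp HCI storage nodes. Host, credentials,
# volume ID, and API version are illustrative placeholders.
import requests

MVIP = "hci-storage.example.com"   # hypothetical cluster management virtual IP
AUTH = ("admin", "password")

def set_volume_qos(volume_id: int, min_iops: int, max_iops: int, burst_iops: int) -> dict:
    """Bind a volume's performance to explicit floor, ceiling, and burst limits."""
    payload = {
        "method": "ModifyVolume",
        "params": {
            "volumeID": volume_id,
            "qos": {
                "minIOPS": min_iops,      # guaranteed floor, even under contention
                "maxIOPS": max_iops,      # sustained ceiling
                "burstIOPS": burst_iops,  # short-term spike allowance
            },
        },
        "id": 1,
    }
    resp = requests.post(f"https://{MVIP}/json-rpc/10.0", json=payload, auth=AUTH, verify=False)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Give a database volume a 2,000 IOPS floor, 10,000 ceiling, and 15,000 burst.
    print(set_volume_qos(volume_id=42, min_iops=2000, max_iops=10000, burst_iops=15000))
```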

Flexible and Scalable: A key tenet of most, if not all, hyperconverged solutions is simplicity, but that does not always mean flexibility. NetApp HCI has a node-based, shared-nothing architecture that delivers independent scaling of compute and storage resources. This avoids costly and inefficient over-provisioning and simplifies capacity and performance planning. Start small with two 2RU chassis and then scale by node. Need storage capacity or performance? Just add a storage node. Want more processing power or memory for virtualization? Simply add compute. Grow how you want. Nondisruptively.

Simple and Automated: The key to agile and responsive IT operations is to automate routine tasks, eliminating the risk of user error associated with manual operations and freeing up resources to focus on driving differentiated business outcomes. The NetApp Deployment Engine (NDE) simplifies Day 0 deployment by reducing the number of manual steps from over 400 to less than 30. Once you have deployed NetApp HCI, direct integration with VMware vCenter lets you easily automate and manage day-to-day tasks, including hardware-level operations and alerts, from Day 1 to Day 1500 and beyond. Finally, a robust API enables seamless integration into higher-level management, orchestration, backup, and disaster recovery tools. Watch the demos below to learn more about how the NetApp Deployment Engine works and about the NetApp HCI vCenter Plugin.

 

 

 

NetApp Data Fabric: NetApp HCI is also an integral part of NetApp’s Data Fabric. The Data Fabric is NetApp’s vision for the future of data management, simplifying and integrating how data is managed across environments. It enables customers to respond and innovate more quickly because their data is accessible from on-premises to public cloud. Integration with the Data Fabric allows NetApp HCI to provide robust data services, including file services via ONTAP Select, object services via StorageGRID Webscale, replication services via SnapMirror, and backup and recovery services via AltaVault.

We look forward to existing and new customers utilizing the benefits of NetApp HCI. To get a quick tour of NetApp HCI or to get more information, visit netapp.com/hci-product.

Source: https://blog.netapp.com/netapp-hci-officially-arrives/

Author: Cynthia Goodell



One-Click SAP System Migration to the Cloud

Category : NetApp

Have you ever dreamed of quickly moving your dev/test SAP HANA system into the cloud? Fully automated with a one-click workflow? Without the hassle of installing or investing in additional tools and hardware, creating a VPN, or shipping “snowballs”?

SAP customers running agile development projects are often confronted with requests from their project teams to provision additional test or sandbox systems. In the past, those systems needed to be provisioned using resources in the customer’s own data center. Today, the possibilities of using on-demand cloud resources are an intriguing alternative: no additional hardware to install and maintain, pay-as-you-go cost models that charge costs directly to the requesting cost centers, and almost infinite resources. But customers still need to address the questions of how to move their SAP systems to the cloud, how to operate and manage those systems, and how to guarantee the safety of their data while their systems run in the cloud.

In this blog post, I’ll show how NetApp® Data Fabric technology can help fully automate the move of SAP HANA databases from on-premises to Amazon Web Services (AWS). For this proof-of-concept use case, we used the NetApp Cloud Sync service to transfer the data, NetApp ONTAP® Cloud software as enterprise-grade storage management for the cloud, and SAP Landscape Management (LaMa) for the SAP-specific one-click provisioning workflow.

 

The Magic of Cloud Sync

Cloud Sync, part of the NetApp Data Fabric cloud services, offers a way to copy data from a local NFS server to Amazon Simple Storage Service (Amazon S3). The copy process is fast, reliable, and secure, and can be managed by a simple cloud-like interface or REST API calls.
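
As an illustration of the “simple REST API calls” idea, the sketch below creates a sync relationship from an NFS export to an S3 bucket and triggers a transfer. The base URL, endpoint paths, and payload fields are placeholders invented for this example, not the documented Cloud Sync API; consult the service documentation for the real interface.

```python
# Hypothetical sketch of driving a Cloud Sync-style workflow through REST:
# register an NFS-to-S3 relationship, then kick off a sync. Endpoints and
# fields are illustrative placeholders only.
import requests

API = "https://cloudsync.example.com/api"        # placeholder base URL
HEADERS = {"Authorization": "Bearer <token>"}    # placeholder credentials

def create_relationship(nfs_server: str, export_path: str, bucket: str) -> str:
    """Register a copy relationship from a local NFS export to an S3 bucket."""
    body = {
        "source": {"protocol": "nfs", "host": nfs_server, "path": export_path},
        "target": {"protocol": "s3", "bucket": bucket},
    }
    resp = requests.post(f"{API}/relationships", json=body, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["id"]

def trigger_sync(relationship_id: str) -> None:
    """Start an incremental transfer for an existing relationship."""
    requests.post(f"{API}/relationships/{relationship_id}/sync", headers=HEADERS).raise_for_status()

if __name__ == "__main__":
    rel = create_relationship("nfs01.example.com", "/hana/data_export", "hana-dev-bucket")
    trigger_sync(rel)
```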

ONTAP Cloud: Enterprise-Grade Storage as a Service

NetApp ONTAP Cloud is an enterprise-grade storage system that runs as a cloud appliance with most of the hyperscalers, such as AWS and Microsoft Azure. It offers super-fast NetApp Snapshot™ copies, data compression, data encryption, and all of the NetApp tools for automated, application-consistent SAP backups—a huge advantage that reduces overall project risk. These features, including SAP HANA backups that complete in a few seconds, are all available as a service when you use ONTAP Cloud together with NetApp SnapCenter® software for your projects.

SAP Landscape Management

Copying SAP HANA databases or SAP systems requires that many tasks be combined in the right order: for example, shutting down the SAP system, cloning or moving the data, and configuring the network and firewalls. These activities all require interaction with SAP systems and with infrastructure components on both sides—on-premises and with cloud resources. This is the domain of SAP Landscape Management, which enables you to extend the built-in workflows with custom-tailored functions.

One-Click Custom Provisioning Workflow

In this proof of concept, we used SAP Landscape Management with a custom provisioning workflow to automate the required steps, from system shutdown on the source to system start-up in the cloud. We did all of this with a one-click configurable and extendable workflow, which can be used as a base for upcoming pilot projects. The following short video demonstrates the whole workflow and explains the steps and activities.
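
As a complement to the video, here is a rough outline of the kind of step sequence such a workflow chains together, expressed as plain Python. In the proof of concept these steps are LaMa workflow operations and custom hooks; the function names, order, and behavior below are assumptions for illustration only.

```python
# Rough outline of the one-click provisioning sequence described above. Each
# function is a stand-in for a LaMa workflow operation or custom hook; the
# names and steps are illustrative assumptions, not the actual workflow code.
def stop_source_system() -> None:
    print("Stop the SAP HANA system in the on-premises data center")

def sync_data_to_s3() -> None:
    print("Cloud Sync: copy the database files from the local NFS export to Amazon S3")

def provision_ontap_cloud_storage() -> None:
    print("ONTAP Cloud: present the transferred data as enterprise-grade cloud storage")

def configure_network() -> None:
    print("Open the required network and firewall paths between on-premises and AWS")

def start_target_system() -> None:
    print("Start the SAP HANA system in AWS and register it in LaMa")

WORKFLOW = [
    stop_source_system,
    sync_data_to_s3,
    provision_ontap_cloud_storage,
    configure_network,
    start_target_system,
]

def run_one_click_migration() -> None:
    # In LaMa this sequence runs as a single provisioning workflow; a failure
    # would pause the workflow and wait for operator intervention.
    for step in WORKFLOW:
        step()

if __name__ == "__main__":
    run_one_click_migration()
```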

 

Source: https://blog.netapp.com/one-click-sap-system-migration-to-the-cloud/

Author: Bernd Herth



Introducing Elio

Category : NetApp

Meet Elio, NetApp’s new virtual support assistant, powered by IBM Watson cognitive computing and part of Digital Support.

