08 Jun HPE Reveals Its XaaS, HCI, AI, And Storage Hand At Las Vegas Discover 2019
Last week I attended HPE Discover 2019 in Las Vegas with Moor Insights & Strategy analyst colleagues Rhett Dillingham, Matt Kimball (analysis here), Steve McDowell (analysis here), and Will Townsend (analysis here).
Hewlett Packard Enterprise is a company I follow very closely, and as such, I never miss the opportunity to attend its premier customer and partner event (see my Discover coverage of the 2018 and 2017 conferences for more background if interested). HPE is a company that’s gone through some transition over the last few years, and last year’s Discover marked the first conference with Antonio Neri at the helm as CEO. The company has stabilized, sentiment is improving, and it is financially in growth mode. I am convinced more people would be talking about HPE if its competitors weren’t growing, too.
I went into this year’s conference hoping to learn more about HPE’s strategy, execution, and new product announcements around HCI, AI, storage, and services. In this column, I’ll keep things fairly high level—look for deeper dives from my colleagues Will Townsend (networking/carrier), Matt Kimball (compute/HCI), Steve McDowell (storage/HCI), and Rhett Dillingham (cloud). Let’s dive in.
HPE broadens “everything-as-a-service”
Perhaps the biggest piece of news from the conference was the announcement that HPE will make its entire portfolio available “as-a-service” by the year 2022 (that’s soon). That is a big, bold announcement, and one that was smart to roll out at Discover, where the company has the whole industry watching. I get more questions about GreenLake from enterprises, HPE competitors, and investors than anything else. HPE says it will accomplish “as a service by 2022” via a variety of subscription, pay-per-use, and consumption-driven offerings, while it continues to offer its hardware and software to customers in the traditional license-based, CapEx model. More and more IT companies are realizing that giving enterprises flexibility and choice is the way to go. To me, this announcement is HPE taking that realization to its next logical conclusion after first announcing GreenLake.
A big part of this expansion is the work HPE is doing to scale out its GreenLake hybrid cloud service portfolio. GreenLake has been kicking butt since its launch in 2018; it’s HPE’s fastest-growing segment, boasts an impressive 99% renewal rate, and has over 600 customers with over $2.8 billion in contract value. Now it’s going after a new target—the mid-market enterprise, whose IT infrastructure is frequently constrained by budget and staffing barriers. To that end, HPE announced five new GreenLake offerings, designed to give these mid-size businesses pre-configured, as-a-service solutions optimized for compute, database, private cloud, storage, and virtualization workloads. HPE says this will save these businesses the headache of designing and testing configurations themselves and give them more flexibility and speed-to-market. GreenLake mid-market will be delivered through channel partners, and I am very interested to see how the company and its partners shield customers from the complexity and deliver a great experience. This isn’t an easy thing to do.
I was also impressed by the launch of HPE GreenLake for Aruba, what the company calls a Network-as-a-Service offering designed to bring the GreenLake portfolio to the edge. Utilizing Aruba’s vast networking portfolio, HPE says this offering gives customers more flexibility in how they acquire and support their network infrastructure.
HPE also took the opportunity of the event to launch a couple of new GreenLake support tools. These include GreenLake Quick Quote, an automated quoting tool for pricing, and GreenLake ChatBot, an AI-driven bot designed to answer customers’ various GreenLake queries.
What I’d like to see now is for HPE to provide a quarterly update on how “everything as a service” is going financially and when new offerings come online.
Broadening storage portfolio
One of the standout pieces of storage news was the launch of HPE Primera, a solution that leverages the AI of HPE’s InfoSight offering to deliver better simplicity, availability, and performance for mission-critical storage. Traditionally, enterprises had to trade cloud agility for the high-end resiliency necessary for mission-critical applications. HPE says that Primera is an on-demand solution that delivers both cloud-like agility and high-end resiliency. The company says this new offering will bolster its Intelligent Data Platform, its portfolio of solutions designed to move customers away from a traditional storage model towards one where business value is derived from intelligent data. HPE says Primera will deliver “rack to app in less than 20 minutes.” The company took some big competitive swings and livened up the crowd for sure. If customers really can get “rack to app in less than 20 minutes,” this is a big deal.
HPE also announced that HPE SimpliVity would be getting the InfoSight treatment—the first hyperconverged infrastructure solution to include HPE’s predictive analytics capabilities. Adding InfoSight will give customers insights to predict when the infrastructure will need additional hyperconverged nodes, as well as letting them know when backup resources are at capacity risk. This gets us farther along the continuum towards a more autonomous datacenter.
Additionally, it launched HPE Nimble Storage dHCI, which it calls “a disaggregated hyperconverged infrastructure platform.” HPE says this solution is geared towards HCI workloads with unpredictable growth and claims it will drastically simplify VM management. Furthermore, HPE claims Nimble Storage dHCI will deliver 99.9999% data availability with sub-millisecond latency. I’m interested to see, after more analysis, how this helps with HCI’s inherent symmetry issue of needing to add compute and memory every time IT needs more storage.
For a more in-depth look at HPE’s storage news from the conference, see my colleague Steve McDowell’s deep dive on the topic here.
Bolstering Aruba Central
HPE also announced some enhancements to Aruba Central, the company’s intelligent, cloud-based network management platform. Aruba Central provides network management, AI-driven analytics, service assurance, and networking security for edge environments. At the conference, HPE announced some new integrations and turnkey solutions with partners like Microsoft, ABB, and PTC, which HPE says will enable real-time intelligence and control within industrial edge settings. For more, see Moor Insights & Strategy networking analyst Will Townsend’s take on the news. I’m interested to see how this differs from Cisco Meraki, as the presentation sounds a lot like it.
HPE and Google Cloud container as a service
HPE was all-in on the hybrid cloud before it was cool, and its announcement with Google is another delivery vehicle in the long line of HPE hybrid cloud solutions. In a joint announcement, HPE and Google Cloud announced their intent to bring a true hybrid cloud for containers to market, to be delivered with an as-a-Service option (of course). The solution will consist of HPE’s on-prem infrastructure, its Cloud Data Services, and GreenLake, combined with Google Cloud’s Anthos offering. For a more in-depth look at HPE’s cloud news from the conference, see my colleague Matt Kimball’s deep dive on the topic here. If you want to learn more about Google Anthos, I wrote about it here.
A piece of big-picture news from the conference was the announcement of what HPE calls “Cloudless Computing.” Basically, the concept is to provide automatic, intelligent, and resilient connectivity across every cloud. You might be wondering, “wait, I thought I could do hybrid and multi-cloud today.” Don’t worry, you can. Enterprises can do hybrid and multi-cloud, but the reality is that it takes a lot of work to move any app to the public cloud in a way that keeps it portable from cloud vendor to cloud vendor, and between on-prem and public. HPE’s idea of “cloudless computing” is envisioned to make moving workloads around seamless and easier.
HPE describes Cloudless Computing as being divided into three different fabrics: the trust/security fabric, centered around HPE’s state-of-the-art silicon root of trust; the connectivity/optimization fabric, which HPE says eliminates the need for app developers to worry about network policy minutia; and the value fabric, which is essentially the embrace of an open economic framework that HPE says will drive innovation. It’s a big idea, and I find it hard to argue with these three areas.
HPE says this helps application developers by giving them the resources to consume and integrate HPE’s infrastructure, tools, and services into their solutions. For the open source crowd, HPE says the concept of Cloudless Computing levels the playing field and gives them fair access to the marketplace. For users, HPE says this delivers higher levels of choice and agility at a low cost. In short, everybody, in theory, will benefit.
I will be digging much more deeply into this.
HPE clearly covered a lot of ground at Discover 2019. My overall take is that the company managed to advance its cause and positively fill in many of the blanks in its portfolio around HCI, AI, and mission-critical storage. Furthermore, it significantly raised the bar with its “everything-as-a-service by 2022” mission statement and plan for a Cloudless Computing world. HPE is looking less and less like a company in transition, and more like an industry leader with a clear-eyed vision of its place in the future of IT. Nice work, HPE. I’ll keep watching.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.