07 Nov IBM Raises The Bar For Storage, Again
The big news in the technology world this past week was IBM Corporation’s purchase of Red Hat in one of the largest software company acquisitions in history. While that is a foundation-shaking move for both IBM and Red Hat, it will not impact the day-to-day lives of most working IT professionals.
IT professionals focus on delivering consistently reliable and quality service to their customers. While the Red Hat integration will take a while to yield updated product and technology roadmaps for the new combined company, IBM’s recent plethora of storage announcements is much more critical when thinking about your IT organization’s needs.
I love covering IBM’s storage announcements. The storage group, while sitting within an unquestionable behemoth of a technology company, moves with the pace and agility of a much smaller organization. It is an organization that has palpable energy within it.
This agility and pace have allowed the IBM storage team to deliver a cadence of impressive new technology and product announcements. What I find fascinating about this team is its steadfast alignment with a compelling vision of how enterprise data should be managed.
IBM’s storage story focuses on how data is generated, managed, and consumed within an enterprise. It is that understanding that generates a set of technologies that can be leveraged to deliver a cohesive solution to any IT organization.
Would you have guessed that IBM now has the broadest portfolio of NVMe and NVMe-over-Fabrics-enabled products in the industry? It surprised me. I wasn’t surprised to learn that IBM is the world leader in tape archive solutions, given that I started in this industry by changing nine-track tapes for gas money. IBM and tape are permanently cemented in my mind. At the same time, a portfolio that reaches from arguably the fastest storage arrays in production to the lowliest tape drive is quite a span.
All about software
IBM has always been about delivering cohesive software-first solutions to IT, which it closely couples with well-engineered hardware. This has been true since IBM delivered its first mainframes more than half a century ago, continuing today with its broad range of offerings in compute and storage that span on-premises and cloud architectures.
As IT organizations move from tape to cloud for data protection and archival storage, and as data comes alive with the rise of AI-driven analytics and edge-driven compute, managing that data becomes a logistical challenge. Knowing where data is, what it’s used for, and what the organization’s requirements are in protecting that data is what drives long-term technology choices.
Managing the flow of an organization’s data, while delivering insights on the characteristics of that data, is what IBM’s Spectrum Storage suite is all about. There are IBM Spectrum Storage products that manage data in virtualized environments, provision storage into hybrid-cloud deployments, manage the complexities of data protection and backup, and provide software-defined storage solutions for file, block, and object.
IBM’s recent announcements included a flurry of new features across the Spectrum Storage line, including a new solution for SAP HANA installations that leverages IBM storage, IBM Spectrum Protect, and IBM Spectrum Copy Data Management. It’s a big list, but two software-related announcements dominated my attention.
IBM Storage Insights
IBM Storage Insights is a cloud-based management tool that leverages IBM AI technology to detect storage networking performance issues and proactively generate support cases to help IT before issues become problems. This class of capability is rolling out from vendors across the IT industry and is fast becoming table stakes for selling storage solutions into the enterprise.
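The core idea behind this class of tooling is baselining telemetry and flagging deviations before they become outages. As a deliberately simple illustration (real products use far more sophisticated models than this, and none of the names below come from IBM’s documentation), here is a sketch of flagging latency outliers against a statistical baseline:

```python
from statistics import mean, stdev

def flag_latency_anomalies(samples_ms, threshold_sigma=3.0):
    """Return (index, value) pairs for latency samples that sit more
    than `threshold_sigma` standard deviations above the series mean.

    A toy stand-in for the baseline-and-deviation analysis that
    AI-driven monitoring tools perform on storage telemetry.
    """
    mu = mean(samples_ms)
    sigma = stdev(samples_ms)
    if sigma == 0:
        return []  # perfectly flat series: nothing to flag
    return [
        (i, s) for i, s in enumerate(samples_ms)
        if (s - mu) / sigma > threshold_sigma
    ]

# Fifty healthy 1ms samples followed by one 100ms spike:
# only the spike at index 50 is flagged.
anomalies = flag_latency_anomalies([1.0] * 50 + [100.0])
```

In practice, a production tool would feed flagged events into case-generation workflows rather than just returning them; the value is in surfacing the anomaly before a user notices the slowdown.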
I’m glad to see IBM release this product. Given IBM’s heritage in artificial intelligence, combined with its half-century-long institutional legacy in supporting datacenter technology, it could become something special.
Mapping unstructured data
Analytics-driven workloads and edge computing both drive new requirements for unstructured data. Object storage is one of the fastest growing storage technologies in enterprise IT, with solutions from most top-tier storage companies, including IBM. The challenge of unstructured data lies within the ability to understand what the data is, where it came from, and what an IT team should do with it.
IBM Spectrum Discover is a new offering from IBM that aims to tame unstructured data and provide insights into exactly what that data is. Spectrum Discover, built on technology developed inside IBM Research, scans unstructured data sets and builds metadata catalogs for them, providing a roadmap to an organization’s data. IBM Spectrum Discover will support data within IBM Cloud Object Storage and IBM Spectrum Scale systems and, most surprisingly, is expected to support Dell EMC Isilon sometime in 2019.
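To make the idea of a metadata catalog concrete, here is a minimal sketch of what such a tool does at a much smaller scale: walk a data set, and record where each object lives, what kind of data it appears to be, and enough identifying detail to track it later. This is an illustrative toy, not Spectrum Discover’s actual design; every name in it is hypothetical.

```python
import hashlib
import mimetypes
import os

def build_catalog(root):
    """Walk a directory tree and collect basic metadata for each file.

    Each catalog entry records location, size, an inferred content
    type, and a checksum -- the raw material for answering "what is
    this data, and where did it come from?"
    """
    catalog = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            catalog.append({
                "path": path,
                "bytes": os.path.getsize(path),
                "type": mimetypes.guess_type(name)[0] or "unknown",
                "sha256": digest,
            })
    return catalog
```

The hard part at enterprise scale is not the scan itself but doing it across billions of objects without disrupting production workloads, which is where purpose-built products earn their keep.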
Understanding unstructured data is a challenge for enterprises, one that becomes more difficult every day. It’s also not an easy problem to deploy traditional software tools to solve. It’s great to see IBM deliver technology into this space. It should set the bar for others to follow.
Performance and density
As excited as I get about software, it is only as good as the underlying hardware. IT buyers today should feel spoiled by the quality and breadth of solutions available to them. The rise of solid-state storage, NVMe interconnects, and integrated server-class processing makes it an excellent time to buy storage solutions.
IBM is expanding its portfolio of hardware offerings and, in the process, has become the technology provider with the broadest offering of NVMe and NVMe-over-Fabrics storage products. IBM focuses on performance with NVMe, and on storage density with its unique flash packaging technology.
The best illustration of IBM’s focus on density is the DS8880F, which doubles its maximum flash capacity from 368TB to a whopping 737TB. At the same time, the IBM FlashSystem 900 and 9100 products are doubling in capacity with new 18TB modules that will support up to 44TB of effective capacity, up from the 22TB provided by the previous-generation module.
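The gap between a module’s raw and effective capacity comes from inline data reduction (compression and deduplication). A quick back-of-the-envelope check, assuming a simple multiplicative reduction ratio:

```python
def effective_capacity(raw_tb, reduction_ratio):
    """Effective capacity = raw capacity x assumed data-reduction ratio."""
    return raw_tb * reduction_ratio

# An 18TB module rated at up to 44TB effective implies roughly a
# 2.4:1 data-reduction ratio -- achievable on compressible data, but
# not on workloads that are already compressed or encrypted.
implied_ratio = 44 / 18          # ~2.44
assumed_effective = effective_capacity(18, implied_ratio)  # back to 44TB
```

The “up to” in vendor effective-capacity figures is doing real work: actual reduction depends entirely on how compressible your data is.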
The updated third-generation Storwize V7000 gains integrated NVMe and a range of expansion options, while delivering what IBM describes as a 2.7x increase in maximum throughput for some workloads. It can also be clustered with other V7000 arrays to serve up to 32PB of capacity cluster-wide.
IBM announced further hardware features and tweaks across its storage line. Check out its website if you want to go deep on the details.
In the fight for bragging rights, vendors publish “hero numbers” that show that they have the world’s fastest array. This year that claim was credibly made by Dell EMC, Pure Storage, and IBM. The truth is that most platforms built on modern technology, leveraging NVMe interconnects both inside and outside the box and sporting reliable media, will satisfy even the most demanding storage needs.
Beyond hardware bragging rights, IT teams need to focus on the broader question of how those storage solutions are integrated into their overall IT architecture. Software is the glue that ties together storage hardware with cloud, hybrid-cloud, and other storage and compute models to service the changing demands of data within the enterprise.
IBM focuses on delivering a holistic solution to datacenters. Given the breadth and depth of its storage portfolio, its supporting software suites, and its penetration into cloud, IBM is one to watch as enterprises’ bits and bytes are turned into actionable data.