IT Consumption Model Changes Start IT Industry Restructuring

by Nick Lippis 

 

The IT industry is living in a Picasso moment; it is creating a new future via software and, in the process, shedding its decades-long love affair with hardware. As the hardware era ends and the software-defined world progresses, the more than $400B worldwide compute, storage, and networking market and the vendors that supply it, such as Dell, Cisco, IBM, HP, and many others, are struggling with the change. Legacy vendors are disappearing or restructuring, as evidenced by the HP breakup and Dell's purchase of EMC and VMware. But unlike an artist painting a masterpiece, this technology transformation is not controlled by any one person, company, or institution, making it a leveler that offers advantage only to those that embrace it. There is no army large or strong enough to stop this progression as it fundamentally reshapes the IT landscape. While the shift from hardware to software sounds simple and perhaps trivial, its implications are anything but.

As a direct response to the software-ization of IT infrastructure, IT consumption models are changing IT service delivery, supply chain buying patterns, and vendor-customer relationships. These changes, in turn, impact both traditional IT suppliers and start-ups as they strive to understand the changing consumption patterns of their largest customers. Vendor business models will have to fundamentally change, while venture capitalists and stockholders will have to reset return expectations.

While hardware is a capital expense, software is license-based and booked as an operational cost. In the past, vendors could count on corporate procurement groups overspending on capital in an effort to reduce IT delivery time. When an IT department orders a server, it can take up to 200 days, or more than six months, for that server to be delivered to the data center, installed, and configured for the appropriate workload. Knowing this, corporate procurement departments would traditionally bulk up their hardware orders to ensure overcapacity, avoiding long delays in IT service delivery at the cost of under-utilization. Most corporate data centers run at less than 20 percent utilization, meaning they are 80 percent over-provisioned. Today that is changing: enter cloud computing.

Cloud computing services such as Amazon's AWS, Microsoft's Azure, and Google's Cloud Platform are making on-demand service delivery a reality, and that delivery model has become both the envy and the requirement of most, if not all, corporate IT business leaders. As such, IT executives have started building their own private clouds to offer self-service IT delivery to their business unit managers. Hybrid cloud, the ability to host workloads in both public and private clouds, is a major requirement, but don't assume all workloads will move to the public cloud, as some suggest. Will Google, Amazon, Microsoft, Rackspace, and others assume the $500B liability and regulatory compliance requirements of the financial services industry's most sensitive workloads? Will the federal government place Social Security or Defense Department servers in the public cloud? Not a chance. But there are plenty of non-sensitive workloads that will move to the cloud.

Just as Internet Service Providers in the late 1990s showed corporations how to build private intranets for corporate workloads, hyperscale firms like Facebook, Google, Microsoft, and Apple are now showing corporate IT business leaders how to build their own private clouds. This cloud model commoditizes compute, storage, and networking hardware thanks to the disaggregation of software functions from proprietary hardware platforms. As cloud infrastructure software becomes free to run on standard x86 platforms, merchant silicon, or both, and, most importantly, is exposed through open application programming interfaces (APIs), corporate programmers will be able to unlock once-closed software to create business value. Software is king here. In particular, automated orchestration software rules, because it allows corporations to rapidly deploy new prototypes, services, and applications for a more agile response to changing business and market dynamics.
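To make the idea concrete, the sketch below shows what self-service delivery through such an open orchestration API could look like: a developer requests compute capacity with a single API call instead of waiting months for a physical server. The endpoint, payload fields, and credential are illustrative assumptions, not any particular vendor's API.

```python
# Minimal sketch: provisioning a compute instance through a hypothetical open
# orchestration API. Endpoint, fields, and token are illustrative assumptions.
import json
import urllib.request

ORCHESTRATOR = "https://orchestrator.example.com/api/v1/instances"  # hypothetical endpoint
API_TOKEN = "replace-with-your-token"                               # hypothetical credential


def provision_instance(name: str, cpus: int, memory_gb: int, image: str) -> dict:
    """Request a new compute instance from the orchestration layer via its open API."""
    payload = json.dumps({
        "name": name,
        "cpus": cpus,
        "memory_gb": memory_gb,
        "image": image,
    }).encode("utf-8")
    request = urllib.request.Request(
        ORCHESTRATOR,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g. {"id": "...", "status": "provisioning"}


if __name__ == "__main__":
    # Self-service delivery: capacity in seconds, not the months it takes to
    # order, rack, and configure a physical server.
    instance = provision_instance("analytics-proto-01", cpus=4, memory_gb=16, image="ubuntu-22.04")
    print(instance)
```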

For decades IT has been a seller's market. The shift to software is turning it into a buyer's market. It is up to IT business leaders to guide this transformation toward an open software-defined infrastructure market rather than exchange one closed market for another. At the Open Networking User Group, the concept of 'open IT frameworks' has emerged; it seeks to create open APIs between every layer of the IT stack so that IT departments maintain control of the infrastructure they purchase. In short, open IT frameworks allow corporations to replace software modules, equipment, and vendors within an IT stack without negatively impacting what is above or below, providing vendor lock-in mitigation, cost control, and self-service delivery, as the sketch below illustrates.
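As a rough illustration of that layering principle, the following sketch has application-facing code program against a stable open interface while the vendor-specific implementation beneath it is swapped out; the class and method names are hypothetical and not part of any ONUG specification.

```python
# Illustrative sketch of the 'open IT framework' idea: the layer above depends only on
# an open interface, so the vendor implementation below can change without touching it.
from abc import ABC, abstractmethod


class NetworkFabric(ABC):
    """Open interface the rest of the IT stack programs against (hypothetical)."""

    @abstractmethod
    def create_segment(self, name: str, vlan_id: int) -> None: ...


class VendorAFabric(NetworkFabric):
    def create_segment(self, name: str, vlan_id: int) -> None:
        print(f"[vendor A] creating segment {name} on VLAN {vlan_id}")


class VendorBFabric(NetworkFabric):
    def create_segment(self, name: str, vlan_id: int) -> None:
        print(f"[vendor B] provisioning overlay {name} (VLAN {vlan_id})")


def deploy_application_network(fabric: NetworkFabric) -> None:
    # The orchestration layer only knows the open interface, not the vendor beneath it.
    fabric.create_segment("payments-tier", vlan_id=110)


if __name__ == "__main__":
    deploy_application_network(VendorAFabric())
    deploy_application_network(VendorBFabric())  # vendor swapped; nothing above changes
```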

Moreover, as we've seen at the Open Networking User Group, those who adopt open IT frameworks are able to respond to market dynamics more quickly and are more competitive in their respective industry sectors. Open IT frameworks are fundamentally changing the basis of competition within the world economy. The masterpiece that emerges once the dots are connected is a secure, telemetry-rich, cloud-dominated IT landscape built on software and services, one that promises to unleash a wave of corporate productivity surpassing that of the internet age.


Author Bio

Nick Lippis is the co-founder and co-chairman of the Open Networking User Group (ONUG), an organization driven by a board of IT executives from Fidelity, Gap Inc., Credit Suisse, Citigroup, Pfizer, FedEx, Morgan Stanley, JPMorgan Chase, BNY Mellon, Bank of America, Cigna, Yahoo, Intuit, and UBS. ONUG works to enable greater choice and options for IT business leaders by advocating open, interoperable hardware and software-defined infrastructure that spans the entire IT stack, all in the effort to create business value.
