Sustainability will be a recurring theme this year despite the historically low cost of oil
Sustainability is by definition a long-term goal. Any number of change agents (scarcity, politics, and natural disasters) could cause oil prices to increase. Designing and optimizing data centres for efficient use of power is as good for the bottom line as it is for the planet.
If the Internet were a country, its energy consumption would be the sixth largest in the world. In October 2014, the U.S. government issued an energy-cut challenge aimed at data centres. If all U.S. data centres were merely 20 percent more efficient, we could save $2 billion, money better spent on innovation than on inefficient cooling systems and zombie servers. Name-and-shame campaigns by watchdog groups have urged operators to rely more on renewable energy sources, an increasingly available option that carries less price volatility. Meanwhile, federal government mandates have prompted consolidation of agency data centres, saving $1 billion. Proposed legislation would create better standard measurements of efficiency (I’ve previously discussed why PUE isn’t a sufficient metric).
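On that last point, PUE only compares total facility power with IT equipment power, so it says nothing about whether the IT load is doing useful work. The sketch below uses made-up figures for two hypothetical facilities to show how a fleet of zombie servers can hide behind a respectable PUE.

```python
# A minimal sketch of why PUE alone can mislead (illustrative numbers, not measured data).
# PUE = total facility energy / IT equipment energy, so a facility full of idle
# "zombie" servers can still report an excellent score.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 is the theoretical ideal."""
    return total_facility_kwh / it_equipment_kwh

# Two hypothetical facilities with identical energy profiles but very different output.
site_a = {"total_kwh": 1_500_000, "it_kwh": 1_000_000, "useful_work_units": 900_000}
site_b = {"total_kwh": 1_500_000, "it_kwh": 1_000_000, "useful_work_units": 300_000}

for name, site in (("A", site_a), ("B", site_b)):
    work_per_kwh = site["useful_work_units"] / site["total_kwh"]
    print(f"Site {name}: PUE={pue(site['total_kwh'], site['it_kwh']):.2f}, "
          f"useful work per kWh={work_per_kwh:.2f}")

# Both sites report PUE 1.50, yet site B delivers a third of the useful work
# per kilowatt-hour -- the waste that PUE cannot see.
```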
Europe is several steps ahead of the U.S., regulating data centres in commercial and government spheres alike. Energy costs remain high in Asia, so efficiency will be a growing priority there. Finally, demand for cloud services is driving the pursuit of energy and operating-expense savings as data centre capacity is scaled up and metro networks are upgraded.
The IT operational data we’ve been collecting will create challenges before it offers answers
I’ve long argued that IT operational data can be a treasure trove of insight, if only we measure the right things. As data lakes swell into oceans, an ecosystem of management solutions will spring up around open-source, Hadoop-based data lakes. As Hadoop technology matures, savvy enterprises are moving from large-scale storage and batch analytics toward more nuanced, real-time processing that is more tightly integrated with business needs. To truly serve the enterprise, the sharing of large data sets will have to become more lightweight and transparent, built for ease of use and accessible to business end users (self-service Big Data). Some industry leaders have begun to harness the power of their data; their resulting advantage will become clear to the marketplace, spurring a new wave of innovative solutions.
Looking ahead, I see 2015 as a learning period; IT leaders and data experts will assess platforms and architectures, determining the associated costs and opportunities. An intelligent blend of open-source data management platforms and established data warehouse solutions is likely to be a winning strategy for most.
The limitations of the public cloud will become apparent
Specifically, Fortune 1000 executives will be faced with the reality that a vast number of existing applications and services will never be able to run in public cloud environments. Business-critical apps that currently live on premises may need to stay there. Titans of industry, no matter how advanced their technology infrastructure, have legacy investments worth preserving. Moving these systems into the cloud would be hugely disruptive, and in most cases there is no financial incentive or business case that justifies enduring such disruption.
Complex enterprise IT environments comprise a web of applications that would have to be analyzed, tested, and rewritten before porting to the cloud. Ensuring that they still work as expected is a tall order, and often impossible for stateful applications. Even when existing applications can be reconfigured, the move may not make financial sense. Legacy systems that shuttle large amounts of data back and forth would rack up shocking bills from cloud providers that charge by the gigabyte transferred. Understanding your current data centre systems is fundamental to deciding whether moving an application to the cloud will benefit or break your business.
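To make the data-transfer point concrete, here is a back-of-the-envelope sketch. The per-gigabyte rate and monthly volume are hypothetical placeholders, not any particular provider’s pricing, but the arithmetic shows how quickly a chatty legacy application can run up transfer charges once it is lifted into the public cloud.

```python
# Back-of-the-envelope estimate of data-transfer charges for a "chatty" legacy
# application after a lift-and-shift to public cloud. The rate and volume are
# hypothetical placeholders; substitute your provider's actual egress pricing.

EGRESS_RATE_PER_GB = 0.09     # hypothetical $/GB for data leaving the cloud
MONTHLY_EGRESS_GB = 50_000    # e.g. a reporting system pulling 50 TB per month
MONTHS = 12

annual_transfer_cost = EGRESS_RATE_PER_GB * MONTHLY_EGRESS_GB * MONTHS
print(f"Estimated annual egress cost: ${annual_transfer_cost:,.0f}")
# => Estimated annual egress cost: $54,000 -- before compute, storage,
#    re-architecture, and testing costs are even counted.
```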
Commoditization economics will continue to move up the stack
I’m not only referring to the technology stack here (server, network, and storage) but also to the IT operational management solutions stack. As the virtualization of server technology comes full circle, commoditization logically follows. The proprietary silos around hardware and software are dissolving thanks to open-source development (historically popular in Europe), robust competition, and tech-savvy customer aversion to vendor lock-in. As systems and business models open up, the impetus behind commoditization intensifies; vendors will increasingly be compelled to build products on open standards that make end-customer integration faster, easier, and more reliable.
Upon such emerging standards, a whole new cycle of innovation and commoditization can be realized. In a recent article, Peter Wagner refers to new technologies arising out of these cycles as “commoditization accelerants”—solutions that make it easier for customers to use commoditized technology. These accelerants find a commoditization trend or potential that is being held back by an undeveloped capability, then solve for the missing link. I can foresee lots of opportunities for IT operations management systems to become more powerful as we venture beyond proprietary boundaries to solve vexing problems creatively, synergizing and bringing to fruition the advances we’ve made in virtualization, analytics, and automation.
The use of automated analytics solutions, including advanced predictive analytics, will go mainstream
There’s an underlying theme to all the insights I’ve shared here. All the data we’re creating, collecting, and sharing—and the technology infrastructure we use to do it—creates new problems as fast as it solves old ones. Data centres run the world, so it’s imperative that we get better at running them. As IT operations management continues to mature, I am beginning to see the payoff from bringing advanced analytics into core processes. Analytics solutions can identify patterns and exceptions across multiple complex systems that generate many terabytes of data daily. This, in turn, enables automation and precision, rescuing administrators from the impossible burden of manual processes and associated failures.
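As a deliberately simplified illustration of the kind of exception-spotting described above, the sketch below flags metric samples that drift well away from their recent baseline. The metric, window size, and threshold are assumptions made for the example; a production system would correlate many such streams and feed the results into automated remediation.

```python
# Minimal sketch of baseline-deviation detection over an operational metric.
# Names and thresholds are illustrative assumptions, not any vendor's method.

from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=30, threshold=3.0):
    """Yield (index, value) for samples more than `threshold` standard
    deviations away from the trailing window's mean."""
    history = deque(maxlen=window)
    for i, value in enumerate(samples):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value
        history.append(value)

# Example: server inlet temperature readings (degrees C) with a sudden excursion.
readings = [22.0 + 0.1 * (i % 5) for i in range(60)] + [27.5] + [22.1] * 10
for idx, temp in detect_anomalies(readings):
    print(f"Sample {idx}: {temp} C deviates from recent baseline")
```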
Data generated by data centre components can be used to build a self-healing, self-learning ecosystem. Intelligently gathered data is fed into algorithms to optimize true energy efficiency, predict capacity and provisioning needs, foresee and prevent failures, test experimental scenarios, and make sense of Big Data chaos. Analyses closely tied to business objectives save time and money, and foster agility and innovation. Early adopters clearly demonstrate the power of advanced analytics to confer competitive advantage. As analytics becomes more widespread, it will be exciting to see its transformative potential at work in the data centre and beyond.