Cloud Computing: The Coming IT Cambrian Explosion

Cloud computing continues to be a very hot subject. I recently participated in Xconomy’s conference on “The Promise and Reality of Cloud Computing,” and it was clear from the discussion that something big and profound seems to be going on, although we are not totally sure what it is yet.

Some of us feel that cloud computing may very well be The Next Big Thing—one of those massive changes that the IT industry goes through from time to time that really shake things up—like the advent of personal computers in the 1980s and the Web in the next decade. Others—a minority in this meeting—feel that this is the IT industry engaged in one of its periodic hype cycles.

Nicholas Carr nicely framed the historical shift to cloud computing in his keynote, which was based on his recent book, The Big Switch. Carr first talked about the evolution of power plants in the 19th century. In the early days, companies usually generated their own power with steam engines and dynamos. But with the rise of highly sophisticated, professionally run electric utilities, companies stopped generating their own power and plugged into the newly built electric grid.

IT, said Carr, is the next great technology that is going through a similar transformation. Many IT capabilities, now handled in a distributed way, will be centralized in highly industrialized, efficient, scalable data centers—Clouds—which should free companies to invest in innovation where it really matters to their business. Nick acknowledged that IT clouds are quite different in nature from electricity—more complex and diverse in the services they offer. So it is too early to tell how IT clouds will evolve.

My personal feeling is that there will be a variety of providers of cloud services—from new companies that specialize in innovative services aimed at particular industries, to enterprises that take advantage of those processes and services they are really good at—e.g., payments in banking, logistics in shipping, reservations in transportation—and start new businesses that service their own needs as well as those of others in the marketplace. This is already happening.

IT organizations will have to become much more professional, disciplined and efficient in their management of data centers, including energy usage. The data center is undergoing a phase of industrialization similar to what happened in manufacturing twenty years ago. If you fall behind, your management costs and quality of service will not be competitive.

At the conference, there was quite a bit of discussion about the relationship of cloud computing to computing-on-demand offerings, such as Amazon Web Services, and software-as-a-service application platforms, such as Salesforce.com. Some have said that this spells the death of software.

I prefer to think of what is happening as the long-needed evolution of application software into something far more usable by humans. When we virtualize applications for people who care nothing about computers or technology—as is mostly the case with Clouds—the key thing we want to virtualize or hide from the user is complexity.

Most people want to deal with an application or a service, not software. We want those applications and services to be as intuitive as possible, and we don’t want to have to know more than necessary to use them. We don’t want to have to worry about extraneous error messages we don’t understand or new software releases we don’t know what to do with. We want a quality experience, as we do with other things in work and life we enjoy using.

Most of us would agree that while computers have been very useful, using them has been far from satisfying—sort of like what driving a car must have been like a hundred years ago. They got us to where we wanted to go, but they also kept breaking down and required constant attention.

This is far from the death of software. In fact, it will take lots of innovative software to make computers and computing applications usable, let alone enjoyable to use. The more intelligent we want them to be—that is, intuitive, exhibiting common sense and not making us constantly take care of them—the more smart software it will take. But with cloud computing, our expectation is that all that software will be virtualized or hidden from us and taken care of by systems and/or professionals that are somewhere else—out there in The Cloud.

In my own presentation, following Nick Carr, I also framed cloud computing in historical terms. First, I think of what is going on with IT as a kind of Cambrian Explosion, the period over 500 million years ago when the rate of evolution accelerated by an order of magnitude, giving rise to both more complex animals and a far greater diversity of organisms. This was at least partly because the cell had been perfected and standardized over the preceding billion years, so evolution could now focus its energies on using these essentially commoditized cells in far more complex and diverse ways.

Looking at the Cambrian Explosion as a metaphor, we can think of digital components as following the path of cells in biology. In its first few decades, the IT industry spent a considerable fraction of its energies developing the basic components. But now that they are essentially standardized, commoditized and good enough for most purposes, we are seeing both the emergence of massively scalable systems—i.e., cloud data centers—and a far greater diversity of applications and services built on top of them.

Author: Irving Wladawsky-Berger

Irving Wladawsky-Berger retired from IBM in May 2007 after 37 years with the company. As Chairman Emeritus, IBM Academy of Technology, he continues to participate in a number of IBM’s technical strategy and innovation initiatives. He is also Visiting Professor of Engineering Systems at MIT, where he is involved in multi-disciplinary research and teaching activities focused on how information technologies are helping transform business organizations and the institutions of society. At IBM he was responsible for identifying emerging technologies and marketplace developments critical to the future of the IT industry, and for organizing appropriate activities in and outside IBM in order to capitalize on them. In 1996, he led the effort to formulate IBM’s Internet strategy and to develop and bring to market leading-edge Internet technologies that could be integrated into IBM’s mainstream business. He subsequently led a number of companywide initiatives like Linux, Grid Computing and the On Demand Business initiative. He began his IBM career in 1970 at the company’s Thomas J. Watson Research Center, where he started technology transfer programs to move the innovations of computer science from IBM’s research labs into its product divisions. He has managed a number of IBM’s businesses, including the large systems software and UNIX systems divisions. Dr. Wladawsky-Berger received an M.S. and a Ph.D. in physics from the University of Chicago.