IT Matters: The Complete Nicholas Carr Interview

going out and buying a piece of packaged software and installing it on your hard drive or your company’s server. On the consumer side, I think most young people today have already moved into the cloud for most of their computing, and I think we’ll continue to see this [evolve] quite rapidly over the next five years on the consumer side. Most people at home doing their taxes or storing their photographs will go online. And one of the key things there will be the ability to store lots of data for basically free. That’s already come. On the small business side, they will do what consumers are doing pretty quickly as well, because they don’t want to buy all this gear and hire people. Big companies are going to be the slowest, because they have huge investments in IT. They often have very specialized software. Google isn’t going to create robust Web-based programs to handle very esoteric business processes, at least not quickly. So I think for the next 10, 20 years, larger business computing is going to be very much a hybrid. They are going to pull in a lot of the more basic computing services, whether it’s raw compute power, storage, or the basic enterprise systems for accounting and account management that everybody does in similar ways. Those will go online quickly. But it’s going to be years before they close up their data centers altogether.

X: One of the companies we cover all the time is VMware, which makes virtualization software. You mention how dysfunctional the distribution of computing power has become, because every company went out and built its own data center to meet peak demand and then uses it 20 percent of the time. Virtualization offers the possibility of using that other 80 percent.

NC: Not only did they build their data centers for peak demand, they built each application and each separate computer for peak demand.

X: Exactly. So the market for virtualization is still huge, and there will be this wave of companies trying to make the most efficient use of their existing infrastructure. Then maybe they’ll figure out how to plug that into the cloud, so their virtualized data centers are part of the larger virtualized cloud.

NC: I think that is exactly how it will happen. We’re starting to see it already. Big corporations are realizing that virtualization can save them literally billions of dollars. I think we will see companies take the utility technologies that software-as-a-service firms are using and revamp their internal IT along those lines. Essentially, they will run their own utility. And over time, as that happens, the lines between the private ones and the public ones will start to blur as capacity is allowed to shift where it needs to go.

X: When a revolution sets in, often you think “This is going to change everything forever.” And that’s sometimes true and sometimes it’s not. The utility model in electricity is pretty much here to stay. And the PC revolution would have seemed pretty permanent in, say, 1995, but now, as you say in your chapter on the retirement of Bill Gates, the PC era is coming to an end. Do you see the utility computing era as something that is more likely to be here to stay, at least for the foreseeable future, or is there yet another revolution hiding inside that somewhere, waiting to emerge?

NC: I think that some model of utility computing, if we define it broadly as a shared infrastructure for computing, is going to be the way we do computing in the future. If you assume that all of this stuff is going to stay networked—and I don’t see how we can go back to not being networked—that implies you are going to supply the resources in as economically efficient a way as possible through that network. That doesn’t mean there won’t still be things running on people’s hard drives. But in general, what it says is that there is probably a big role for central processing plants of some sort.

One thing I’m happy to admit is I don’t know what the ultimate structure of the utility computing industry will be—whether it’s going to be four companies that run everything or a bunch of smaller companies that have hashed out really strong standards so that data can flow between them very easily. But it does seem to me that it will involve the centralization of a lot of computing functions that have been fragmented.

X: At the very end of the book you point toward the possibility of brain-machine interfaces, and how Sergey Brin and Larry Page at Google are obsessed with this idea of tapping directly into the human brain, and at the same time taking Google’s massive databases and somehow endowing them with artificial intelligence. Does that strike you as something that’s any more realistic now than it was in, say, 1968, when we first met HAL in 2001: A Space Odyssey?

NC: Yes, it does strike me as more realistic. But I don’t think what’s being built is a replica of human intelligence. We are at the stage now where there is so much data connected, and so many microprocessors, and such powerful microprocessors, that I think the nature of computer programming is going to change. It’s going to be much more along biological lines, where we train computers to see patterns. And as we move toward a more semantic Web, where information is coded in a much richer way that allows computers to make connections between information on their own, I think we will see computing systems becoming able in some rudimentary way to think and make decisions without the kind of human guidance that has been necessary in the past. As I try to point out in the book, Google doesn’t talk about this frivolously. It wants to do this. And I take them at their word. They’ve figured out a machine that pumps out a huge amount of money—and they buy a lot of computers.

Author: Wade Roush

Between 2007 and 2014, I was a staff editor for Xconomy in Boston and San Francisco. Since 2008 I've been writing a weekly opinion/review column called VOX: The Voice of Xperience. (From 2008 to 2013 the column was known as World Wide Wade.) I've been writing about science and technology professionally since 1994. Before joining Xconomy in 2007, I was a staff member at MIT’s Technology Review from 2001 to 2006, serving as senior editor, San Francisco bureau chief, and executive editor of TechnologyReview.com. Before that, I was the Boston bureau reporter for Science, managing editor of supercomputing publications at NASA Ames Research Center, and Web editor at e-book pioneer NuvoMedia. I have a B.A. in the history of science from Harvard College and a PhD in the history and social study of science and technology from MIT. I've published articles in Science, Technology Review, IEEE Spectrum, Encyclopaedia Britannica, Technology and Culture, Alaska Airlines Magazine, and World Business, and I've been a guest of NPR, CNN, CNBC, NECN, WGBH, and the PBS NewsHour. I'm a frequent conference participant and enjoy opportunities to moderate panel discussions and on-stage chats. My personal site: waderoush.com