David Mayhew on fundamentally changing how we compute
If you imagine the Internet as having a backbone, its vertebrae could be data centers – large, refrigerated warehouses containing computer systems, telecommunications components, and other equipment needed to keep information flowing through our lives. What if you could plug into this computing power as a service, a utility, much as we plug into the existing electric grid? David Mayhew is a chief technologist and fellow for Advanced Micro Devices (AMD). He leads a research collaboration that’s helping plan how to build a network of data centers powered by green energy – wind and solar. His work has been described as a new way to merge energy and information. He spoke to EarthSky’s Jorge Salazar.
Is this idea for a network of data centers the same thing as cloud computing?
In cloud computing, the cloud is a strange entity. If you get 10 cloud experts in a room and ask them to write down what it means, you’ll get 11 different definitions for what the cloud is.
But, basically, the cloud is the notion that, looking forward, we want to rent computation rather than own it. That is, regular users might not have quite as much interest as now in owning a computer. They just have an interest in being able to use a computer. The same is true of businesses.
We can take a tremendous amount of owned computation, sitting in rooms and sitting on desks, and push it into a computational environment where people rent the use of computation. There will be a set of companies that have enormous computational resources spread out all over the country, and all over the world, from whom we rent access to computation. That notion is the cloud.
The cloud sounds like the electric utility we use when we plug into a wall outlet.
That’s exactly it. That two- or three-prong electrical outlet is an abstraction for an incredibly complex system to bring power to users. It looks simple, because you just plug into it, but it’s really very sophisticated.
Computation is also very sophisticated. We want to think of computation as something that’s just going to be ubiquitous, kind of like electricity. You tap into this large available supply of it and use as much as you need.
How – and why – does renewable energy fit into this idea?
There’s a recognition that we’re going to need more and more electricity. Some of the ways we get it now might be problematic looking forward, but green energy – basically wind, solar and hydro – are incredibly attractive because they don’t pollute and are abundantly available.
And so, there’s a recognition that, if we’re going to manufacture something with renewable energy, maybe the thing we want to manufacture is data – that is, computation. That’s because it’s much less expensive to do it that way.
If you look at a data center right now – if you go out and buy a computer, put it in a room, use it for five years, and pay normal rates for energy – on average, you’re going to spend about as much for electricity over those five years as you paid to buy the computer, put the software on it, and everything else.
In other words, energy right now is a significant component in computational costs. Some people do get power cheaply and, for them, energy might only represent a third of their cost. But other people will pay a lot for energy. We’ve seen studies where energy represents almost 90% of computational cost.
There are two trends that are inescapable. Energy’s becoming more expensive. Computation is becoming less expensive.
At some point, it’s going to make more sense to focus on energy for computation that is off the grid than to continue to plug computers in wherever we happen to want to use them.
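The five-year cost split described above can be sketched with a quick calculation. The dollar figures here are assumed for illustration, not from the interview; only the rough parity of hardware and energy costs comes from the text.

```python
# Illustrative sketch of the data-center cost split described above.
# All dollar figures are assumed examples, not from the interview.

hardware_cost = 5000.0       # one-time cost: computer, software, etc. (USD)
annual_energy_cost = 1000.0  # grid electricity per year (USD)
years = 5

# Per the interview, energy over five years roughly matches the
# up-front hardware cost, so energy is about half of total cost.
energy_total = annual_energy_cost * years
energy_share = energy_total / (hardware_cost + energy_total)

print(f"Hardware: ${hardware_cost:,.0f}")
print(f"Energy over {years} years: ${energy_total:,.0f}")
print(f"Energy share of total cost: {energy_share:.0%}")
```

With these assumed numbers the energy share comes out to 50%, matching the interview's "about as much" framing; the studies he cites later put the real-world share anywhere from a third to almost 90%, depending on local energy prices.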
Information and energy demand are booming all over the world, as more and more people get plugged in and go online. Would you like to speak to these trends?
You have a number of very interesting aspects in the global environment. In some of the less economically advantaged parts of the world, information is the enemy of poverty. There are all sorts of studies that show that you improve people’s lives – and improve their economic circumstances – by giving them access to information.
Yet to get access to information you need access to energy. And, in a lot of places, access to energy means you have to have some sort of infrastructure to provide that energy. Suppose you think of a windmill or a solar panel not just as something that generates electricity – but as the power supply for a data center.
And suppose you think of a data center not as a big room full of equipment, but as a refrigerator full of computer components or a semitrailer full of computer components. You hook it up to the windmill. You hook it up to the solar panel. And it computes.
You don’t need a grid. You don’t really need any infrastructure. You can put that computational capability anywhere. In the U.S., you have the option, for example, of putting the data center out in the middle of a desert in Arizona, where solar energy is readily available. Then, instead of paying $1 million or $2 million per mile to connect that solar array to the electric grid, you pay $1,000 a mile to connect an optical fiber from the computational resource to the information grid.
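The economics of that Arizona example follow directly from the per-mile figures he quotes. This sketch uses the interview's numbers for the two hookup costs; the distance is an assumed example value.

```python
# Cost comparison using the per-mile figures quoted above:
# connecting a remote solar site to the electric grid vs. running
# optical fiber from it to the information grid.

miles = 50                      # assumed distance to the nearest tie-in point
grid_cost_per_mile = 1_500_000  # midpoint of the $1M-$2M/mile figure
fiber_cost_per_mile = 1_000     # the $1,000/mile figure from the interview

grid_total = miles * grid_cost_per_mile
fiber_total = miles * fiber_cost_per_mile

print(f"Grid hookup:  ${grid_total:,}")
print(f"Fiber hookup: ${fiber_total:,}")
print(f"Fiber is {grid_total // fiber_total:,}x cheaper")
```

At these rates, moving bits instead of electrons is roughly three orders of magnitude cheaper per mile, which is the core of the argument for siting compute at the energy source.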
In relatively poor parts of the world, you wouldn’t need electrical infrastructure to get access to computation. You could use wireless mechanisms and have access to computation without that infrastructure.
So you want to bring together data centers and renewable energy. Is there anything like this operating in the world today?
There are people who are talking about this today. It’s still in the form of a big question: does it really work? It’s an economic discussion. There is money being applied to the problem. Most of that is being applied to paper studies, if you will, where we’re working out the details: How efficient can we make these systems? How much are they going to cost? What kind of utilization can we get from them?
At present, there are relatively small dollars being attached to the problem as a research statement. We hope in the future that relatively larger dollars will be applied, and we will start putting test systems out in the field and seeing how well the practice follows the theory.
What does this project have to offer?
We hope it offers everyone less expensive computation. As we increasingly become an information society, and the creation of information becomes an ever-larger part of what people do, we hope this becomes a less expensive way of getting at computation.
So, for regular people, instead of computation being something on an escalating scale in terms of cost, we hope this idea really helps put it on a decreasing scale in terms of cost.
What if you could rent access to computation from a bunch of data centers sitting out in fields next to windmills? Then, first, you don’t own the computer anymore, and it’s not really your problem to keep it running and keep it from having viruses. And you’re not upgrading it.
You would just have this ubiquitous access to computation. That would be a much better model, I think, for most users, whether personal users or business users. Let’s turn computation into something that’s a lot like electricity. We use it but leave its creation to a relatively small set of very expert people.
There’s a July 2011 report commissioned by the non-profit Carbon Disclosure Project. It looked at the savings that large companies – companies that generate over a billion dollars annually – could achieve by switching to cloud computing: $12.3 billion in energy costs, and a lot of carbon dioxide – 85.7 million metric tons by 2020.
What are the obstacles to seeing this kind of network up and running in the real world?
It’s a time-money equation. Electricity as we know it is a highly regulated environment. Utilities can’t just start becoming data companies. It’s a very tightly controlled space. Yet these are the companies that have the expertise in energy.
The companies with expertise in computation are really accustomed to getting energy out of the wall. And they tend not to be quite so focused on energy production, though they tend to locate in places where they can get cheap hydropower. Some of them are stepping up to the plate and starting to buy energy production facilities.
But you sort of have two different worlds that really aren’t accustomed to thinking of themselves as one problem.
What’s the most important thing you want people today to know about the idea of combining renewable energy production with computational resources?
There is a way in which we can fundamentally change how we compute that makes it one of the greenest activities out there. We can take a lot of energy demand off the grid and use renewable energy to generate compute cycles.
And so computation can become an increasingly important part of our economy – without the energy that it requires becoming increasingly problematic to both cost and the environment.
Sponsored by Advanced Micro Devices. AMD – enabling a sustainable future.