Since the concept of “cloud computing” was introduced a few years ago, even if under different wording, the technology has promised - among other things - a better environmental footprint. The basic idea is that a few large corporations build huge “data centres”, plants with an enormous number of computers and related facilities, and run software on behalf of their customers (other companies or private citizens) in exchange for a fee. Privacy concerns aside, the idea is good for customers in a number of ways, since they can avoid buying their own computer equipment (or buy less of it) to run “at home”: it's well known that the monetary value of computing gear drops very quickly. Furthermore, thanks to the economy of scale of a large data centre, a substantial energy saving should be achievable: the data centre would consume less than the equivalent galaxy of computers that companies or private citizens would otherwise run on their own.
Ok, but I get instantly skeptical when I read these considerations. They are merely qualitative, while we need a quantitative approach to understand how things are really going - and I've never seen a paper with the numbers, since apparently nobody has made the effort to publish them. A circumstance that makes me suspicious. Add the fact that, while it's easy to compute how much a data centre consumes (just look at the bill), the consumption of the equivalent galaxy of PCs run by customers has only been estimated with models so far - and a model without experimental validation is just useless.
In recent years, data centres have been accused by green organizations such as GreenPeace of being large energy eaters - thus enemies of the environment. This position is again badly formulated if we don't see the math, but it's very palatable to the media. Even more so when the data centre buys energy from a coal, oil or nuclear plant (which is obvious: we're talking about the USA, and these sources are - according to many - the only ones capable of providing large amounts of energy continuously, 24 hours a day). So, suddenly, data centre owners started to build solar plants, with GreenPeace blessing them. In fact, that organization can now apparently “prove” that tomorrow's business world can already rely upon a renewable energy source, countering the objections of sceptical people about the pitfalls of basing our economy on today's renewable energy sources.
Apart from the fact that data centres are a big part of our future, but only a part (we will still need large manufacturing facilities), none of this is backed by math. I am not an expert and I can't do the math by myself, but there are experts around who try and, well, they can't make the math work. For instance, this article by Wired is an interesting read. Note that, unfortunately, there's no cited math there either - but there is, among other things, an interesting statement made by Facebook:
Facebook admits its solar farm produces only enough energy to keep the lights on, and that going solar was more of an experiment than anything else.
In the original blog post, which I suggest reading as well, there's an interesting quantitative consideration:
But Hamilton’s ultimate point is that if you did build a solar farm large enough to provide most of the power for such a facility, it would be significantly larger than the data center itself. Though Apple is building a 20-megawatt solar farm in North Carolina, Hamilton points out, it’s still providing only a fraction of the power needed to run Apple’s data center. If you wanted to power the entire 500,000-square-foot facility, he estimates, you’d need a 181-million-square-foot solar farm.
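To put Hamilton's estimate in perspective, here is a quick back-of-envelope sketch using only the two figures quoted above (the 500,000-square-foot facility and the 181-million-square-foot solar farm); it says nothing about the underlying energy model, it just makes the scale mismatch explicit:

```python
# Back-of-envelope check of the scale mismatch in the quoted estimate.
# Both figures come straight from the quote above; nothing else is assumed.

data_center_area_sqft = 500_000        # Apple's North Carolina facility, as quoted
solar_farm_area_sqft = 181_000_000     # Hamilton's estimate to power it entirely

ratio = solar_farm_area_sqft / data_center_area_sqft
print(f"The solar farm would be about {ratio:.0f} times larger than the data centre it powers.")
# -> roughly 362 times the footprint of the data centre
```

A solar farm more than three hundred times the footprint of the building it powers is exactly the kind of number that the qualitative “green data centre” narrative conveniently leaves out.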