Tag Archives: computing

Energy, Clouds, iPhones and Refrigerators

In the last few weeks an idea has been making the rounds that, when you count all of the required networks and cloud services, your iPhone uses more electricity than your refrigerator. This idea was first presented in a publication called “The Cloud Begins with Coal” by Mark Mills, and was quickly followed up with further analysis (and a different version of the calculation) by the Breakthrough Institute, “Bracing for the Cloud”. [Disclosure: I am proud to be a Sr. Fellow at the Breakthrough Institute.]

Since these articles make some very interesting points, I decided to dive into the data. I’ll share some observations here. At the end I’ll take a closer look at the iPhone-fridge comparison. Teaser: I wouldn’t crank up the iPhone guilt just yet.

Rise of the Wireless Cloud

When I started focusing on the sustainability of computing around 2005, the main area of concern was large, traditional datacenters. Over the next half dozen years the focus shifted to large clouds, including general compute clouds (e.g. Amazon Web Services) and the cloud datacenters behind popular web services (e.g. Facebook, Google, etc). Multiple football fields in size, these facilities can use the energy of small or medium towns.

These whitepapers highlight the emergence of cellular data (data delivered over the mobile phone network) as a major new area of energy growth in the computing world. Although mobile data is still a small fraction of overall ICT energy use (less than 10%, and far less by some calculations), it is growing at several hundred percent per year, driven by a combination of increasing subscribers and increasing data usage.

Calculations of the energy used to access web services, such as Facebook, over a wired or WiFi network have not been very dramatic, as the energy usage was small and spread across the device, the network and the datacenter (WiFi networks, like the one in your home or office, use a tiny fraction of the energy per bit of mobile data networks). These whitepapers show how dramatically the total changes when you include a mobile data network, whose energy cost comes to dominate the result.

In the US and other ‘wired’ countries (here I literally mean wired) it sometimes feels like mobile data is just a convenience. Do I really need to upload my pictures this instant, or could I wait until I’m on WiFi? But in most of the world the mobile Internet is the Internet. There aren’t ubiquitous wires that can carry high speed data, and it’s unlikely there ever will be. For hundreds of millions of people the mobile Internet is the foundation for new ways of life, creating links to other people, information and commerce in ways that were previously unthinkable.

In this way Mills et al. have made an important connection between expanding global Internet access and the resulting increases in energy that the mobile network will require.

Finally, Mills and BTI have touched on four other interesting points, which I cover next.

ICT Share of WW Energy Use

In the 2007 timeframe there was general agreement that ICT used roughly 2% of the world’s energy. Part of why “…Begins with Coal” got attention is that 10% sounded dramatic, but that was a fraction of WW electricity, not all energy. With electricity around 40% of WW energy use (see EIA Monthly Report, Table 2.1), that puts ICT at 4% of WW energy, or double what we thought it was 6 years ago. I have long believed that ICT energy use was doubling every 5 years, so we’re more or less on track.
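The arithmetic behind this paragraph is simple enough to sketch in a few lines of Python. The 40% electricity share is the EIA figure cited above; the doubling-every-5-years projection is my own rule of thumb.

```python
# Convert Mills' "10% of WW electricity" into a share of all WW energy,
# assuming electricity is ~40% of worldwide energy use (EIA figure).
ict_share_of_electricity = 0.10
electricity_share_of_energy = 0.40

ict_share_of_energy = ict_share_of_electricity * electricity_share_of_energy
print(f"ICT share of WW energy: {ict_share_of_energy:.0%}")  # 4%

# Sanity check against "doubling every 5 years" from 2% in 2007:
years = 6
projected = 0.02 * 2 ** (years / 5)
print(f"Projected share after {years} years: {projected:.1%}")  # ~4.6%
```

The projection lands a little above 4%, which is why I say we’re more or less on track rather than exactly on it.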

Lack of Precision in ICT Energy Calculations

Reading “…Begins with Coal” you see estimates with very large ranges. One of the key quantities, KWh per GB of mobile data, is described as having a full 10X range of estimated values, from 2 to 20 (more on this in the refrigerator discussion).

A challenge is that the underlying equipment and usage are both changing rapidly. For example, the CEET whitepaper shows demand growing roughly 400% in 3 years, while the AT Kearney report (the least reliable of the bunch, IMHO), shows the energy per GB of wireless data dropping over 40% in 2 years. Unfortunately, Mills doesn’t help the situation, as he mixes data from multiple years without any attempt to normalize it. I was most impressed with the rigor of the CEET whitepaper from this perspective.

Is Cloud Computing Energy Efficient?

Awareness of the energy involved in the mobile Internet has led many to wonder whether cloud computing is still efficient. I’ll admit that this is a legitimate question in the wired world (US, Europe, etc), though I strongly believe that any honest accounting will show it is far more efficient than everyone trying to run their own servers.

However, in the majority of the world without wired communications and reliable electricity, where mobile Internet is the Internet, there really isn’t a discussion to have: in these areas there is no computing without cloud computing.

________Begins with Coal

Since there seems to be acceptance that Cloud Begins with Coal, we need to also accept that EVs Begin with Coal. I still don’t see how EVs are a game changer without a huge breakthrough in renewable electricity, which would be a huge deal all by itself.

Examining the Calculations

So what uses more energy: a fridge, or an iPhone?

With everyone agreeing that an Energy Star fridge uses around 350 KWh/year, Mills and BTI take very different approaches to the iPhone calculation. Mills includes embodied energy, BTI doesn’t; this accounts for over 300 KWh/year of the difference. But if we focus just on the wireless data, we see two major differences. First, Mills uses 2 KWh/GB (presumably from CEET), while BTI uses 19 KWh/GB (presumably from AT Kearney). Second, Mills assumes usage of 2.8 GB/week, or 145.6 GB/year, while BTI uses 1.58 GB/month, or 19.1 GB/year. So we have a 9.5x difference in one key value, and a 7.6x difference in the other.

Let’s look at energy per GB first. I looked through many of the references and found no other value over 5 KWh/GB, with most around 2 KWh/GB. I also independently calculated the value from other data in Mills and CEET, and got one value of 7 KWh/GB using worst-case numbers, with other values as low as 1 KWh/GB. Since the AT Kearney report has no references and doesn’t show where its data came from, I’m inclined to go with Mills’ (and CEET’s) value of 2 KWh/GB.

Looking at data usage, BTI’s number (1.58 GB/mo) looks far more reasonable, having been sourced from Verizon as the average for iPhone subscribers. Using Mills’ data of 20 Exabytes of total bandwidth and 1.2B subscribers, you can independently come up with an average of 1.39 GB/mo. While I’m sure there are people using more than 10 GB/mo as Mills suggests, that is clearly not a representative value.

Using these selections to redo the calculations, we get:

Mills: 600 of the 700 KWh are from the wireless network, which should now be divided by 7.6, giving 79 KWh. Adding back the remaining 100 KWh leaves us with 179 KWh. Note that this includes all embodied energy.

BTI: Substituting 2 KWh/GB for 19 yields 38 KWh/year. Adding the remaining 27 KWh/year yields 65 KWh/year.
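The two revised estimates can be written out as a short Python sketch, using the values argued for above (2 KWh/GB and 19.1 GB/year):

```python
# Redoing both iPhone-vs-fridge estimates with the corrected inputs.
FRIDGE_KWH_PER_YEAR = 350   # typical Energy Star fridge

KWH_PER_GB = 2.0            # CEET / Mills figure for mobile data
GB_PER_YEAR = 19.1          # Verizon average for iPhone subscribers

wireless_kwh = KWH_PER_GB * GB_PER_YEAR   # ~38 KWh/year for the wireless network

# Mills: 600 of his 700 KWh was wireless, overstated by the 7.6x usage gap;
# the other 100 KWh (device, datacenter, embodied energy) stays as-is.
mills_revised = 600 / 7.6 + 100           # ~179 KWh/year

# BTI: swap 2 KWh/GB for their 19, keep their other 27 KWh/year.
bti_revised = wireless_kwh + 27           # ~65 KWh/year

print(f"Mills revised: {mills_revised:.0f} KWh/year")
print(f"BTI revised:   {bti_revised:.0f} KWh/year")
print(f"Fridge:        {FRIDGE_KWH_PER_YEAR} KWh/year")
```

Either way the revised iPhone total comes in well under the fridge’s 350 KWh/year.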

Summary: Unless you’re personally using over 5 GB/mo of mobile data, feel free to hold off on your iPhone guilt trip for now.

Data Center Energy, Revisited

Knowing my longstanding interest in computing and sustainability, a number of people sent me the NYT article, Power, Pollution and the Internet on the inefficiencies of data centers (or as the link to the article says, “data-centers-waste-vast-amounts-of-energy-belying-industry-image.html”).

Here are my thoughts on some of the points raised by the article, and a closing thought on the messenger itself.

Do data centers use lots of energy? Absolutely. The article says that data centers sustain around 30 billion watts, which I won’t argue with. But while that is a lot of energy, it’s only around 0.2% of sustained, worldwide energy use (~17 trillion watts), or about 1.3% of worldwide electricity use. On the other hand, datacenter energy use is growing rapidly (at one point it was doubling every 5 years or so), so even though it’s small it deserves ongoing attention.
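For the skeptical, here is that share calculation spelled out. The ~2.3 TW average worldwide electricity figure is my assumption (roughly 20,000 TWh/year spread over the year), consistent with the 1.3% share quoted above:

```python
# Data center share of sustained worldwide power, from the article's figures.
datacenter_w = 30e9           # ~30 billion watts sustained (NYT figure)
world_energy_w = 17e12        # ~17 trillion watts, all energy, sustained
world_electricity_w = 2.3e12  # ~20,000 TWh/year averaged over 8760 hours

print(f"Share of WW energy:      {datacenter_w / world_energy_w:.1%}")       # ~0.2%
print(f"Share of WW electricity: {datacenter_w / world_electricity_w:.1%}")  # ~1.3%
```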

Are data centers becoming more efficient? Yes. I don’t have data, but by the traditional measure of units of work divided by units of energy, data centers have gained more efficiency in the last 10 years than any other industry. Part of it is due to the ongoing improvements in silicon (captured elegantly by Moore’s Law), but there have been major improvements in datacenter design, cooling, power distribution, system utilization (through virtualization and other technologies), and other optimizations. There has been a lot of healthy exchange among data center professionals, which has raised awareness and spread knowledge of best practices.

Wait a minute, I thought data center people were all secretive. Some are, and for very good reason: data centers can represent a huge amount of value in a very small amount of space, so people are nervous about protecting those locations from a wide variety of physical, electrical and digital threats. However, I found the overall industry to be very open, with lots of useful information changing hands between companies about best practices, etc. The Open Compute Project and The Green Grid are examples of publicly visible activities, and there is a lot more going on among professionals of different companies behind the scenes. This is an area where the NYT author had to work hard to ignore reality in order to support the point he wanted to make.

Is backup power a problem? Definitely, but mainly because there is no reliable alternative to diesel generators for long-term (more than a few minutes) backup electricity. Hospitals use the same thing (a future NYT expose?) and for the same reasons. This also leads to the next question….

Do all of these applications and databases really need this much backup power and instant-on capability? First, there is a practical issue that it’s hard to go through a datacenter and decide what you can turn off. My family reunion pictures may be on the same disk as the photos that show up on nytimes.com, so while I may be happy if they power the disk down periodically, the general manager of the Times online business would be really upset. This kind of interconnectedness and complexity makes it difficult to really turn things off. Second, there is a great opportunity for gathering work onto as few systems as possible, and turning off the rest until they are needed. Virtualization tools are advancing quickly, and some of the better run facilities are doing this to some extent already, so I think you’ll see much more of it in coming years. Third, this is really a question of perceived value and who gets to make the choices. Every operating data center is being paid for by someone who decides each month with their wallet that this degree of insurance is worth it.

But couldn’t these people save money with better designs and operating models? They can, and they surely will over time. Moreover, given the growth of data centers it is vital that they continue to become more efficient. As I said, this industry has made huge strides already, and there’s a growing culture of awareness, measurement and improvement. It’s useful to remember that most of this equipment has a useful life of 4-6 years, so it’s natural to expect some time lag before best practices roll out everywhere. (Note that compared to most other things that people spend serious money on, this is actually a pretty short useful lifetime. So it is a partial explanation of why IT can get efficient faster than other industries.)

The information technology industry says it is making the world more sustainable. Is that reality or hype? I think there are two layers to this question. First, I have made the argument for years that it is impossible to envision a future society with a simultaneously higher standard of living and greater sustainability that does not have broader use of information technology than we have today. Technology continues to help us improve the processes we have, dematerialize goods and services, and rethink our economy. However, information technology is a tool, not a service in and of itself. The technology doesn’t make us more sustainable; only certain applications of the technology do, and those applications are made by the IT industry’s customers, not the industry itself (note that I said “certain applications” – most uses of IT are not net sustainability gains). So even at Sun I was very wary of taking credit for the application of IT, and argued against that viewpoint in works like the SMART 2020 report.

Didn’t you have one last comment? As a Chief Sustainability Officer I developed the habit of taking sustainability critiques and applying them to their authors. For example, I used to wonder what basis Gartner had for critiquing our sustainability plans, when they didn’t have any themselves.

In this case that analysis is almost absurdly funny. Let’s review the main themes of the article with respect to The New York Times Company. Sustainability: the Times’ primary business model today is to cut down trees, grind them up to make newsprint and drive them to stores and homes in the middle of the night using a large fleet of vehicles. It is broadly agreed on the Internet that the Sunday Times results in 60,000 to 75,000 trees being cut down each week. The company could move to full digital delivery, but doesn’t have a business model to support that, so keeps chopping down trees. Utilization: only a tiny fraction of what the Times delivers to customers is read. A major fraction of the trees they chop down are used to print ads that their readers don’t actually want. Transparency: the Times itself has no sustainability report, no report of electricity usage, and no environmental impact statement. The only thing on their social responsibility webpage is marketing fluff.

I see that this is the first in a series for the Times – I’ll be looking forward to reading it and digging up more info on them also.

Notes on Apple’s Clean Energy Push in North Carolina

There’s been a lot of press about Apple’s major solar and fuel cell installation at its new data center in Maiden, North Carolina. So far I haven’t seen direct statements from Apple staff – all of the data seems to be based on an Apple document titled “Facilities Report: 2012 Environmental Update”.

Here are some of my thoughts on the project:

  • I’m excited about this project and Apple’s leadership. The main thing clean energy companies need is customers, and this provides a boost to a couple of them. Hopefully it will get some other companies to think more strategically about their energy (more on this below).


  • I got a chuckle out of everyone picking up on the following phrase from Apple’s document: “..will be the largest non-utility fuel cell installation…”. Apple is increasingly becoming a compute utility, and if you’re a compute utility then you’re also in the energy business. Like Google, Amazon and Microsoft, Apple surely now has world-class staff on electricity generation and energy efficiency.

  • I always warn that no “green” project is “free green”, and it’s true here also: 100+ acres of forest are now gone.

  • Reading the articles you’re left with the impression that clean energy is running the whole site, but I seriously doubt it. A quick back-of-the-envelope estimate suggests that the site is running at 100MW (200 to 400 watts per square foot, 250,000 to 500,000 square feet of usable datacenter space), which others believe as well. That works out to roughly 800M KWh/year, or 10X the annual capacity of the solar panels and fuel cells combined. (As with the note above, this isn’t meant to ‘dis’ the project; these are just the likely facts.)
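The back-of-the-envelope estimate in that bullet can be sketched out explicitly. The 100 MW sustained draw is an assumption derived from the square-footage and watts-per-square-foot ranges above, not an Apple-published number:

```python
# Rough estimate of the Maiden site's annual energy use vs. its clean generation.
# Assumption: 250k-500k sq ft of datacenter space at 200-400 W/sq ft ~ 100 MW sustained.
site_mw = 100
hours_per_year = 8760

site_kwh_per_year = site_mw * 1000 * hours_per_year   # ~876M KWh/year, call it ~800M

# If that is 10X the solar + fuel cell output, the clean installation
# covers on the order of 80-90M KWh/year, i.e. ~10% of the site.
clean_kwh_per_year = site_kwh_per_year / 10

print(f"Site:  ~{site_kwh_per_year/1e6:.0f}M KWh/year")
print(f"Clean: ~{clean_kwh_per_year/1e6:.0f}M KWh/year")
```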

  • If you had any preconceptions that LEED ratings told you anything about the energy usage of a building, hopefully this will disabuse you of that fantasy.

  • The solar facility is interesting, but I’m more intrigued by the fuel cells. They are very efficient, generate some useful heat (well, not useful here since they have enough already, but useful in many situations), and can run on a wide range of fuels, including biofuels (as Apple is doing) and natural gas, both of which are better solutions than the predominately coal-based electricity in North Carolina. Barring a discontinuity in solar panel efficiency per square foot, I’d bet that Apple will scale up the fuel cells further in the future.

  • So why’d Apple do it? I doubt the project pays for itself by straightforward accounting, at least until well into the future. The PR side is a benefit, but that’s not Apple’s style, at least not enough to justify the project. My guess is that it’s about energy independence, and not wanting to be too reliant on one source. With this investment Apple can survive a major power outage for at least 10% of the facility, and has bargaining power with Duke and other electricity providers. They’re also gaining experience running their own utility, and my bet is they’ll build on this initial footprint.

The electric utilities are one of the least innovative segments of the US economy, and their operating and financial futures are tightly coupled to a fickle US energy policy. Do you think someone as thorough and innovative as Apple would leave their future in the hands of such an industry?