Cloud energy savings: looking at both sides of the equation

by Doug Miller,
Friday, June 22, 2012

There was an interesting piece in Information Week recently about a white paper Google had just published titled “Google Apps: Energy Efficiency in the Cloud.” The paper talks about how using Google Apps “significantly reduces energy consumption and carbon emissions,” and highlights three areas:

  1. Fewer servers mean less energy consumed.
  2. Fewer servers also mean less money spent on cooling.
  3. Not an area of savings: the paper notes that energy consumption may increase by 2–3% due to the use of Google servers and increased network traffic.

While overall the paper raises some good points about how moving to a Google Apps cloud infrastructure can save money on energy costs, there are several areas worth thinking about beyond the statements in the paper.

Any good cloud implementation should save customers money spent on energy costs

One of the major selling points for cloud computing of any type is that the customer no longer has to own and manage a datacenter full of IT infrastructure and servers; instead, the cloud service provider takes over providing those services. This is a basic tenet of cloud computing, and simple math tells you that if you have fewer servers running in your data center, you will spend less money on power. This has little to do with which cloud service provider you choose and is not a unique feature of Google’s Software-as-a-Service products. The same would be true if you moved your infrastructure needs to Amazon Web Services or migrated from an in-house CRM package to a hosted SaaS alternative.
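The “simple math” above can be sketched as a back-of-envelope calculation. All of the figures below (server count, power draw, PUE, electricity price) are illustrative assumptions, not numbers from Google’s paper:

```python
# Back-of-envelope sketch: energy savings from decommissioning on-premises
# servers after a cloud migration. Every figure here is an assumption
# chosen for illustration.

SERVERS_RETIRED = 20     # assumed number of servers taken offline
AVG_DRAW_WATTS = 300     # assumed average power draw per server
PUE = 2.0                # assumed Power Usage Effectiveness (total facility
                         # power / IT power); cooling and overhead roughly
                         # double the IT load at this value
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10     # assumed electricity price in dollars per kWh

it_kwh = SERVERS_RETIRED * AVG_DRAW_WATTS * HOURS_PER_YEAR / 1000
total_kwh = it_kwh * PUE              # fold in cooling/overhead via PUE
savings = total_kwh * PRICE_PER_KWH   # annual dollar savings

print(f"IT load avoided:   {it_kwh:,.0f} kWh/year")
print(f"Total (with PUE):  {total_kwh:,.0f} kWh/year")
print(f"Estimated savings: ${savings:,.0f}/year")
```

Even at these modest assumed values, retiring a rack of servers avoids six figures of kWh per year once cooling is included, which is why this argument holds regardless of which cloud provider you pick.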

Customer energy costs go down but cloud service provider energy consumption goes up

While the customer will see reductions in energy consumption and carbon emissions, this doesn’t come for free when you look at the big picture from a carbon emissions point of view. The cloud service provider has to install more servers in its data center to accommodate the increase in computing needs for each cloud customer. Yes, there are efficiencies in cloud computing that reduce the need for under-utilized or idle servers, but when Google adds 17,000 users (as it did for the GSA), it will need to add servers to provide the service. This means Google’s power consumption will go up, although the increase should not be as great as the savings to the customer, since Google should be better at server utilization. Again, looking at the big picture, how does this impact carbon emissions? According to a report from Greenpeace International titled “How dirty is your data?”, all cloud providers have some negative impact on carbon emissions. Cloud data centers consumed 3% of US electricity in 2011, and that consumption is expected to increase 12% annually. Some cloud providers have a better record than others on the type of energy their data centers use. For example, the report states:
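To see where the Greenpeace growth figure leads, the 3% baseline and 12% annual growth can be compounded forward. The simple compounding model (and holding total US electricity roughly constant) is my assumption; only the 3% and 12% come from the report:

```python
# Sketch: projecting the Greenpeace figures quoted in the text.
# Cloud data centers at 3% of US electricity in 2011, consumption
# growing 12% per year. Assumes simple compounding and a roughly
# constant total US electricity supply, both of which are
# simplifications for illustration.

base_share = 3.0   # percent of US electricity, 2011 (from the report)
growth = 0.12      # 12% annual growth in consumption (from the report)

for years in (1, 5, 10):
    share = base_share * (1 + growth) ** years
    print(f"2011 + {years:2d} years: ~{share:.1f}% of US electricity")
```

At that rate the share roughly triples within a decade, which is the scale of the “what gets turned on elsewhere” problem the rest of this section is about.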

“…the recent influx of mega utility-scale data centres into western North Carolina (Facebook, Google, Apple) was influenced by the attractive electricity prices offered by local utilities (Duke Energy and Progress Energy), which had extra capacity of dirty coal and nuclear power following the departure of the region’s textile and furniture manufacturing.”

This is not to pick on any particular cloud provider, as all providers have issues with increasing carbon emissions. In fact, both Google and Microsoft have committed to using solar and wind power for some of their data center needs, and others are working on clean technologies as well. The point here is that if your agency is committed to reducing its carbon footprint and assumes that just turning off those servers will help you meet those goals, keep in mind that what you turn off affects what gets turned on elsewhere. If this is a concern, ask your cloud provider where your cloud services are coming from and how those data centers will be powered.

External network bandwidth requirements (and energy requirements) may increase substantially

One area that is often overlooked in determining costs and energy consumption is the impact on networking infrastructure as you move to the cloud. As users spend more time interacting with a cloud service, network traffic – especially traffic routed outside the building – will increase. More network traffic can have the following consequences:

  • To maintain a reasonable level of service, the customer will likely need to upgrade and expand the network infrastructure. This means more internal routers and other equipment, which require additional power and dollars.
  • Customers typically access a cloud service via a connection to the internet, and those connections do not come for free. Network traffic could go up substantially as users interact with cloud-based servers hundreds or thousands of miles away. This may require equipment upgrades at the internet connection points and will certainly result in increased internet service provider costs as bandwidth requirements go up.
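The bandwidth impact in the bullets above can be roughed out the same way. The per-user traffic figure and concurrency factor below are pure assumptions for illustration; real cloud-app traffic varies widely by workload:

```python
# Sketch: extra internet bandwidth needed after moving desktop workloads
# to the cloud. Both inputs are assumptions, not measured values.

USERS = 1000            # assumed number of cloud-app users
KBPS_PER_USER = 100     # assumed sustained traffic per active user (kbps)
CONCURRENCY = 0.5       # assumed fraction of users active at peak

peak_mbps = USERS * CONCURRENCY * KBPS_PER_USER / 1000
print(f"Extra internet bandwidth at peak: ~{peak_mbps:.0f} Mbps")
```

Even modest per-user figures add up to capacity a typical 2012-era office internet connection would need to be upgraded to carry, which is where the unbudgeted equipment and ISP costs come from.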

Some workloads may actually require more power if run in the cloud

An important caveat is that Google’s paper “only covered energy consumption for servers and the associated cooling.” If you turn off an email server and instead use Google’s Gmail service, it makes sense that your power requirements will go down. However, what happens with other workloads? In reference to a case study on GSA’s migration to Google Apps, the report stated:

“The estimated direct + indirect energy consumption of GSA’s 18,300 laptops (~3,600,000 kWh) did not significantly change during the migration.”

This is an important point. Even though users are no longer running productivity apps on their local PCs, the power consumption of the client devices does not measurably go down. Meanwhile, those productivity apps (such as Google Docs) are now running in Google’s cloud, which requires additional servers and additional power on Google’s end. In addition, the customer will need to provide increased network capacity, since users must be constantly connected to use the service. So for this type of workload there may be an increase in the customer’s power requirements, and the combined power consumption and carbon output of the customer and Google may actually go up, perhaps substantially. Running productivity apps in the cloud is therefore unlikely to reduce the overall carbon footprint unless the customer also moves to a lower-powered (and potentially less functional) client device.
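Working the GSA numbers from the quote above makes the point concrete. Only the laptop count and total kWh come from the case study; the per-device arithmetic is mine:

```python
# Sketch: per-device energy implied by the GSA case study figures
# quoted above (18,300 laptops, ~3,600,000 kWh direct + indirect).
# The migration left this client-side figure essentially unchanged.

LAPTOPS = 18_300
TOTAL_KWH = 3_600_000

per_laptop = TOTAL_KWH / LAPTOPS
print(f"~{per_laptop:.0f} kWh per laptop per year")
```

Roughly 200 kWh per laptop per year stayed on the customer’s books after the migration, while the server-side work for those same users moved into, and added load to, Google’s data centers.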


There are many good reasons to move to a cloud architecture, and reduced power consumption and energy costs should be a major driver for cloud adoption. But we should not fool ourselves into thinking that there are no negative energy consequences to shifting our infrastructure to someone else. What we turn off, someone else needs to turn on, and in some cases the real impact on global carbon emissions may be largely unknown. The message for cloud adopters who care about total power consumption and total carbon emissions is that they need to sit down with their cloud service providers to get a better understanding of exactly how the world will be impacted by moving to that vendor’s cloud solution.
