1. The Case for the 2 C Warming Limit

    On October 1, 2014, David G. Victor and Charles F. Kennel wrote an opinion piece that appeared in the journal Nature, titled “Ditch the 2 C warming goal” [1].  The provocative title, which accurately conveyed the point of view of the authors, led to several responses, two from Joe Romm at Climate Progress (here and here), one from Stefan Rahmstorf at Real Climate, one from William Hare at Climate Analytics, and one from David Roberts at Grist.  Victor wrote a long reply to the Romm and Rahmstorf pieces that appeared on Andy Revkin’s New York Times Dot Earth blog.

    For those interested in digging in, I found the longer Victor response to be clearer than the very condensed Nature article.  The Roberts response is the easiest read for those who are less technical, while the Romm, Hare, and Rahmstorf pieces go into a lot more detail about the problems with the Nature article, which are many and varied.

    I’m not going to get into a blow-by-blow analysis of the discussion.  Instead, I’d like to explore some key aspects of the 2 C limit that Victor (and others) seem to misunderstand, because of the importance of this concept to making the case for urgent action on climate. 

    Let me begin by saying that Victor is an acquaintance of mine from when he worked at Stanford, and I’ve always been impressed by his keen intellect.  I invited him to lecture in my class when I was first a visiting professor there in 2003-4.  He also graduated from Harvard with an undergraduate degree in History and Science, as did I, so I have a deep understanding of his early training.  I would call him a friend, though not a close one.  But that doesn’t mean I agree with the arguments he made about abandoning the 2 C limit.

    The 2 C warming limit is more than just a number (or a goal to be agreed on in international negotiations).  It embodies a way of thinking about the climate problem that yields real insights [2].  The warming limit approach, which can also be described as “working forward toward a goal”, involves assessing the cost effectiveness of different paths for meeting a normatively-determined target.  It has its origins in the realization that stabilizing the climate at a certain temperature (e.g., a warming limit of 2 Celsius degrees above pre-industrial times) implies a particular emissions budget, which represents the total cumulative greenhouse gas emissions compatible with that temperature goal.  That budget also implies a set of emissions pathways that are well defined and tightly constrained (particularly now that we’ve squandered the past two decades by not reducing emissions).
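
    To make the budget arithmetic concrete, here is a minimal sketch in Python.  The figures below are rounded, illustrative values of the kind reported in recent IPCC assessments; they are assumptions for this example, not numbers taken from the references cited in this post.

        # Illustrative sketch: how a warming limit implies a cumulative CO2 budget.
        # All figures are rounded, assumed values used for illustration only.

        TOTAL_BUDGET_GTCO2 = 2900.0        # assumed total CO2 budget (from pre-industrial
                                           # times on) for a roughly two-thirds chance of
                                           # staying under 2 C, allowing for non-CO2 gases
        EMITTED_TO_DATE_GTCO2 = 1900.0     # assumed cumulative CO2 already emitted
        CURRENT_EMISSIONS_GTCO2_YR = 36.0  # assumed current annual CO2 emissions

        remaining = TOTAL_BUDGET_GTCO2 - EMITTED_TO_DATE_GTCO2
        years_left = remaining / CURRENT_EMISSIONS_GTCO2_YR

        print(f"Remaining budget: about {remaining:.0f} GtCO2")
        print(f"Exhausted in about {years_left:.0f} years at current emissions")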

    The 2 C limit is a value choice that is informed by science.  It should not be presented as solely a scientific “finding”, but as a value judgment that reflects our assessment of societal risks and our preferences for addressing them.

    The warming limit approach had its first fully developed incarnation in 1989 in Krause et al. [3] (which was subsequently republished by Wiley in 1992 [4]).  It was developed further in Caldeira et al. [5] and Meinshausen et al. [6], and has served as the basis for the International Energy Agency’s analyses of climate options for several years running [7, 8, 9].

    Such an approach has many advantages.  It encapsulates our knowledge from the latest climate models on how cumulative emissions affect global temperatures, placing the focus squarely on how to stabilize those temperatures.  It places the most important value judgment up-front, embodied in the normatively determined warming limit, instead of burying key value judgments in economic model parameters or in ostensibly scientifically chosen concepts such as the discount rate.  It gives clear guidance for the rate of emissions reductions required to meet the chosen warming limit, thus allowing us to determine if we’re “on track” for meeting the ultimate goal, and allowing us to adjust course if we’re not hitting those near-term targets.   It also allows us to estimate the costs of delaying action or excluding certain mitigation options, and provides an analytical basis for discussions about equitably allocating the emissions budget. Finally, instead of pretending that we can calculate an “optimal” technology path based on guesses at mitigation and damage cost curves decades hence, it relegates economic analysis to the important but less grandiose role of comparing the cost effectiveness of currently available options for meeting near-term emissions goals [2].

    The warming limit approach shows that delaying action is costly, required emissions reductions are rapid, and most proved reserves of fossil fuels will need to stay in the ground if we’re to stabilize the climate.  These ideas may not be news to some, but many don’t realize that they follow directly from the warming limit framing. 

    •           Delaying emissions reductions forecloses options and makes achieving climate stabilization much more difficult [10].  “Wait and see” for the climate problem (or for new metrics characterizing it) is foolish and irresponsible, which is obvious when considering cumulative emissions under a warming limit.  The more fossil infrastructure we build now, the faster we’ll have to reduce emissions later.  If energy technologies changed as fast as computers, there could be justification for “wait and see” in some circumstances, but they don’t, so it’s a moot point.

    •           Global emissions will need to turn down this decade and approach zero in the next three to four decades if we’re to have a two-thirds chance of staying under the 2 C limit [11].  The emissions pathways given the current carbon budgets are tightly constrained (see the pathway sketch after this list).  Even if the climate sensitivity is at the lowest end of the range included in IPCC reports (1.5 C), that only buys us another decade in the timing of the emissions peak [12], which indicates that the findings on emissions pathways are robust even in the face of large variations in climate sensitivity.

    •           The rate of emissions reductions, which is a number that can be measured, is one way to assess whether the world is on track to meet the requirements of the 2 C limit.  We know what we need to be doing to succeed, and if we don’t meet the tight time constraints imposed by that cumulative emissions budget in one year, we need to do more the next year, and the next, and the next.  It’s a way of holding policy makers’ proverbial feet to the fire.

    •           The concept of “stranded fossil fuel assets” that can’t be burned, popularized by Bill McKibben [13] and Al Gore [14], follows directly from the warming limit framing.  In fact, our 1989 book, Energy Policy in the Greenhouse [3] (which Victor reviewed in a cursory way for Nature in 1990, ironically enough), had a chapter titled “How much fossil fuel can still be burned?”.  So the idea of stranded assets is not a new insight (but it is a profound one).
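
    To make the pathway arithmetic behind these points concrete, here is a minimal sketch using the same illustrative numbers as the budget sketch above.  If emissions decline exponentially at a constant fractional rate starting from a level E, cumulative future emissions are E divided by that rate, so the required rate is simply E divided by the remaining budget, and every year of delay at constant emissions raises it.  All values are assumptions for illustration.

        # Illustrative sketch of how delay tightens required reduction rates.
        # Assumes emissions stay flat until reductions start and then decline
        # exponentially (so cumulative future emissions = emissions / rate).
        # The numbers are the same illustrative ones used in the budget sketch.

        CURRENT_EMISSIONS_GTCO2_YR = 36.0  # assumed annual CO2 emissions
        REMAINING_BUDGET_GTCO2 = 1000.0    # assumed remaining budget for ~2 C

        def required_decline_rate(delay_years):
            """Fractional annual decline needed if reductions start after a delay."""
            budget_left = REMAINING_BUDGET_GTCO2 - delay_years * CURRENT_EMISSIONS_GTCO2_YR
            if budget_left <= 0:
                return float("inf")  # budget exhausted before reductions begin
            return CURRENT_EMISSIONS_GTCO2_YR / budget_left

        for delay in (0, 5, 10, 15):
            print(f"Start in {delay:2d} years: about {required_decline_rate(delay):.1%} per year")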

    Victor also expresses strong views of how international agreements come about, based on his extensive study of historical developments in this area.  It is likely, however, that an unprecedented challenge will require us to create international agreements in ways different from how we’ve done things in the past.  We aren’t necessarily constrained by history, and in fact modifying institutional arrangements (like property rights and international agreements) is one of the most important ways to speed up our rate of innovation to meet this challenge. 

    The possibility of such institutional changes is ignored by assumption in the economic modeling exercises cited by Victor in his longer essay.  For this and many other reasons, economic models tend to underestimate the possibilities for change and make alternative futures seem more expensive and difficult than they will be to achieve in reality [11].  Victor seems to believe the exact opposite, that the models are too optimistic about the possibilities for change. In support of his belief he cites a few examples of technologies with limited current application that dominate the modeling results, but does not mention the large literature indicating the inherent pessimism of such modeling exercises.  These models usually ignore the possibilities for energy efficiency improvements, for increasing returns to scale and learning effects, for path dependence, for changes in institutional and individual behavior, and for new mass produced technologies to achieve significant cost reductions [15, 16, 17, 18, 19, 20, 21]. 

    I do think the Victor and Kennel piece in Nature contributes something useful to the discussion, in the form of alternative metrics to supplement the 2 C limit.  But there’s no reason to abandon one of the few bright spots in the entire climate agenda because two researchers have a rather narrow idea of how international agreements should be negotiated.  Alternative metrics are useful and important, but they are a supplement that adds degrees of freedom to the negotiations.  They cannot replace the 2 C limit, nor should they.

    The warming limit approach is the most powerful analytical way of thinking about the climate problem that the climate science and policy community has yet devised.  So the answer is not to “ditch the 2 C limit”, but to use it to show (in Victor and Kennel’s words) that “politicians …pretend that they are organizing for action when, in fact, most have done little.” 

    The warming limit framing makes it abundantly clear that emissions reductions efforts to date are inadequate to meet the stated goal (see the discussion of “stranded assets” by McKibben [13] and Gore [14] for concrete evidence of this reality).   However, this failing is not the fault of the 2 C limit or the mode of analysis it enables, as Victor and Kennel imply.  Instead, it is the fault of those who allow this charade to continue.  The answer is therefore not to abandon this way of thinking about the climate problem, but to use it to argue for rapid and measurable reductions, starting now, and to expose as charlatans those who claim to be concerned about climate disruption but are unwilling to do what it takes to avoid it.  There is nothing better than the 2 C limit for making that case. 

    The Victor and Kennel article assumes that the 2 C limit is the cause of global inaction on emissions reductions, and that developing a new framework and associated metrics can somehow break the logjam.  I suggest instead that the lack of progress is in spite of the power of the warming limit framing, and that it owes more to the challenge of global elites confronting powerful corporations and countries who face the prospect of trillions of dollars in stranded assets and are fighting like hell to avoid that outcome. 

    The alternative to facing this difficult political challenge is allowing emissions trends to continue that will make the orderly development of human civilization as we have known it all but impossible by the end of this century.  A stark choice, but we will either reduce our emissions rapidly (which will require big changes in how society operates) or our current path will force upon us bigger (and far less manageable) changes.  That’s the reality that the warming limit framing makes clear, and ditching the warming limit won’t change that reality.

    Corrigendum:  An earlier version of this post incorrectly attributed the Real Climate article to Stephan Lewandowsky.  The actual author was Stefan Rahmstorf.  My apologies to Stephan and Stefan for the misattribution.

    References

    1.         Victor, David G., and Charles F. Kennel. 2014. “Climate policy: Ditch the 2 °C warming goal.”  Nature.  vol. 514, no. 7520. October 2. pp. 30-31. [http://www.nature.com/news/climate-policy-ditch-the-2-c-warming-goal-1.16018]

    2.         Koomey, Jonathan. 2013. “Moving Beyond Benefit-Cost Analysis of Climate Change.”  Environmental Research Letters.  vol. 8, no. 041005. December 2. [http://iopscience.iop.org/1748-9326/8/4/041005/]

    3.         Krause, Florentin, Wilfred Bach, and Jon Koomey. 1989. From Warming Fate to Warming Limit:  Benchmarks to a Global Climate Convention. El Cerrito, CA: International Project for Sustainable Energy Paths. [http://www.mediafire.com/file/pzwrsyo1j89axzd/Warmingfatetowarminglimitbook.pdf]

    4.         Krause, Florentin, Wilfred Bach, and Jonathan G. Koomey. 1992. Energy Policy in the Greenhouse. NY, NY: John Wiley and Sons.

    5.         Caldeira, Ken, Atul K. Jain, and Martin I. Hoffert. 2003. “Climate Sensitivity Uncertainty and the Need for Energy Without CO2 Emission.”  Science.  vol. 299, no. 5615. pp. 2052-2054. [http://www.sciencemag.org/cgi/content/abstract/299/5615/2052]

    6.         Meinshausen, Malte, Nicolai Meinshausen, William Hare, Sarah C. B. Raper, Katja Frieler, Reto Knutti, David J. Frame, and Myles R. Allen. 2009. “Greenhouse-gas emission targets for limiting global warming to 2 degrees C.”  Nature.  vol. 458, April 30. pp. 1158-1162. [http://www.nature.com/nature/journal/v458/n7242/full/nature08017.html]

    7.         IEA. 2010. World Energy Outlook 2010. Paris, France: International Energy Agency, Organization for Economic Cooperation and Development (OECD).  November 9. [http://www.worldenergyoutlook.org/]

    8.         IEA. 2011. World Energy Outlook 2011. Paris, France: International Energy Agency, Organization for Economic Cooperation and Development (OECD).  November 9. [http://www.worldenergyoutlook.org/]

    9.         IEA. 2012. World Energy Outlook 2012. Paris, France: International Energy Agency, Organization for Economic Cooperation and Development (OECD).  November 12. [http://www.worldenergyoutlook.org/]

    10.       Luderer, Gunnar, Robert C. Pietzcker, Christoph Bertram, Elmar Kriegler, Malte Meinshausen, and Ottmar Edenhofer. 2013. “Economic mitigation challenges: how further delay closes the door for achieving climate targets.”  Environmental Research Letters.  vol. 8, no. 3. September 17. [http://iopscience.iop.org/1748-9326/8/3/034033/article]

    11.       Koomey, Jonathan G. 2012. Cold Cash, Cool Climate:  Science-Based Advice for Ecological Entrepreneurs. Burlingame, CA: Analytics Press. [http://www.analyticspress.com/cccc.html]

    12.       Rogelj, Joeri, Malte Meinshausen, Jan Sedláček, and Reto Knutti. 2014. “Implications of potentially lower climate sensitivity on climate projections and policy.”  Environmental Research Letters.  vol. 9, no. 3. pp. 031003. [http://stacks.iop.org/1748-9326/9/i=3/a=031003]

    13.       McKibben, Bill. 2012. “Global Warming’s Terrifying New Math.”  Rolling Stone.  July 19. [http://www.rollingstone.com/politics/news/global-warmings-terrifying-new-math-20120719]

    14.       Gore, Al, and David Blood. 2013. “The Coming Carbon Asset Bubble.”  The Wall Street Journal (online).  October 29. [http://online.wsj.com/news/articles/SB10001424052702304655104579163663464339836?mod=hp_opinion]

    15.       Ackerman, Frank, Stephen J. DeCanio, Richard B. Howarth, and Kristen Sheeran. 2009. “Limitations of Integrated Assessment Models of Climate Change.”  Climatic Change.  vol. 95, no. 3-4. August. pp. 297-315.

    16.       Ackerman, Frank, Elizabeth A. Stanton, Stephen J. DeCanio, Eban Goodstein, Richard B. Howarth, Richard B. Norgaard, Catherine S. Norman, and Kristen A. Sheeran. 2009. The Economics of 350: The Benefits and Costs of Climate Stabilization. Portland, OR: Economics for Equity and Environment.  October. [http://www.e3network.org/papers/Economics_of_350.pdf]

    17.       DeCanio, Stephen J. 2003. Economic Models of Climate Change:  A Critique. Basingstoke, UK: Palgrave-Macmillan.

    18.       Laitner, John A. “Skip”, Stephen J. DeCanio, Jonathan G. Koomey, and Alan H. Sanstad. 2003. “Room for Improvement:  Increasing the Value of Energy Modeling for Policy Analysis.”  Utilities Policy (also LBNL-50627).  vol. 11, no. 2. June. pp. 87-94.

    19.       Koomey, Jonathan. 2002. “From My Perspective:  Avoiding “The Big Mistake” in Forecasting Technology Adoption.”  Technological Forecasting and Social Change.  vol. 69, no. 5. June. pp. 511-518.

    20.       Scher, Irene, and Jonathan G. Koomey. 2011. “Is Accurate Forecasting of Economic Systems Possible?”  Climatic Change.  vol. 104, no. 3-4. February. pp. 473-479. [http://link.springer.com/article/10.1007%2Fs10584-010-9945-z]

    21.       Krause, Florentin, Paul Baer, and Stephen DeCanio. 2001. Cutting Carbon Emissions at a Profit:  Opportunities for the U.S. El Cerrito, CA: International Project for Sustainable Energy Paths.  May. [http://www.mediafire.com/file/0aro7bj2d7kqk8w/ipsepcutcarbon_us.pdf]

  2. Google ditches ALEC, finally. Facebook, Yahoo, and Yelp follow suit

    Google just quit ALEC, and Chairman Eric Schmidt explained why, in very forthright terms:

    Google’s controversial decision to fund the American Legislative Exchange Council (ALEC) was a “mistake,” company chairman Eric Schmidt admitted on Monday, saying the group is spreading lies about global warming and “making the world a much worse place.”

    Facebook followed suit, as did Yelp, Yahoo, and some other tech companies.

    The reasoning for dropping ALEC is virtually identical to what my fellow Google Science Fellows and I explained in the open letter and associated essay we sent to Google back on August 1, 2013, in an effort to get the company to drop its active fundraising for Senator James Inhofe:

    Climate change is a grave moral challenge that cannot be addressed without smart government policy, corporate innovation, and public participation.  Leaders and citizens must collaborate in ways that transcend differences, and call out those who impede progress by denying the reality of the problem.

    Recently, Google Inc. failed in this duty by hosting a July 11, 2013 fund-raiser in support of Oklahoma Senator James Inhofe’s re-election campaign.  The political gridlock that has derailed efforts to address climate change in the US owes much to Senator Inhofe.  His denial of the problem and fact-free assaults on the scientific community are designed to promote political dysfunction, to destroy the reputation of scientists, and to undermine our ability to find common ground.

    Such strategies conflict with Google’s successful evidence-based, problem-solving culture, and are arguably contrary to its corporate philosophy of “Don’t Be Evil.”

    Pretty simple, actually.  Don’t lie, and don’t tolerate, enable, or support those who do, even if it’s advantageous to you in the short run.  My kindergarteners are starting to understand that.  Hopefully more companies will, too.

    Further reading

    My June 18, 2012 blog post on intellectual honesty.

    My July 3, 2014 blog post on academic integrity and what it means.

  3. Our new article analyzing downloading console games versus shipping them on discs

    Photo credit:  John McCullough.  Licensed through Creative Commons.

    The Journal of Industrial Ecology just published our article titled “The carbon footprint of games distribution”.  It’s freely downloadable.

    Here’s the summary:

    Summary

    This research investigates the carbon footprint of the lifecycle of console games, using the example of PlayStation®3 distribution in the UK. We estimate total carbon equivalent emissions for an average 8.8-gigabyte (GB) game based on data for 2010. The bulk of emissions are accounted for by game play, followed by production and distribution. Two delivery scenarios are compared: The first examines Blu-ray discs (BDs) delivered by retail stores, and the second, games files downloaded over broadband Internet. Contrary to findings in previous research on music distribution, distribution of games by physical BDs results in lower greenhouse gas emissions than by Internet download. The estimated carbon emissions from downloading only fall definitively below that of BDs for games smaller than 1.3 GB. Sensitivity analysis indicates that as average game file sizes increase, and the energy intensity of the Internet falls, the file size at which BDs would result in lower emissions than downloads could shift either up- or downward over the next few years. Overall, the results appear to be broadly applicable to title games within the European Union (EU), and for larger-than-average sized games in the United States. Further research would be needed to confirm whether similar findings would apply in future years with changes in game size and Internet efficiency. The study findings serve to illustrate why it is not always true that digital distribution of media will have lower carbon emissions than distribution by physical means when file sizes are large.

    These findings are contrary to the naive idea that downloading information is ALWAYS environmentally preferable to delivering it via physical media.  The issue is that the allocated electricity use and emissions grow in proportion to file size, and that large enough file sizes can offset the benefits of downloading.
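
    As a rough illustration of that proportionality, here is a sketch in which download emissions scale with file size while the disc’s footprint is treated as fixed, so the crossover file size falls out directly.  This is not a reproduction of the article’s model, which covers many more lifecycle stages; all parameter values below are placeholder assumptions.

        # Toy comparison of download vs. disc emissions. All values are placeholder
        # assumptions, not parameters from the published article, which also accounts
        # for disc production, packaging, retail transport, and game play.

        INTERNET_KWH_PER_GB = 0.8   # assumed energy intensity of data transfer
        GRID_KG_CO2_PER_KWH = 0.5   # assumed carbon intensity of electricity
        DISC_KG_CO2_FIXED = 0.55    # assumed fixed footprint of one Blu-ray disc
                                    # (manufacture, packaging, retail logistics)

        def download_kg_co2(file_size_gb):
            """Emissions that grow in proportion to file size."""
            return file_size_gb * INTERNET_KWH_PER_GB * GRID_KG_CO2_PER_KWH

        def crossover_gb():
            """File size above which the disc has the lower footprint."""
            return DISC_KG_CO2_FIXED / (INTERNET_KWH_PER_GB * GRID_KG_CO2_PER_KWH)

        print(f"8.8 GB download: about {download_kg_co2(8.8):.1f} kg CO2e")
        print(f"Crossover at about {crossover_gb():.1f} GB")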

    Here’s what I wrote several years ago about this issue, focusing on our earlier study about downloading music versus buying it on physical media:

    …consider downloading music versus buying it on a CD.  A study that is now “in press” at the peer-reviewed Journal of Industrial Ecology showed that the worst case for downloads and the best case for physical CDs resulted in 40% lower emissions of greenhouse gases for downloads when you factor in all parts of the product lifecycle (Weber et al. 2009). When comparing the best case for downloads to the best case for physical CDs, the emissions reductions are 80%.  Other studies have found similar results (see Turk et al. 2003, Sivaraman et al. 2007, Gard and Keoleian 2002, and Zurkirch and Reichart 2000).  In general, moving bits is environmentally preferable to moving atoms, and whether it’s dematerialization (replacing materials with information) or reduced transportation (from not having to move materials or people, because of electronic data transfers or telepresence) IT is a game changer.

    Our more recent work on downloading console games made me qualify these conclusions more carefully.  Downloads of small files are often environmentally preferable, but for larger file sizes the situation can be reversed.  As the Internet improves in efficiency, larger files can be downloaded with less energy, but file sizes also increase over time, as programming becomes more sophisticated and more high-definition content is included in such downloads.  Data density on Blu-ray discs also increases over time, though not quite as quickly as Internet data transfer efficiency seems to improve.

    Finally, this research raises an important point about how emissions from networked activities should be allocated.  In the life cycle assessment (LCA) community there is ongoing debate between those who prefer what’s called “consequential” LCA and those who favor “attributional” LCA.  

    The first approach assesses the marginal effect on energy intensity of changes in network demand (i.e., the direct consequences of that change in demand), ignoring the fixed energy use associated with keeping the network running.  The problem is that fixed energy use accounts for almost all of the energy use of current networks, and energy use doesn’t vary much as load changes on a given network.  Of course, if network traffic increases enough, more equipment needs to be added, so the medium-term marginal change in intensity is higher than in the short run.  And in the long run, network technologies change, introducing additional complexity.

    The attributional LCA approach allocates the fixed energy use based on some measure of the service demand, in this case gigabytes (GBs) of data transferred.  This approach is the preferred one from my perspective, and it’s the one we used in this and other related analyses.

    To illustrate this distinction in another way, consider the energy used for a subway train.  The energy to move the train doesn’t vary much at all if I step onto it, but somehow that energy needs to be allocated.  A consequential LCA approach would just calculate the tiny incremental increase in energy caused by my additional mass on the train.  An attributional LCA would instead allocate all of the energy of the train over some metric of service delivered, like passenger kilometers.
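
    A minimal sketch of the difference between the two approaches, using invented network totals (none of these numbers come from our article), might look like this:

        # Contrast of consequential vs. attributional allocation for one download.
        # The network totals and intensities below are invented for illustration.

        NETWORK_FIXED_KWH_PER_YR = 1.0e9   # assumed fixed energy to run the network
        NETWORK_TRAFFIC_GB_PER_YR = 2.0e9  # assumed total traffic carried per year
        MARGINAL_KWH_PER_GB = 0.02         # assumed short-run marginal energy per GB

        def consequential_kwh(download_gb):
            """Counts only the marginal energy caused by the extra traffic."""
            return download_gb * MARGINAL_KWH_PER_GB

        def attributional_kwh(download_gb):
            """Adds a per-GB share of the fixed energy to the marginal energy."""
            fixed_share = NETWORK_FIXED_KWH_PER_YR / NETWORK_TRAFFIC_GB_PER_YR
            return download_gb * (MARGINAL_KWH_PER_GB + fixed_share)

        size_gb = 8.8  # an average game download of the size discussed above
        print(f"Consequential: about {consequential_kwh(size_gb):.2f} kWh")
        print(f"Attributional: about {attributional_kwh(size_gb):.2f} kWh")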

    Please look at our article, which is freely downloadable, and send me comments!

  4. One key issue missing from a recent Dot Earth blog post on the carbon commitment from electricity investments

    Yesterday Andy Revkin posted a very useful discussion among high-level experts about the issue of “climate commitment”, raised in a recent article in Environmental Research Letters by Davis and Socolow.  Both the blog post and the journal article are worth a read to get a sense for the complexities associated with displacing fossil fuels globally, particularly in the developing world.

    The “carbon commitment” is a logical result of the growing focus on cumulative emissions and a useful heuristic that helps people think longer term about the climate implications of our energy investment decisions.  For historical context on the evolution of thinking that led us to this point, see my 2013 article in Environmental Research Letters (Koomey 2013) and the associated blog post.
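
    To make the commitment idea concrete, here is a minimal sketch of the kind of commitment accounting Davis and Socolow describe, applied to a single new coal plant.  The plant parameters are illustrative assumptions, not values from their article.

        # Illustrative commitment accounting: CO2 "committed" by building one new
        # coal plant and running it for its expected lifetime. All plant parameters
        # below are assumptions used only for illustration.

        CAPACITY_MW = 1000.0        # assumed plant capacity
        CAPACITY_FACTOR = 0.6       # assumed fraction of hours at full output
        EMISSIONS_T_PER_MWH = 0.95  # assumed tonnes of CO2 per MWh for coal
        LIFETIME_YEARS = 40.0       # assumed operating lifetime
        HOURS_PER_YEAR = 8766.0

        annual_mwh = CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR
        committed_gtco2 = annual_mwh * EMISSIONS_T_PER_MWH * LIFETIME_YEARS / 1.0e9

        print(f"Committed emissions: about {committed_gtco2:.2f} GtCO2 over the plant's lifetime")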

    The issue that I didn’t see raised in the responses so far is whether coal fired electricity is actually a net benefit to society after you correctly account for the external costs associated with such generation.  I visited Beijing in February 2014 during what was then the worst week yet for air pollution (perhaps those levels have been exceeded since; I don’t know), and having 30 million people living in such conditions is horrifying.  The official GDP statistics are perverse in that they count visits to the doctor for air pollution related illnesses as additions to GDP but ignore the actual costs to society in lost productivity and lost life.

    The economic literature on this for the US is quite clear:  Coal fired electricity delivers negative net value added.  This was the conclusion of Muller et al. 2011 writing in the American Economic Review, and this result is strongly supported by Epstein et al. 2011 in the Annals of the New York Academy of Sciences.  What has been less well studied is the economics of coal fired electricity in developing countries.  In those countries the coal is often dirtier and the pollution controls mostly nonexistent, but the population is at a point in their economic development where energy is quite valuable, so I don’t think one can say a priori whether the US results would also apply to those countries.   I think we CAN say with certainty that the actual GDP growth rate for China and other countries with comparable air pollution is much lower than what the official statistics state, because these external costs are not currently being counted.  
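
    The accounting behind that conclusion is simple to sketch.  The dollar figures below are placeholders chosen only to illustrate the arithmetic; they are not values from Muller et al. or Epstein et al.

        # Illustrative net-value-added accounting for coal fired electricity.
        # All dollar figures are placeholder assumptions, not results from the
        # studies cited above.

        VALUE_ADDED_USD_PER_MWH = 35.0  # assumed market value added by generation
        EXTERNAL_DAMAGES_USD_PER_MWH = {
            "air pollution health effects": 30.0,  # assumed
            "climate damages": 20.0,               # assumed
        }

        gross_damages = sum(EXTERNAL_DAMAGES_USD_PER_MWH.values())
        net_value_added = VALUE_ADDED_USD_PER_MWH - gross_damages

        print(f"Gross external damages: ${gross_damages:.0f} per MWh")
        print(f"Net value added: ${net_value_added:.0f} per MWh (negative means a net cost)")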

    Discussion of the carbon commitment article really must acknowledge the importance of these external costs. Some of the commenters implicitly assumed that coal fired electricity delivered net benefits for developing countries, and that’s not necessarily true, given what we know about the costs of coal pollution to society in the developed world.  We do need additional studies of those costs in developing countries, but anyone who’s traveled recently in coal-dependent places knows that those external costs are large and mostly uncounted, and without a doubt would reduce the GDP growth reported by those countries if they were properly internalized. 

    Coal is not cheap when you count the pollution costs, and debates like these need to reflect that reality.  Use hashtag #coalisnotcheap whenever you tweet about articles online about coal’s external costs, so it will be easy to compile those stories in the future.

    References

    Epstein, Paul R., Jonathan J. Buonocore, Kevin Eckerle, Michael Hendryx, Benjamin M. Stout III, Richard Heinberg, Richard W. Clapp, Beverly May, Nancy L. Reinhart, Melissa M. Ahern, Samir K. Doshi, and Leslie Glustrom. 2011. “Full cost accounting for the life cycle of coal.”  Annals of the New York Academy of Sciences.  vol. 1219, no. 1. February 17. pp. 73-98. [http://dx.doi.org/10.1111/j.1749-6632.2010.05890.x]

    Koomey, Jonathan. 2013. “Moving Beyond Benefit-Cost Analysis of Climate Change.”  Environmental Research Letters.  vol. 8, no. 041005. December 2. [http://iopscience.iop.org/1748-9326/8/4/041005/].

    Muller, Nicholas Z., Robert Mendelsohn, and William Nordhaus. 2011. “Environmental Accounting for Pollution in the United States Economy.”  American Economic Review.  vol. 101, no. 5. August. pp. 1649-1675. [https://www.aeaweb.org/articles.php?doi=10.1257/aer.101.5.1649]

  5. Upcoming class: Data center essentials for executives

    CERN data center

    Photo credit: By Hugovanmeijeren (Own work) [GFDL or CC-BY-SA-3.0-2.5-2.0-1.0], via Wikimedia Commons 

    I’ve been struggling for years to convince executives in large enterprises to fix the incentive, reporting, and other structural problems in data centers.  The folks in the data center know that there are issues (like having separate budgets for IT and facilities) but fixing those problems is “above their pay grade”.  That’s why we’ve been studying the clever things eBay has done to change their organization to take maximal advantage of IT, as summarized in this case study from last fall:

    Schuetz, Nicole, Anna Kovaleva, and Jonathan Koomey. 2013. eBay: A Case Study of Organizational Change Underlying Technical Infrastructure Optimization. Stanford, CA: Steyer-Taylor Center for Energy Policy and Finance, Stanford University.  September 26. 

    That’s also why I’ve worked with Heatspring to develop the following course:

    Upcoming online class (Nov 10 to Dec 12, 2014): Data center essentials for executives.

    Here’s the course description:

    Spend five weeks learning from Jonathan Koomey, Ph.D., a Research Fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University, and one of the foremost international experts on data center energy use, efficiency, organization, and management. Jon, along with help from other industry leaders, has developed this course to help experienced executives bring their internal information technology (IT) organizations into the 21st century. For the Capstone Project, students will propose management changes to their own organizations to help increase business agility, reduce costs, and move their internal IT organization from being a cost center to a profit center.

    I’m excited about this class, but to make it happen, we need lots of signups by mid-September.  Please spread the word by sending this to upper-level management in the company where you work.

  6. A short discussion of academic integrity and what it means

    I taught a class called Energy and Society as a visiting professor at UC Berkeley back in Fall 2011.  It was a kind of homecoming for me as I had taken the graduate version of that class at the Energy and Resources Group from John Holdren and the late Mark Christensen back in 1984.

    At some point in the semester I gave an impromptu lecture on academic integrity, and I recently ran across a recording of that lecture by chance. It struck me as a nice concise summary that others might find useful. Here’s an edited version:  

    It’s about the right time in the semester to remind everybody about academic integrity. It’s important that any work that you submit as yours should be your own individual thoughts—not thoughts lifted from other people in any way, shape, or form. When you do assignments you have to use what’s called “proper attribution”, and by that we mean quoting accurately and making sure that somebody who reads what you write can trace back to the original source who said what.

    I have been doing work on a paper recently with the historian Richard Hirsh, who never ever, ever uses quotes second hand. So if he hears somebody has quoted a particular person, the only way he ever uses that quote in his work is if he can look at the wording of the quote and the context in the original source, to make sure that he’s got it right. That turns out to be important, because you find mistakes all over the place.

    For example, the old White’s Law that we talked about earlier in the class? The relationship between culture and energy? The original slide that I presented came from last year’s class and was dated 1973. It turned out to come from a paper in 1943. So there was a little typo. And so going back to the original source, Richard figured that out. It’s good to be a history professor; you have time to track these things down.

    You have to use proper attribution—if you aren’t sure what that is then go to this site or this site and they will tell you a bit about that. It’s also important that you don’t work with other students or collaborate on assignments unless you’re given permission or instruction to do that. You need to do your own work and you need to make sure that whatever work you use to support your own work is properly attributed.

    You should take this issue seriously. The University is a test bed for real life.  As an undergrad, you need to experiment and try different things, but there are consequences—both here and in real life—for not following these rules and not guarding your academic integrity with great care.

    Reputation is a precious and perishable thing, and if you use someone else’s work without attribution, then you are impugning your own intellectual integrity: you are hurting yourself. In the real world, if you are a scientist and somebody finds out that you have copied data, you are ruined. You are ruined. There is no way to recover from that as a scientist. You might be able to do some work in another field, but no one’s ever going to trust you again.

    Academic integrity is about doing the right thing even when it is not convenient to do the right thing, to mean what you say and say what you mean, and follow through when you make a promise to someone. That’s all part of integrity. That’s all part of making sure that when people see your work they say: “I believe it”. And they will check it—in science especially they will always check it—but they will have an underlying confidence that because you’ve done your work with integrity in the past—every time they’ve checked it in the past it’s worked out well—they will believe your work and trust it and use it to support theirs. It’s a critical thing both personally and professionally. Please keep that in mind. If you have questions about this issue or about the rules about academic misconduct at UC Berkeley, please check this website.

    See also my post on What is Intellectual Honesty and Why is it Important?.

  7. My podcast interview with Tom Bowman titled “How Can We Accelerate Carbon Reductions?”

    My interview with Tom Bowman about “the unique role entrepreneurs play in climate action” was posted this past Saturday. It turned out very nicely and required little or no editing, so I guess I was on a roll. Please listen, send comments, and spread the word!

    In the interview I talk about why economic models underestimate the scope and possibilities for change. I also explore why entrepreneurs are a crucial part of the solution. And I describe why hope is really the only choice in the face of climate change, the ultimate adaptive challenge.

    Tom is founder and CEO of Bowman Change, Inc., a consultancy dedicated to helping organizations reap the benefits of working with purpose—making social issues and environmental change central to their missions. His podcast series on climate solutions is extensive and interesting.

  8. A closer look at funding for Bjorn Lomborg

    Graham Readfearn over at Desmogblog.com has done the most detailed exposition to date of the various ways that one of the most famous “lukewarmists”, Bjorn Lomborg, gets his money (read more about the lukewarmists).  Students of misinformation know well that Lomborg is a prolific producer of half-truths and cherry-picked conclusions.  Unfortunately, the media lap it up.

    Here are a few key paragraphs from the Desmogblog article:

    The impression back in 2012 might have been that Lomborg’s think tank was struggling for cash, but a DeSmogBlog investigation suggests the opposite.

    The nonprofit Copenhagen Consensus Center (CCC) has spent almost $1 million on public relations since registering in the US in 2008. More than $4 million in grants and donations have flooded in since 2008, three quarters of which came in 2011 and 2012.

    In one year alone, the Copenhagen Consensus Center paid Lomborg $775,000. 

    It’s important to follow the money, as Readfearn has done, to determine who’s supporting the most prominent skeptics.  Almost always the trail leads back to the status quo interests who want to keep earning profits from fossil fuel infrastructure as long as they can.

    Addendum, June 26, 2014:  Joe Romm at Climate Progress has gone into more detail about funding for Lomborg, indicating that some of the usual status quo suspects are behind these developments.

  9. Risky Business: Documenting the economic risks associated with climate change

    The new Risky Business report was released today.  Worth a read.  Here’s a paragraph motivating the report’s conclusions:

    Climate Change: Nature’s Interest-Only Loan

    Our research focuses on climate impacts from today out to the year 2100, which may seem far off to many investors and policymakers. But climate impacts are unusual in that future risks are directly tied to present decisions. Carbon dioxide and other greenhouse gases can stay in the atmosphere for hundreds or even thousands of years. Higher concentrations of these gases create a “greenhouse effect” and lead to higher temperatures, higher sea levels, and shifts in global weather patterns. The effects are cumulative: By not acting to lower greenhouse gas emissions today, decision-makers put in place processes that increase overall risks tomorrow, and each year those decision-makers fail to act serves to broaden and deepen those risks. In some ways, climate change is like an interest-only loan we are putting on the backs of future generations: They will be stuck paying off the cumulative interest on the greenhouse gas emissions we’re putting into the atmosphere now, with no possibility of actually paying down that “emissions principal.”

    Our key findings underscore the reality that if we stay on our current emissions path, our climate risks will multiply and accumulate as the decades tick by.

    By putting the risks in financial terms, this report makes clear what’s at stake.  “Staying the course” has real costs and risks; it’s not just the alternative future that costs something.  And all credible analyses show that the incremental costs of making the changes we need are modest (at most 1-2% of GDP, but very likely much less than that, for reasons that I can explain to anyone who’s interested in the details).

    Download the full report.

  10. National Geographic changes its Arctic maps because the Arctic ice cover has melted so much in recent decades

    For the tenth edition of its National Geographic Atlas of the World, this venerable institution has now altered the way the Arctic ice appears on its maps.  For details go to “Shrinking Arctic Ice Prompts Drastic Change in National Geographic Atlas.”

    Here’s an animated GIF illustrating the changes.
