A short framing article about energy use and AI, just released

On February 10th, Veolia and Microsoft released a report on AI and energy, water, and waste management. One of the short chapters in that report is a framing article about AI and energy use by me and Eric Masanet at UC Santa Barbara.

The article distinguishes three ways AI might affect energy use:

• Direct effects of AI operations

• Effects of applying AI to energy-related activities

• Interactive systemic effects of AI on the broader economy

Figure 1 from the article illustrates these three potential effects and the uncertainties affecting each of them.

Conceptual figure illustrating three ways AI might affect energy and the uncertainties related to these effects

Here's the abstract:

Many recent assessments of the effects of artificial intelligence (AI) systems lack rigor. The electricity use and emissions of AI operations are often viewed as the most salient issues, but use of AI systems can have important effects when they are deployed, and such deployments can lead to complicated systemic interactions between AI systems, the broader energy system, and the economy as a whole.
All effects of AI deployment are subject to deep uncertainty, but analyzing the effects of AI operations is usually the most feasible. Human understanding of the effects of AI deployments on specific domains and on interactions with the broader economy is in its infancy, but we know that these effects could either increase societal energy use (e.g., by making fossil fuel or geothermal extraction cheaper, or fueling increased consumer consumption by more targeted advertising) or decrease societal energy use (e.g., by enabling deployment of batteries to increase renewable energy adoption, which is more efficient than thermal plants on a primary energy basis, or improving efficiency throughout the broader economy). It is impossible to know in advance the sign of the net effect over the long term. 
For these less well understood effects, researchers should design consistent test cases, focusing on measuring economic, energetic, and environmental parameters before and after the deployment of new AI systems. For testing interactions, new kinds of large-scale economic models may be needed, as current models do not represent the effects of technology changes in a sufficiently detailed and systematic way.

The full reference is

Koomey, Jonathan, and Eric Masanet. 2026. Understanding AI energy use (part of a special report on AI for energy, water and waste management). Veolia and Microsoft. February 10. [https://www.institut.veolia.org/en/publications/veolia-institute-review-facts-reports/ai-energy-water-and-waste-management]

Addendum (February 13, 2026)

The three ways AI affects energy use mirror the structure of this excellent article:

Kaack, Lynn H., Priya L. Donti, Emma Strubell, George Kamiya, Felix Creutzig, and David Rolnick. 2022. "Aligning artificial intelligence with climate change mitigation." Nature Climate Change. vol. 12, no. 6. June 2022. pp. 518-527. [https://doi.org/10.1038/s41558-022-01377-7]

In much earlier drafts (a few years ago) we referenced this article and should have done so in the final version, but somehow the reference got dropped after many iterations. Apologies to those authors; we will make sure to include that reference in any follow-on work.

Example dashboards from our decomposition tools for long-term emissions scenarios

In the previous post, we announced our new Python package that helps modelers explore the key greenhouse gas emissions drivers in their scenarios. This post shows examples of visual dashboards that these tools allow you to create.

There are three main dashboards, as documented in Koomey et al. 2019 and Koomey et al. 2022. The first shows what we call "Kaya factors", like population, gross world product, final energy, primary energy, fossil fuel primary energy, total fossil carbon emissions, and net fossil carbon emissions after accounting for sequestration. We use runs from IMAGE 3.0.1. The baseline is SSP2 and the intervention case is IMA15-TOT, a scenario that keeps global temperatures from exceeding 1.5°C. The runs are documented in van Vuuren et al. 2018.

The second dashboard shows what we call the "Kaya ratios", which are the terms in the expanded Kaya identity. These include population, economic activity per person, final energy per dollar of economic activity, primary energy per unit of final energy, the fossil fuel fraction of primary energy, total fossil carbon per unit of primary energy, and the ratio of fossil carbon reaching the atmosphere to the fossil carbon combusted in the energy system.
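As a minimal sketch of how these ratios relate to the factors (function and variable names below are illustrative, not the package's actual API), each term in the second dashboard is a quotient of adjacent quantities from the first:

```python
# Minimal sketch: derive expanded Kaya identity ratios from Kaya factor
# totals for a single scenario year. Names are illustrative and do not
# reflect the actual package API.

def kaya_ratios(population, gwp, final_energy, primary_energy,
                fossil_primary_energy, total_fossil_carbon,
                net_fossil_carbon):
    """Return the seven expanded Kaya identity terms from factor totals."""
    return {
        "population": population,
        "gwp_per_person": gwp / population,
        "final_energy_per_gwp": final_energy / gwp,
        "primary_per_final_energy": primary_energy / final_energy,
        "fossil_fraction_of_primary": fossil_primary_energy / primary_energy,
        "carbon_per_fossil_primary": total_fossil_carbon / fossil_primary_energy,
        "net_to_combusted_carbon": net_fossil_carbon / total_fossil_carbon,
    }

# Illustrative (made-up) values, roughly world-scale units:
ratios = kaya_ratios(
    population=8.0e9,             # people
    gwp=1.0e14,                   # dollars of gross world product
    final_energy=400.0,           # EJ of final energy
    primary_energy=600.0,         # EJ of primary energy
    fossil_primary_energy=480.0,  # EJ supplied by fossil fuels
    total_fossil_carbon=10.0,     # Gt C combusted in the energy system
    net_fossil_carbon=9.5,        # Gt C reaching the atmosphere
)

# The terms telescope, so their product recovers net fossil carbon emissions:
product = 1.0
for value in ratios.values():
    product *= value
print(round(product, 6))
```

Because the ratios telescope, multiplying all seven terms recovers net fossil carbon emissions, which makes this a handy consistency check when troubleshooting scenario data.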

The third dashboard summarizes our "fully expanded decomposition", which includes the energy sector results in one pane, along with additive results for biomass CCS, land use, industrial process carbon dioxide emissions, and emissions of gases other than CO2 (other agents). The intervention scenario in this case has no biomass CCS and little change in industrial process emissions.

These dashboards together give a complete high-level picture of the evolution and emissions of the global economy for a business-as-usual scenario and an emissions reduction scenario. Of course it's always possible to dig deeper, but these three dashboards are a great place to start. We hope that automating the creation of such dashboards will enable much faster troubleshooting and high-level analysis of scenarios.

To view the notebook that explains how to make these graphs, go here.

To download the Excel workbook that contains the original data for the scenario pictured above, go here.

References

Koomey, Jonathan, Zachary Schmidt, Holmes Hummel, and John Weyant. 2019. "Inside the Black Box:  Understanding Key Drivers of Global Emission Scenarios." Environmental Modeling and Software. vol. 111, no. 1. January. pp. 268-281. [https://www.sciencedirect.com/science/article/pii/S1364815218300793]

Koomey, Jonathan, Zachary Schmidt, Karl Hausker, and Dan Lashof. 2022. "Exploring the black box: Applying macro decomposition tools for scenario comparisons." Environmental Modeling and Software. vol. 155, September. [https://doi.org/10.1016/j.envsoft.2022.105426]

van Vuuren, Detlef P., Elke Stehfest, David E. H. J. Gernaat, Maarten van den Berg, David L. Bijl, Harmen Sytze de Boer, Vassilis Daioglou, Jonathan C. Doelman, Oreane Y. Edelenbosch, Mathijs Harmsen, Andries F. Hof, and Mariësse A. E. van Sluisveld. 2018. "Alternative pathways to the 1.5 °C target reduce the need for negative emission technologies." Nature Climate Change. April 13, 2018. [https://doi.org/10.1038/s41558-018-0119-8]

State-of-the-art decomposition tools to help integrated assessment modelers better understand and assess their results

TLDR: Below you can download a new open-source Python package for calculating and comparing key drivers of emissions using outputs from Integrated Assessment Models.

In 2004-2006 I served on the dissertation committee of Holmes Hummel at Stanford University. Holmes's thesis showed how a commonly used identity (called the Kaya Identity) could enable deeper understanding of the energy-sector outputs from Integrated Assessment Models (IAMs). These models help analysts assess key drivers affecting energy use and emissions in long-term greenhouse gas emissions scenarios.

The most common version of the Kaya Identity is the four-factor version, which reads as follows:

Four factor Kaya identity, showing how energy-sector CO2 emissions relate to population, wealth per person, primary energy use per dollar, and carbon intensity of primary energy, respectively.
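For readers who want the algebra, the four-factor identity (using illustrative symbols: $P$ for population, $G$ for gross world product, $E$ for primary energy, and $C$ for energy-sector CO2 emissions) is:

```latex
C = P \times \frac{G}{P} \times \frac{E}{G} \times \frac{C}{E}
```

Each ratio cancels against its neighbor, so the identity holds by construction; its value lies in separating growth drivers from intensity drivers.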

As Holmes showed, the four-factor Kaya identity masks complex system dynamics in energy scenarios, so she created a more comprehensive version, which in its fully developed form looks like this (see Koomey et al. 2019, below):

Expanded Kaya identity, showing how energy-sector CO2 emissions relate to population, wealth per person, final energy use per dollar, energy supply loss factor, the fraction of primary energy supplied by fossil fuels, the carbon intensity of fossil fuels supplied, and the net emissions of CO2 from the energy sector after sequestration, respectively.
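Based on the factor descriptions above, and using our own illustrative notation ($F$ for final energy, $E$ for primary energy, $E_f$ for fossil primary energy, $C$ for fossil carbon combusted, and $C_{net}$ for net emissions after sequestration; the article's exact symbols may differ), the expanded identity reads:

```latex
C_{net} = P \times \frac{G}{P} \times \frac{F}{G} \times \frac{E}{F}
          \times \frac{E_f}{E} \times \frac{C}{E_f} \times \frac{C_{net}}{C}
```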

Holmes finished and defended her dissertation in December of 2006. A few others and I used her tools, and it soon became clear that some additional tweaking was needed. Over many years, Holmes, John Weyant, my colleague Zachary Schmidt, and I developed the analytical tools more fully, culminating in our 2019 refereed journal article laying out the theory and methods supporting this work:

Koomey, Jonathan, Zachary Schmidt, Holmes Hummel, and John Weyant. 2019. "Inside the Black Box:  Understanding Key Drivers of Global Emission Scenarios." Environmental Modeling and Software. vol. 111, no. 1. January. pp. 268-281. [https://www.sciencedirect.com/science/article/pii/S1364815218300793]

One of the key additions was summarizing emissions for all sectors, including the energy sector (as characterized in the expanded Kaya identity), land use, industrial process CO2 emissions, biomass carbon capture and storage (CCS), and emissions of gases other than CO2 (like CH4, N2O, and F-gases). This fully expanded decomposition, which characterizes total carbon-equivalent emissions, is summarized in this equation:
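A reconstruction of that equation, based on the terms described in the surrounding text (the subscript names are ours), takes the additive form:

```latex
C_{total} = C_{fossil} - CS_{biomass} + C_{landuse} + C_{industry} + C_{other}
```

with all terms expressed in carbon-equivalent units.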

C for fossil fuels comes from the equation above. The negative term for CS is carbon sequestration from biomass combustion. For scenarios including direct air capture, an additional negative term for that option would also need to be added.

We applied these tools to two scenarios in our 2022 refereed journal article:

Koomey, Jonathan, Zachary Schmidt, Karl Hausker, and Dan Lashof. 2022. "Exploring the black box: Applying macro decomposition tools for scenario comparisons." Environmental Modeling and Software. vol. 155, September. [https://doi.org/10.1016/j.envsoft.2022.105426]

Holmes built her initial tools in Excel workbooks, and these served well for years, but it proved hard to convince modelers to integrate spreadsheets into their workflows, which were largely automated using Python and other more modern tools. With that reality in mind (and with funding from World Resources Institute) we set out to recreate our tools as a Python package that modelers could just grab and use.

Today we are releasing that Python package for general use.

Virtually all IAMs generate the data required to use our tools, and we stuck closely to the terminology and definitions embodied in IIASA's pyam tools.

To download the Python package directly from PyPI, click here.

To view the Github project page, where you can also download the package, click here.

To view an example notebook in Github showing how to use the tools, click here.

The Python package is licensed under Apache 2.0, which is an open-source license that allows free use, modification, and distribution for commercial or private use. Any contributors automatically grant a royalty-free license to any patented algorithms they add to the software.

To view example dashboards generated by these tools, go here.

We are confident that these tools will facilitate analysis of IAM-based scenarios, assist in troubleshooting those scenarios, and increase understanding of key drivers affecting greenhouse gas emissions.

Please do email us if you have questions or suggestions.

Jonathan Koomey

Zachary Schmidt

An empirical assessment of data center sites using reclaimed municipal wastewater for cooling

With recent growth in the data center industry has come increasing scrutiny of direct water use in these facilities [1]. Data center water use varies by facility type, location, and operational choices [2, 3]. These facilities use water because it’s usually more energy efficient to cool computing equipment with water than with air.

Not all water use is created equal. Some facilities use potable water for cooling, some use surface water or groundwater, and some use recycled water, often reclaimed from municipal wastewater. The environmental effects of data centers using reclaimed municipal wastewater (hereafter “reclaimed water”) are much smaller than those of using other water sources, and this strategy is becoming more widely used. The widespread availability of municipal wastewater treatment infrastructure in cities around the world makes reclaimed water a practical solution that can be implemented in many locations, supporting the global expansion of sustainable data center operations.

This blog post summarizes a brief white paper assessing the prevalence of reclaimed municipal wastewater for data center cooling, focusing on the operations of the top ten companies as assessed by Data Centre Magazine in 2025 (listed in alphabetical order): AWS (Amazon), CyrusOne, Digital Realty, Equinix, GDS, Google, Meta, Microsoft Azure, NTT, and Telehouse. For facilities owned by one company housing computers owned by another company, we ignored the owners of the computing equipment and assigned each facility to its owner/operator.

There is relatively little public data about reclaimed water use for data center cooling, so Zachary Schmidt and I identified publicly available sources from which we could reliably infer the presence or absence of reclaimed water use by facilities owned by these companies around the world. We rely on published reports, news releases, news reports, data center mapping websites, utility bills, utility contracts, conference presentations, and satellite imagery to substantiate the findings. The report links to our sources.

Figure 1 contains the results. As of December 2025, Amazon has the highest number of confirmed sites using reclaimed water for cooling, at 24, with two other companies following with 17 and 13 facilities. Three other companies have between 6 and 8 facilities, with the rest at zero or one facility using reclaimed water. Note that some facilities not using reclaimed water don’t use ANY water onsite for cooling, but that choice generally means cooling for these facilities is less energy efficient than it would be if onsite water were used.

Figure 1: Global tally of data center sites using reclaimed municipal wastewater for cooling

Bar chart showing tally of data center sites using reclaimed water for cooling as of end of 2025

More work is needed to identify locations for which there is no current public information about their use of reclaimed water. Companies using reclaimed water should be happy to publicize it, so we think our relative rank order is unlikely to change much with the addition of new information, but we hope the tally will increase over time. We encourage all data center companies to consider the use of reclaimed water for cooling, as concern over water used by data centers continues to increase.

To download the report, click here.

We are grateful to Amazon Web Services for funding this research.

REFERENCES

1. Shehabi, Arman, Sarah Josephine Smith, Alex Hubbard, Alexander Newkirk, Nuoa Lei, Md AbuBakar Siddik, Billie Holecek, Jonathan G Koomey, Eric R Masanet, and Dale A Sartor. 2024. 2024 United States Data Center Energy Usage Report. Lawrence Berkeley National Laboratory. LBNL-2001637. December 19. [https://eta-publications.lbl.gov/publications/2024-lbnl-data-center-energy-usage-report]

2. Lei, Nuoa, and Eric Masanet. 2022. "Climate- and technology-specific PUE and WUE estimations for U.S. data centers using a hybrid statistical and thermodynamics-based approach." Resources, Conservation and Recycling. vol. 182, July 2022. pp. 106323. [https://www.sciencedirect.com/science/article/pii/S0921344922001719]

3. Lei, Nuoa, Jun Lu, Arman Shehabi, and Eric Masanet. 2025. "The water use of data center workloads: A review and assessment of key determinants." Resources, Conservation and Recycling. vol. 219, June 2025. pp. 108310. [https://www.sciencedirect.com/science/article/pii/S0921344925001892]

New report out today: Electricity Demand Growth and Data Centers: A Guide for the Perplexed

This report is the result of a collaboration between Koomey Analytics and the Bipartisan Policy Center in Washington, DC.

Summary: Recent reports of unprecedented growth in electricity demand from data centers have appeared in many major news outlets. These headlines encapsulate two widely expressed concerns. First, that rising energy demand from data centers could further overburden aging power infrastructure. Second, that this new source of demand could jeopardize efforts to mitigate climate change. This report explores the accuracy of such narratives and explains the key drivers of load growth for data centers. We find that:

National and regional load growth are following different trends: Despite alarming headlines, national electricity demand has not shown rapid growth, although regional variations exist. For example, the states of Virginia and Georgia are experiencing substantial electricity load growth. 

Sources of electricity load growth vary: Data centers are projected to account for at most 25% of electricity demand by 2030, a substantial but not dominant share of new load. Onshoring of manufacturing, electrification of vehicles, and building energy use are expected to contribute much more to electricity demand growth than data centers.  

Future load growth due to data centers is uncertain: The exact trajectory of future electricity use by data centers is unknown due to 1) improvements in AI system efficiency; 2) the unpredictability of demand for AI services; and 3) limits in manufacturing production capacity of AI chips, servers, and associated infrastructure. 

Although data center electricity use is growing again, exactly how that load growth will play out is uncertain. This report puts these uncertainties into context to help inform our nation’s response to load growth to ensure affordable, resilient, reliable U.S. energy. 

Reference: Koomey, Jonathan, Zachary Schmidt, and Tanya Das. 2025. Electricity Demand Growth and Data Centers: A Guide for the Perplexed. Washington, DC: Bipartisan Policy Center. February. [https://bipartisanpolicy.org/report/electricity-demand-growth-and-data-centers/]

We've converted Koomey.com to use Ghost, an open source blogging/newsletter software

When we first created the Koomey.com site circa 2010, we used Tumblr, which was a capable blogging site. We customized the site (with some difficulty) but it mostly performed well for a long time (almost 15 years).

This past summer we started investigating other options, and soon settled on Ghost. Many companies use it to handle newsletters with subscriptions, but it also works well for blog site hosting. It's open source, and pricing is a flat-fee subscription rather than a percentage of revenues as with Substack (although tiers for bigger orgs and sites cost more).

One of the important lessons from recent technology developments is that commercial sites have a life cycle, and in their end stages undergo what Cory Doctorow has called "enshittification". The idea is that new sites launch to please users, but over time they shift more and more toward pleasing their investors, which hurts the user experience as the company sucks more and more revenue from customers. It's not a universal law, but it is often true.

Our shift to Ghost insulates us somewhat from enshittification. Ghost's business model is subscriptions and hosting, and if their hosting becomes problematic we can just spin up our own Ghost instance (it's open source).

We don't anticipate doing paid newsletters, but Ghost will make that easy if we decide to go that route. The switch involved a bunch of futzing, but the site is looking better than ever, and now we can start thinking about how to tweak structure and content to better serve our clients.

As we worked to convert the site to Ghost, we also realized that the nature and purpose of the site had to shift, from being Jon Koomey's personal site to being a corporate site for Koomey Analytics, the small research company that Jon leads. That led to some obvious changes, but we think it holds together.

Expect more changes and improvements in the near future. Please do get in touch with ideas, suggestions, and new data sources. We're always happy to hear from like-minded data and analysis geeks.

Our new report on digital twins for data center operations, out today!

The modern data center lies at the heart of today’s digital global economy, performing computing tasks like e-commerce, communications, search, financial modeling, and artificial intelligence (AI). Data centers undergo constant change, both in the workloads they run 24x7 and the hardware that runs those applications.

Lack of adequate planning and management can lead to under-provisioning, over-heating, and lost capacity, all of which undermine the profitability and sustainability of these critical facilities. Today’s AI and high-performance compute nodes can exacerbate these problems.

When IT loads deviate from the original data center design, stranded power and cooling capacity are the result. A simple analogy helps explain the problem. Most people are familiar with the game of Tetris™, in which blocks fall at a regular pace, and the player’s task is to place those blocks in the correct orientation, filling up the space as thoroughly as possible.

In the simplest case, the blocks are of uniform size and shape (i.e., they conform precisely to what data center designers specified initially), and it’s easy for the user to fill up the space completely. The example on the left-hand side of Figure 1 illustrates this case. On the right-hand side, the Tetris™ player cannot make the shapes fit perfectly because their shapes are random, and they just keep coming. That leaves gaps (white space) between the shapes, which represent lost capacity in the data center. White space above the colored bricks represents unused capacity.

Figure 1: Lost capacity as illustrated by the game of Tetris

Lost data center capacity is exactly analogous to what are often called “zombie servers” in data centers, which are servers using electricity but doing nothing useful. This time it’s part of the data center itself (the cooling and power infrastructure) that is costing money (and lots of it) but not enabling any useful computing.

In this paper, we describe the challenges data center planners face and the potential for digital twins to help better manage data centers over their useful lives. Combining digital twins with computational fluid dynamics software (models that simulate and predict the behavior of airflow and heat in data centers) helps planners and managers save millions of dollars, reduce energy waste, increase profitability, improve data center reliability, predict failures, and lengthen the useful lifespan of costly data center equipment.

Download the report.

Download my talk titled “Fighting Zombie Data Centers with Digital Twins”.

Our new article in Joule titled "To better understand AI’s growing energy use, analysts need a data revolution" was published online at Joule today

Our new article on data needs for understanding AI electricity use came out online today in Joule (link will be good until October 8, 2024). Here’s the summary section:

As the famous quote from George Box goes, “All models are wrong, but some are useful.” Bottom-up AI data center models will never be a perfect crystal ball, but energy analysts can soon make them much more useful for decisionmakers if our identified critical data needs are met. Without better data, energy analysts may be forced to take several shortcuts that are more uncertain, less explanatory, less defensible, and less useful to policymakers, investors, the media, and the public.
Meanwhile, all of these stakeholders deserve greater clarity on the scales and drivers of the electricity use of one of the most disruptive technologies in recent memory. One need only look to the history of cryptocurrency mining as a cautionary tale: after a long initial period of moderate growth, mining electricity demand rose rapidly. Meanwhile, energy analysts struggled to fill data and modeling gaps to quantify and explain that growth to policymakers—and to identify ways of mitigating it—especially at local levels where grids were at risk of stress.
The electricity demand growth potential of AI data centers is much larger, so energy analysts must be better prepared. With the right support and partnerships, the energy analysis community is ready to take on the challenges of modeling a fast moving and uncertain sector, to continuously improve, and to bring much-needed scientific evidence to the table. Given the rapid growth of AI data center operations and investments, the time to act is now.

I worked with my longtime colleagues Eric Masanet and Nuoa Lei on this article.

KQED Forum today about our digital carbon footprint

My friend and colleague Danny Cullenward and I were on KQED Forum this morning, talking about the environmental impacts of our digital lives. Lesley McClurg was the host.

You shouldn’t worry at all about your digital footprint, as we discussed in the show. It’s small and constantly improving, and much of the equipment uses the same amount of electricity when it’s idle as when it’s fully loaded, so your actions won’t change electricity use or emissions.

If you want to take personal action on climate, you should

* Vote against climate deniers and fossil fuel apologists.
* Replace fossil fuel equipment at end of life with electrified equipment. That’s when it’s most cost-effective. Buy heat pumps instead of furnaces, heat pump water heaters instead of conventional water heaters, induction cooktops instead of gas cooktops, heat pump dryers instead of gas dryers, and electric vehicles instead of gasoline or diesel vehicles (if not ready for full electric, buy a plug-in hybrid).
* Fly less.
* Drive less.
* Eat less red meat.
* Vote against climate deniers and fossil fuel apologists again!

Much of what needs to happen is to change our SYSTEMS, which is not under the control of most individuals, but the actions above are both under individual control and highly impactful. For more ideas, see our 2022 book:

Koomey, Jonathan, and Ian Monroe. 2022. Solving climate change: A guide for learners and leaders. Bristol, UK: IOP Publishing. [http://www.solveclimate.org]

If you think new electricity load growth is a crisis, think again

The frenzy over new projections of electricity growth continues to escalate. This excellent episode of the Energy Transition Show is the best counterweight to that crisis mentality that I’ve found. The show notes themselves are extensive for those who want to dig in further.

Short summary: There are many reasons to believe that the utilities fanning the crisis mentality are doing so for self-interested reasons, based on data that are at best incomplete. Don’t take any of these claims at face value.

Related: Our Nature commentary on the need for scenarios to understand the effects of AI on electricity use in the face of deep uncertainty:

Luers, Amy, Jonathan Koomey, Eric Masanet, Owen Gaffney, Felix Creutzig, Juan Lavista Ferres, and Eric Horvitz. 2024. “Will AI accelerate or delay the race to net-zero emissions?” Nature. vol. 628, April 22. pp. 718-720. [https://doi.org/10.1038/d41586-024-01137-x]

Our Nature commentary on AI scenarios, out today

I worked with a stellar team of the world’s top experts on computing’s effect on energy and emissions to craft this commentary for Nature, which came out today (April 22, 2024):

Luers, Amy, Jonathan Koomey, Eric Masanet, Owen Gaffney, Felix Creutzig, Juan Lavista Ferres, and Eric Horvitz. 2024. “Will AI accelerate or delay the race to net-zero emissions?” Nature. vol. 628, April 22. pp. 718-720. [https://doi.org/10.1038/d41586-024-01137-x]

Here’s the bottom line summary:

“Artificial Intelligence (AI) is one of the most disruptive technologies of our time. It’s imperative that decisions around its development and use — today and as it evolves — are made with sustainability in mind. Only through developing a set of standard AI-driven emissions scenarios will policymakers, investors, advocates, private companies and the scientific community have the tools to make sound decisions regarding AI and the global race to net-zero emissions.”

I appeared on the Shift Key Podcast on April 3, 2024, talking about electricity used by AI and ICT, with a focus on load forecasts

This conversation was a fun one. Here’s the description:

Will the rise of machine learning and artificial intelligence break the climate system? In recent months, utilities and tech companies have argued that soaring use of AI will overwhelm electricity markets. Is that true — or is it a sales pitch meant to build more gas plants? And how much electricity do data centers and AI use today?
In this week’s episode, Rob and Jesse talk to Jonathan Koomey, an independent researcher, lecturer, and entrepreneur who studies the energy impacts of the internet and information technology. We discuss why AI may not break the electricity system and the long history of anxiety over computing’s energy use. Shift Key is hosted by Robinson Meyer, executive editor of Heatmap, and Jesse Jenkins, a Princeton professor of energy systems engineering.

Listen on Apple Podcasts.

Listen on Spotify.

Listen on Audible.

An updated vignette of technology change for lighting

In 2011, we replaced lighting cans with LED inserts in our house, instantly reducing lighting energy use by 50% or more. The inserts looked like the ones on the left in the photos below.

Recently (September 2023) I needed to buy a few more to replace some of the old ones that failed. The new ones look like the one on the right in the photos. Both give 700 lumens of light output.

The old ones (with the little wire that screws into the socket) weigh 486 grams, use 11 W, have a color temperature of 3000 K, are about 11.9 cm high, and cost $50 each.

The newest ones weigh 226 grams, use 10 W, have a more pleasing color temperature of 2700 K, are about 6.3 cm tall, cost $11 each, and occupy less than half the volume of the 2011 version.

In a dozen years, the price has come down by a factor of about 4.5, volume and weight are down by a factor of two (making shipping easier and less expensive), efficiency has improved about 10%, and lighting quality has improved. Not too shabby!

Technological progress like this is why Amory Lovins calls efficiency a renewable resource. It keeps getting better and cheaper over time!

For an intermediate look at the state of this technology in 2019, go here.

The 8th annual roundup episode of Chris Nelder's Energy Transition show

Every year since Chris Nelder started the Energy Transition Show, he’s interviewed me for the annual roundup episode, and this year is no exception. We discuss the proper role of government in a capitalist economy, climate change doomism, how the fossil fuel industry rigs the system, and the difficulties of the mid-transition as we shift away from conventional energy systems.

The full episode is here:

[Episode #207] – 8th Anniversary Show | The Energy Transition Show

If you’re not a subscriber, you can still hear a free abridged version on the Apple podcast app and elsewhere.

Our commentary titled “Abandon the idea of an ‘optimal economic path’ for climate policy” came out on July 2nd, 2023 in WIRES Climate Change

I, along with colleagues at World Resources Institute and Koomey Analytics, just had a commentary published in WIRES Climate Change. It’s titled “Abandon the idea of an ‘optimal economic path’ for climate policy”.

Many economic modelers think that if given enough time, money, graduate students, and coffee they can estimate an “optimal economic path” for climate mitigation that extends far into the future. They further argue that this path is the correct or best way to guide climate policy design.

The most prominent example is that of Nobel Prize-winning professor William Nordhaus, the father of cost-benefit (or benefit-cost) analysis for climate [1]. In his 2018 Nobel acceptance speech, Nordhaus [2] said:

[I]n the view of most economists, balancing of costs and benefits is the most satisfactory way to develop climate policy.

[O]ne of the most amazing results of Integrated Assessment Models (IAMs) is the ability to calculate the optimal carbon price…This concept represents the economic cost caused by an additional ton of carbon dioxide emissions (or more succinctly carbon) or its equivalent…In an optimized climate policy (abstracting away from various distortions), the social cost of carbon will equal the carbon price or the carbon tax.

Nordhaus argues that IAMs can estimate carbon prices that optimize global consumption, emissions, and climate change, balancing mitigation or abatement costs against benefits of reducing emissions (like risk reduction and avoided climate damages). Similar analyses, focused on damage costs, are used to assess appropriate social costs of carbon for regulatory purposes [3].

This way of framing the problem can be summarized in the following graph, which depicts benefit and cost curves in stylized fashion. It characterizes the place where the two curves cross as the “optimal” level of GHG reductions, where the marginal cost of reducing emissions is equal to the marginal benefits from reducing them. The point also suggests the optimal carbon price, as in the Nordhaus quotation above. In this view, reducing emissions beyond that point would imply that we are paying too much for emissions reductions because the costs for incremental emissions reductions would exceed the benefits.
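To make the stylized graph concrete, here is a minimal sketch with made-up linear marginal cost and benefit curves (the functional forms and coefficients are purely illustrative, not drawn from any actual IAM), solving for the crossing point that this framing labels “optimal”:

```python
# Stylized marginal cost/benefit curves for emissions reductions q
# (say, percent below a baseline). Coefficients are illustrative only.
def marginal_cost(q):
    # Rises as the cheap abatement options are exhausted.
    return 2.0 * q

def marginal_benefit(q):
    # Falls as the worst damages are already avoided.
    return 100.0 - 3.0 * q

# The "optimal" level of reductions in this framing is where the
# curves cross: 2q = 100 - 3q, so q* = 20.
q_star = 100.0 / 5.0
carbon_price = marginal_cost(q_star)  # the implied "optimal" carbon price

print(f"q* = {q_star}, implied carbon price = {carbon_price}")
```

Beyond q*, each extra unit of reduction costs more than it returns in avoided damages, which is exactly the logic the commentary questions: the curves themselves are neither unique, nor fixed, nor knowable in advance.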

This commentary focuses attention on underlying ideas about “optimal paths” that are, in our view, not widely enough understood and often left unstated, namely that

(1) there IS a single unique optimal path to solving the climate problem,

(2) this path exists independent of human choices, and

(3) society can discover this path in advance through better data collection, analysis, and logical thinking.

These beliefs are at odds with our current understanding of the forces driving the development of real economic and technological systems, which are dominated by increasing returns to scale, network externalities, learning curves, and other non-linear effects. Real non-linear systems are subject to “sensitive dependence on initial conditions”, which leads to chaotic and often unpredictable behavior of such systems in the face of imperfect measurements, randomness, and human choices [4, 5, 6, 7, 8]. Models of non-linear systems are also strongly affected by uncertainties in model structure, complicating things still further [9].
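The “sensitive dependence” point is easy to demonstrate with the logistic map, a standard chaos-theory example (a toy dynamical system, not an economic model): two starting points that differ by one part in ten billion end up in entirely different places after a few dozen iterations.

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
# A textbook chaos example, not a model of the economy.
def iterate(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = iterate(0.2)          # one initial condition
b = iterate(0.2 + 1e-10)  # a nearly indistinguishable one

# The tiny initial gap roughly doubles each step, so after 50 steps
# the two trajectories bear no resemblance to each other.
print(abs(a - b))
```

If a measurement error of 1e-10 makes a three-line deterministic system unpredictable, the prospects for pinning down a unique century-scale “optimal path” through a vastly more complicated economy look dim indeed.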

The full reference for the commentary is

Koomey, Jonathan, Zachary Schmidt, Karl Hausker, and Dan Lashof. 2023. “Abandon the idea of an ‘optimal economic path’ for climate policy.” Invited Commentary for WIREs Climate Change. vol. e850, July 2. [http://doi.org/10.1002/wcc.850]

To download a pre-publication version of the article, click here.

References

1. Nordhaus, William D. 1992. “An Optimal Transition Path for Controlling Greenhouse Gases.” Science. vol. 258, no. 5086. p. 1315. [http://science.sciencemag.org/content/258/5086/1315.abstract]

2. Nordhaus, William D. 2018. Nobel Prize Lecture, December 8, 2018. [https://www.nobelprize.org/prizes/economic-sciences/2018/nordhaus/lecture/]

3. US EPA. 2022. Report on the Social Cost of Greenhouse Gases: Estimates Incorporating Recent Scientific Advances. Washington, DC: U.S. Environmental Protection Agency. September. [https://www.epa.gov/environmental-economics/scghg]

4. Lorenz, Edward. 1995. The essence of chaos. Seattle, WA: The University of Washington Press. [https://uwapress.uw.edu/book/9780295975146/the-essence-of-chaos/]

5. Gleick, James. 1988. Chaos: Making a new science. New York, NY: Penguin Books. [https://amzn.to/3Jxc2yv]

6. DeCanio, Stephen J. 2013. Limits of Economic and Social Knowledge. New York, NY: Palgrave Macmillan. [https://stephendecanio.com/2017/06/30/limits-of-economic-and-social-knowledge/]

7. Pluchino, Alessandro, Alessio Emanuele Biondo, and Andrea Rapisarda. 2018. “Talent versus luck: The role of randomness in success and failure.” Advances in Complex Systems. vol. 21, no. 03n04. p. 1850014. [https://www.worldscientific.com/doi/abs/10.1142/S0219525918500145]

8. Dizikes, Peter. 2011. “When the butterfly effect took flight.” In MIT Technology Review. February 22. [https://www.technologyreview.com/2011/02/22/196987/when-the-butterfly-effect-took-flight]

9. Thompson, Erica. 2022. Escape from model land: How mathematical models can lead us astray and what we can do about it. New York, NY: Basic Books. [https://amzn.to/3HDxH5t]


Koomey researches, writes, and lectures about climate solutions, critical thinking skills, and the environmental effects of information technology.
