My talk at the Salinas, CA Rotary Club on Feb 9, 2023, focused on misconceptions about information technology energy and environmental impacts

My February 9, 2023 talk for the Salinas Rotary Club expands on points made in a 2021 commentary article in Joule by me and Professor Eric Masanet of UCSB:

Koomey, Jonathan, and Eric Masanet. 2021. “Does not compute: Avoiding pitfalls in assessing the Internet’s energy and carbon impacts.” Joule. vol. 5, no. 7. pp. 1625-1628. [https://www.sciencedirect.com/science/article/abs/pii/S2542435121002117]

In the talk, I presented nine different high-profile misconceptions about electricity use and emissions associated with computing, explored four pitfalls that lead to such misconceptions, and suggested four ways we can do better in the future.

Here is a graph illustrating that substantial increases in information technology services, in this case data flows, do not necessarily imply increases in energy use.

Here is the conclusions slide:

You can download a PDF of the slides (which include three pages of references) HERE.

Our commentary on the need for targeted programs affecting networked standby power for information technology

The peer-reviewed journal Energy Efficiency just published our commentary about networked standby power and the need for product-specific government-industry interactions when setting voluntary targets and standards. Networked standby is power used by devices to maintain a network connection, and some states and countries are considering regulating it using what are called “horizontal standards” for standby power that apply to many different kinds of products.

The purpose of this commentary is to explain why we think even “clustered horizontal” targets, like the ones currently being analyzed by the California Energy Commission (Pasha 2021), will be challenging to develop for devices incorporating information and communication technology. We don’t think it is impossible in all cases to create horizontal targets, but we are convinced, because of the fast-moving nature of these technologies and the increasing integration of IT with the primary functions of most devices, that horizontal targets of any type (even more precisely targeted ones) will face unique headwinds.

If you don’t have access via the DOI link below, click on the sharing link above (in the first paragraph) or email me. The supplemental information is a white paper that contains more technical analysis and details supporting our arguments in the commentary.

Abstract

Efficiency of electronic devices is an area of active interest for policy makers in the European Union and elsewhere. Efforts to create a uniform horizontal efficiency standard (one that applies to many different types of equipment) have worked in the past, but as standards become more stringent, the need for product-by-product differentiation for such standards becomes more pressing.

Devising sensible regulations requires making reasonable average power consumption estimates for groups of components that reflect how they would actually be used in real products, not just treating components in isolation. Deep interactions between regulators and manufacturers are often needed to create efficiency targets that improve efficiency without sacrificing innovation. There are models of such interactions that have proven to work well (like the processes for developing Energy Star voluntary programs, many minimum efficiency standards, and industry voluntary agreements) that represent the best path forward.

References

Koomey, Jonathan, Zachary Schmidt, Bruce Nordman, Kieren Mayers, and Joshua Aslan. 2023. “Successful efficiency programs for information and communication technologies require product-specific analysis and industry/government collaboration.” Energy Efficiency. vol. 16, no. 1. 2023/01/18. pp. 2. [https://doi.org/10.1007/s12053-023-10083-y]

Pasha, Soheila. 2021. Staff Presentation: Low Power Mode Roadmap. Sacramento, CA: California Energy Commission.  [https://www.energy.ca.gov/event/workshop/2021-08/staff-workshop-appliance-efficiency-roadmap-low-power-mode-data-collection]

Our latest book, out today: Solving Climate Change: A Guide for Learners and Leaders

Our latest book was just released online today by IOP Publishing (The Institute of Physics). It’s called Solving Climate Change: A Guide for Learners and Leaders.

The publisher’s page for the book is https://doi.org/10.1088/978-0-7503-4032-8.

This textbook grew out of a course my colleague Ian Monroe and I taught at Stanford in 2017 and 2018, titled “Implementing Climate Solutions at Scale”. Its intended audience is academics and practitioners teaching classes like that one, though we hope others will also find it useful.

We posted some key parts of the book online:

Table of contents

Preface

Foreword by Professor Kimberly Nicholas

This book goes beyond our original courses to provide a more comprehensive framework for solving climate change than we’ve found elsewhere. We include an overview of climate solution technologies, as well as analytical tools necessary to identify solutions that really work. We also explore what’s needed to align incentives, mobilize money, and elevate truth in climate conversations, key pillars of climate action that are often overlooked by techno-centric discussions of global emissions reductions.

The overarching framing of the book (“the eight pillars of solving climate change”) is summarized in this graphic:


Please do reach out to me and Ian with questions, ideas for outreach, and suggestions for the next edition. You can also sign up for our mailing list by going to http://www.solveclimate.org and paging down a bit on the first page. Finally, if your institution has a library, please put in a request for them to purchase the book. It’s priced on the high side ($120), as textbooks often are, so it may be out of reach for many individuals, but libraries and companies should be able to afford it.

Our latest article on scenario decomposition tools was published in September 2022

Our latest article on scenario decomposition tools came out in Environmental Modeling and Software in September 2022:

Koomey, Jonathan, Zachary Schmidt, Karl Hausker, and Dan Lashof. 2022. “Exploring the black box: Applying macro decomposition tools for scenario comparisons.” Environmental Modeling and Software. vol. 155, September. [https://doi.org/10.1016/j.envsoft.2022.105426]

This article is a follow-on to our 2019 article in the same journal:

Koomey, Jonathan, Zachary Schmidt, Holmes Hummel, and John Weyant. 2019. “Inside the Black Box: Understanding Key Drivers of Global Emission Scenarios.” Environmental Modeling and Software. vol. 111, no. 1. January. pp. 268-281. [https://www.sciencedirect.com/science/article/pii/S1364815218300793]

The 2022 article applies the tools developed in the 2019 article to two aggressive emissions reduction scenarios, illustrating the kinds of insights available from using these tools. We apply a Logarithmic Mean Divisia Index (LMDI) decomposition to analyze emissions reductions from the energy sector and additional tools to assess emissions reductions from other sectors.
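For readers who haven’t encountered LMDI before, here is a minimal sketch in Python of how an additive LMDI decomposition attributes a change in emissions to multiplicative (Kaya-style) drivers. The function names and the illustrative numbers are mine, not from the article:

```python
from math import log

def logmean(a, b):
    """Logarithmic mean: L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
    return a if a == b else (a - b) / (log(a) - log(b))

def lmdi_decompose(start, end):
    """Additive LMDI decomposition of a change in emissions.

    `start` and `end` map driver names to values whose product is total
    emissions (for example: population, GDP per person, energy per unit
    of GDP, and CO2 per unit of energy). Returns each driver's
    contribution; the contributions sum exactly to the total change.
    """
    total_start, total_end = 1.0, 1.0
    for name in start:
        total_start *= start[name]
        total_end *= end[name]
    weight = logmean(total_end, total_start)
    return {name: weight * log(end[name] / start[name]) for name in start}

# Made-up numbers purely for illustration:
base = {"population": 7.0, "gdp_per_person": 10.0,
        "energy_per_gdp": 5.0, "co2_per_energy": 0.06}
scenario = {"population": 8.5, "gdp_per_person": 15.0,
            "energy_per_gdp": 2.5, "co2_per_energy": 0.02}
effects = lmdi_decompose(base, scenario)
print(effects)                # per-driver contributions to the change
print(sum(effects.values()))  # equals total end minus total start emissions
```

That exact-additivity property is what makes LMDI so convenient for side-by-side scenario comparisons: every difference between two scenarios is attributed to a named driver, with no unexplained residual.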

These are the two articles containing the scenarios we compared:

Grübler, Arnulf, Charlie Wilson, Nuno Bento, Benigna Boza-Kiss, Volker Krey, David L. McCollum, Narasimha D. Rao, Keywan Riahi, Joeri Rogelj, Simon De Stercke, Jonathan Cullen, Stefan Frank, Oliver Fricko, Fei Guo, Matt Gidden, Petr Havlík, Daniel Huppmann, Gregor Kiesewetter, Peter Rafaj, Wolfgang Schoepp, and Hugo Valin. 2018. “A low energy demand scenario for meeting the 1.5 °C target and sustainable development goals without negative emission technologies.” Nature Energy. vol. 3, no. 6. 2018/06/01. pp. 515-527. [https://doi.org/10.1038/s41560-018-0172-6]

van Vuuren, Detlef P., Elke Stehfest, David E. H. J. Gernaat, Maarten van den Berg, David L. Bijl, Harmen Sytze de Boer, Vassilis Daioglou, Jonathan C. Doelman, Oreane Y. Edelenbosch, Mathijs Harmsen, Andries F. Hof, and Mariësse A. E. van Sluisveld. 2018. “Alternative pathways to the 1.5 °C target reduce the need for negative emission technologies.” Nature Climate Change. 2018/04/13. [https://doi.org/10.1038/s41558-018-0119-8]

Here’s one example of our dashboards, comparing results for two scenarios:

Email me for a copy if you don’t have access.

My visit to Bletchley Park and The UK Computing History Museum in November 2017

Regular readers know that I’ve studied the history of computing for a very long time. About four years ago (November 17, 2017) I had the good fortune to visit Bletchley Park and the UK’s National Museum of Computing, outside of London. They are right next to each other, so it was easy to visit both, and well worth the trip. I’ve been meaning to write up a brief account since the visit, and finally made the time.

Both of these museums highlight the role of mathematics and computing in the UK war effort of the late 1930s and 1940s, a story that was only made public in the 1990s. Code breaking featured prominently, as did Alan Turing. At Bletchley Park they’ve kept some of the offices just as they were, so it’s wonderful to be in that space and imagine what it was like to work there.

Here’s a picture of Alan Turing’s office as it looks now (and looked then):

Here’s a wonderful sculpture of Turing:

This funky 1990s-era website has a lot of juicy historical detail, so if you’re feeling adventurous, check it out.

I found the recreated Colossus computer to be the highlight of the trip to the National Museum of Computing. When British Telecom (BT) started decommissioning their vacuum tube equipment in the 1980s and 1990s, some clever folks realized they could use the original design schematics for Colossus to rebuild it using the tubes from BT. The original machine is long gone, but they made an exact replica, and it works!

It’s a special-purpose computer in the purest sense: its sole purpose was to break the German Lorenz cipher. There is no clock as we understand it now; the machine is driven by a paper tape that runs in a loop. Each character is composed of 5 bits, and the machine could process 5,000 characters per second. It has 2,500 tubes, some argon-filled, the rest vacuum tubes. Total power draw in operation is 8 kW.
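Those specs support some fun back-of-envelope arithmetic (my calculation, not the museum’s):

```python
chars_per_second = 5_000
bits_per_char = 5
power_watts = 8_000

print(chars_per_second * bits_per_char)  # 25,000 bits/s read from the tape
print(power_watts / chars_per_second)    # 1.6 joules per character processed
```

By modern standards that energy cost per operation is enormous, but as the next paragraph explains, a direct comparison to modern machines isn’t really meaningful.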

I met Phil Hayes, the Chief Colossus Engineer, and asked him if there was any way to convert the 5,000 characters per second into something comparable to “instructions per second” or another more modern unit of performance. Phil was pretty sure that wasn’t possible, due to the specialized nature of the tasks performed by this computer.

Here’s a photo of me with Phil in front of Colossus:

Click on the link below to download a video of Colossus in operation (the sounds are great!). It’s a big file (46 MB) but worth the download:

Download video of Colossus in operation

If you are interested in the history of computing and are in or around London, by all means take the trip to Milton Keynes and check out these two world-class museums.

Our new article published today in Joule: “Does not compute: Avoiding pitfalls in assessing the Internet's energy and carbon impacts”

Professor Eric Masanet of UC Santa Barbara and I have a new commentary article out today in the refereed journal Joule. It explores four common pitfalls that cause researchers and commentators to exaggerate information technology electricity use and emissions, and suggests four ways industry and researchers can avoid spreading such misconceptions in the future.

It’s a short article, so I won’t spoil it by giving too much away, but the figure above summarizes one key lesson from our review: Growth in data traffic in either the short term or the long term does not necessarily imply growth in energy use. It depends on how fast efficiency improves!
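The arithmetic behind that lesson is easy to sketch (my toy numbers, not figures from the article): if energy per unit of data traffic falls as fast as traffic grows, total electricity use stays flat.

```python
# Toy example: traffic doubles every two years while energy intensity
# (kWh per unit of traffic) halves every two years, so total kWh is flat.
traffic = 100.0      # arbitrary units
kwh_per_unit = 1.0
for year in (2010, 2012, 2014, 2016, 2018):
    print(year, traffic * kwh_per_unit)  # always 100.0
    traffic *= 2.0
    kwh_per_unit /= 2.0
```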

I summarized the conclusions in my June 1, 2021 keynote for the iTherm conference:

Here’s the reference:

Koomey, Jonathan, and Eric Masanet. 2021. “Does not compute: Avoiding pitfalls in assessing the Internet’s energy and carbon impacts.” Joule. June 24. [https://doi.org/10.1016/j.joule.2021.05.007]

I gave a virtual keynote today for the iTherm 2021 technical conference, focusing on misconceptions about electricity use and emissions associated with computing

My keynote talk today for the iTherm 2021 technical conference is an expansion of points made in a commentary article by me and Professor Eric Masanet, UCSB, which is “in press” at Joule right now (more when that’s published). I presented nine different high-profile misconceptions about electricity use and emissions associated with computing, explored four pitfalls that lead to such misconceptions, and suggested four ways we can do better in the future.

Here is the conclusions slide:

You can download a PDF of the slides (which include three pages of references) HERE.

I gave a virtual talk today for the Organization for Security and Co-operation in Europe and the World Energy Council on the role of ICT in the energy transition

My talk was titled “Information and communications technology (ICT) and the energy/climate transition,” and I presented it today (November 24, 2020) at the 3rd Vienna Energy Strategy Dialogue on the “Implications of the Global Energy Transition,” in Vienna, Austria.

The key points:

• Direct electricity used by ICT is modest and hasn’t grown much, if at all, in recent years.

• Nobody can credibly project ICT electricity use more than a few years ahead, and exaggerations of ICT electricity use abound in the literature.

• ICT is a powerful source of emissions reductions throughout the economy, which is why I call ICT our “ace in the hole” when it comes to facing the climate challenge.

To download a PDF version of the talk, click here.

An update on trends in US primary energy, electricity, and inflation-adjusted GDP through 2019

Back in 2015, Professor Richard Hirsh (Virginia Tech) and I published the following article in The Electricity Journal, documenting trends in US primary energy, electricity, and real (inflation-adjusted) Gross Domestic Product (GDP) through 2014:

Hirsh, Richard F., and Jonathan G. Koomey. 2015. “Electricity Consumption and Economic Growth: A New Relationship with Significant Consequences?” The Electricity Journal. vol. 28, no. 9. November. pp. 72-84. [http://www.sciencedirect.com/science/article/pii/S1040619015002067]

Every year since, my colleague Zach Schmidt and I have updated the trend numbers for the US using the latest energy and electricity data from the US Energy Information Administration (EIA). This short blog post gives the three key graphs from that study updated to 2019, and makes a few observations.

Figure 1 shows GDP, primary energy, and electricity consumption through 2019, expressed as an index with 1973 values equaling 1.0. From 2017 to 2018, GDP grew a little more slowly and primary energy and electricity grew a little more rapidly than in recent years, but primary energy and electricity consumption both dropped in 2019 relative to the year before. GDP continued to show modest growth consistent with recent historical rates (all bets are off for 2020, though, given the likely effects of COVID-19).
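For anyone who wants to reproduce this kind of figure, the indexing step is simple. Here’s a minimal pandas sketch with placeholder values (a real analysis would load the full annual series from EIA and the Bureau of Economic Analysis):

```python
import pandas as pd

# Placeholder values for the two endpoint years only:
df = pd.DataFrame(
    {"gdp": [5.9, 19.1], "primary_energy": [75.7, 100.2],
     "electricity": [1.7, 3.9]},
    index=[1973, 2019],
)
indexed = df / df.loc[1973]  # rescale every series so 1973 equals 1.0
print(indexed)
```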

The overall picture really hasn’t changed that much: electricity consumption has been roughly flat for about a decade, and primary energy consumption for about two decades.

Figure 2 shows the ratio of primary energy and electricity consumption to GDP, normalized to 1973 = 1.0. The trends there are pretty clear as well. Primary energy use per unit of GDP has been declining since the early 1970s, while the ratio of electricity use to GDP has been declining since the mid-1990s. Before the 1970s, electricity intensity of economic activity was increasing, and from the early 1970s to the mid-1990s, it was roughly constant.

Figure 3 (which was Figure 4 in the Hirsh and Koomey article) shows the annual change in electricity consumption going back to 1950. Growth in total US electricity consumption has just about stopped in the past decade, but there’s significant year-to-year variation. The decline in 2019 electricity use almost offset the growth from 2017 to 2018 (and this decline predates the effects of COVID-19 on economic activity).

Flat or declining consumption poses big challenges to utilities, whose business models depend on continued growth to increase profits (unless they are in states like California, where regulators have decoupled electricity use from profits). If the US embarks on a sustained effort to #electrifyeverything, these trends could reverse, but that will take time; in the meantime, the long-running efforts on efficiency standards and labeling continue to have substantial effects on electricity consumption in developed nations.

Email me at jon@koomey.com if you’d like a copy of the 2015 article or the latest spreadsheet with graphs. If you want to use these graphs, you are free to do so as long as you don’t change the data and you credit the work as follows:

This graph is an updated version of one that appeared in Hirsh and Koomey (2015), using data from the US Energy Information Administration and the US Bureau of Economic Analysis.

Hirsh, Richard F., and Jonathan G. Koomey. 2015. “Electricity Consumption and Economic Growth: A New Relationship with Significant Consequences?” The Electricity Journal. vol. 28, no. 9. November. pp. 72-84. [http://www.sciencedirect.com/science/article/pii/S1040619015002067]

A fun science project: A simple cloud chamber!

Safety warning: This project involves dry ice, which can really damage your skin if you make direct contact with it. If you attempt this activity, use appropriate safety precautions (like oven mitts and tongs to move the dry ice).

When I was a kid I always wanted to make a cloud chamber, which makes vapor trails of atomic particles visible to the naked eye. I first learned about it from reading a book by C. L. Stong titled “The Amateur Scientist”, which was a compilation of Stong’s columns in Scientific American. It’s an amazing book, and if you love tinkering as much as I do, it’s a terrific source of inspiration. It was published in 1960 (yes, I’m old) and I still have my copy (yes, I’m a bit of a packrat).

You can still order a used copy on Amazon for almost $60, but for the DIY science geek it’s well worth it (even today). Some of the chapters include “A homemade atom smasher”, “The Millikan oil-drop experiment”, “A simple magnetic resonance spectrometer”, “Homemade electrostatic generators”, “A low-speed wind tunnel”, “An electronic seismograph”, “A transistorized drive for telescopes”, and lots of other fun projects in many fields of science.

The chapter on cloud chambers is very thorough, explaining many different designs and even showing how you can use magnetic fields to detect curvature in the particle tracks and determine exactly which types of charged particles they might be.

Back in those days I didn’t have access to dry ice so never did the experiment, but now it’s available in every supermarket. When one of our boys needed a science project, I suggested this one, and he jumped at it.

Nowadays there are many resources available online, and one of the best is the one by Science Friday, but I want to describe some things we learned from doing it using that book from 1960 in case you want to try this yourself.

The basic idea is to take a glass jar with a metal screw top, stuff a sponge in the bottom of the jar, pour some 90+% rubbing alcohol on the sponge, screw on the lid, invert it, place it on some dry ice, shine a flashlight from the side, and see what happens. When it works, you first see what looks like a tiny drizzle of alcohol droplets, then every so often (a few times a minute for us) you see a trail of condensed droplets that appears and then falls at the same rate as the alcohol “rain”. Those are atomic particles making their way through the alcohol clouds (see the Science Friday link above for examples of how these look).

It’s not as simple as I made it sound in the previous paragraph. The inside of the jar lid needs to be black, for contrast. The light needs to be just so. Your container needs to be clear enough for visibility.

It’s important to choose the right container. Our first attempt used a pickle jar (the one we happened to have) that didn’t have super clear glass (it was a bit wavy). Once we got a better jar it worked great, so check the visibility through the glass before choosing a jar. We also tried this with a short jar (about 3″ tall), and that didn’t work as well because the glass frosted over from the cold too quickly. Get a taller one (more like 6-8″ high).

Some websites advocate using permanent marker on the inside of the jar lid to make it black, but we found that the alcohol removed the marker so this didn’t work so well. Based on advice from the Stong book, we ended up buying some velvet (about half a yard) from the fabric store and cutting a piece that was about 1.5 feet square. We placed this over the open jar and then screwed the top on (velvet side was inside the jar).

When you flip that over, it looks like this.

The nice thing about this setup is that you can cover the block of dry ice (ours was about 10″ square and 1.5″ high) with the velvet, and heat flows from the metal top through the velvet into the dry ice. Stong recommended adding a little alcohol to the velvet as well (in addition to charging the sponge with it), and that seemed to work for us. The cloth also covers the dry ice and prevents dry ice “steam” from interfering with viewing. It also prevents direct contact with the dry ice, as a safety measure.

We then needed to create a light, and we improvised using a headlamp and a can of beans.

We put this to the side of the jar, with the dry ice underneath.

Here’s how it looked inside after we put the jar with velvet and the lighting source inside an Amazon pantry box, with the whole thing on a cookie sheet for ease of carrying. We also put a doubled up towel underneath the dry ice to insulate it.

Here’s how it looked inside the box with the light on.

You’ll need to play with the lighting a bit. We used the rest of the velvet to make curtains so you can put your head inside the box for best viewing.

Soon after the lid cools down you can see tiny droplets falling, like alcohol rain. You have to watch intently for a while before you see this, but once you recognize this effect, you know it’s working. Every 15-30 seconds you’ll see a trail, which is a line of droplets that condensed around a particle of some kind. These lines fall at the same rate as the alcohol rain, so they disappear quickly. We’ve seen a handful of really visible ones, but it’s not like a giant rainstorm of particles, just an occasional one.

Timing is important for this. Within 5-10 minutes after you place the jar lid on the dry ice, it should be cold enough for the alcohol rain to start. After about 45-50 minutes the jar starts freezing up, so it’s best to get your viewing in soon after you’ve identified the alcohol rain.

The Stong book mentions finding the little bit of radioactive material that exists in some old smoke alarms, which can in some cases produce many more tracks if you place it near the chamber, but we didn’t have an old smoke alarm and so couldn’t try it.

Because this was for a science fair where other kids got to see the project, my son made a safety sign:

Kids will need to be careful not to touch the dry ice or the velvet. That’s the only big hazard here. Adults (or high school age kids) should also be the ones to pour the alcohol onto the sponge and velvet.

My son Nicholas made a movie (big file, about 57 MB, MP4 format) about our efforts. It starts with a discussion of making the cloud chamber from a metal coffee can, an effort we abandoned because we ran out of time, but then it moves to the design on which we finally settled (we had two designs going at once, just in case). It might help you when making your own. Please forgive the “home video” nature of it, and our messy garage. It even shows the alcohol “rain” (but we didn’t capture any particle trails on the video).

If you give this project a try, please email me to let me know how it worked out!

An old (2012) story with lessons that are still important today

I had at some point bookmarked this 2012 article containing a story from BP about reducing greenhouse gas (GHG) emissions and saving money. I’m posting it here now because the lesson it teaches is still important and relevant. BP thought its efforts to reduce GHG emissions would cost money, but instead those efforts generated a positive return.

Here’s the key paragraph:

“How could there be that much value available that was only uncovered after the initiative to cut greenhouse gases, in effect to use energy more effectively, and reduce emissions of gases such as methane and halons?  Simply put, almost everyone was busy with other things, and not looking for these savings.  And perhaps more to the point, people had accepted a certain way of doing things that was not optimal, but was the way they had been done for a very long time.  When you reset the context for the operation, which is what the greenhouse gas target setting did, smart operators find a more attractive solution.”

I wrote about this general lesson in Cold Cash, Cool Climate: Science-based Advice for Ecological Entrepreneurs back in 2012, talking about the power of the general approach of “working forward toward a goal”. In BP’s case, the goal was a modest 10% reduction in GHG emissions, and setting that goal helped the institution see possibilities it hadn’t noticed before. This approach “frees you from the constraints embodied in your underlying assumptions and worldview” and prompts you to assess ideas that wouldn’t normally surface in day-to-day operations.

Another insight is that the opportunities that arise from this approach are a renewable resource:

When I asked my friend Tim Desmond at Dupont whether his Six Sigma team (which is responsible for ferreting out new cost-saving opportunities across some of Dupont’s divisions) would ever run out of opportunities, he said “No way!” Changes in technology, prices, and institutional arrangements create opportunities for cost, energy, and emissions savings that just keep on coming.

Just because companies operate in a certain way doesn’t make it “optimal” for the current situation. There are always ways to improve operations, cut costs, and reduce emissions. We just need to look.

Finally, it’s important to set such goals in the context of whole systems integrated design, in which we start from scratch to re-evaluate tried and true ways of performing tasks. Rocky Mountain Institute has for years championed the power of “Factor Ten Engineering”, which allows us to create new ways of accomplishing the same tasks with substantial improvements in efficiency and emissions.

For more on how to combine “working forward toward a goal” with “whole systems integrated design,” see Chapter 6 of Cold Cash, Cool Climate: Science-based Advice for Ecological Entrepreneurs. Email me if you’d like a PDF copy of that chapter.

Our article on changes in data center electricity use from 2010 to 2018, out in Science Magazine today

Our article on global data center electricity use is out today (February 28, 2020) in Science Magazine as a Policy Forum article.

The intro of the article gives context:

Key findings:

• Total global data center electricity use increased by only 6% from 2010 to 2018, even as the number of data center compute instances (i.e. virtual machines running on physical hardware) rose to 6.5 times its 2010 level by 2018 (compute instances are a measure of computing output as defined by Cisco).

• Data center electricity use rose from 194 TWh in 2010 to 205 TWh in 2018, representing about 1% of the world’s electricity use in 2018.

• Computing service demand rose rapidly from 2010 to 2018: installed storage capacity rose 26-fold, data center IP traffic rose 11-fold, workloads and compute instances rose 6-fold, and the installed base of physical servers rose 30%.

• Computing efficiency rapidly increased, mostly offsetting growth in computing service demand: PUE dropped by 25% from 2010 to 2018, server energy intensity dropped by a factor of 4, the average number of servers per workload dropped by a factor of 5, and average storage drive energy use per TB dropped by almost a factor of 10.

• Expressed as energy use per compute instance, the energy intensity of the global data center industry dropped by around 20% per year between 2010 and 2018 (see the quick arithmetic check after this list). This efficiency improvement rate is much greater than rates observed in other key sectors of the global economy over the same period.

• We also showed that current efficiency potentials are enough to keep electricity demand roughly constant for the next doubling of computing service demand after 2018, if policy makers and industry keep pushing efficiency in their facilities, hardware, and software.

• We offered three primary areas for policy action: (1) extend current efficiency trends by stressing efficiency standards, best practice dissemination, and financial incentives; (2) increase RD&D investments in next generation computing, storage, and heat removal technologies to deliver efficiency gains when current trends approach their limits, while incentivizing renewable power in parallel; and (3) invest in robust data collection, modeling, and monitoring.
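Here’s the quick arithmetic check promised above, combining numbers quoted in the bullets (my calculation; the article does this more carefully):

```python
# Energy per compute instance, 2018 relative to 2010:
energy_2010_twh, energy_2018_twh = 194.0, 205.0
instance_growth = 6.5  # compute instances in 2018 vs. 2010

intensity_ratio = (energy_2018_twh / energy_2010_twh) / instance_growth
annual_change = intensity_ratio ** (1 / 8) - 1  # eight years, 2010 to 2018
print(f"{annual_change:.1%}")  # about -20% per year, as stated above
```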

Articles summarizing the work appeared yesterday in The New York Times, Bloomberg, USA Today, Data Center Dynamics, Wired, Quartz, IFL Science, New Scientist, and One Zero, among other outlets. Google also did a blog post describing their progress in improving data center efficiency over time.

The Northwestern University news release is here.

The UCSB news release is here.

The Lawrence Berkeley National Laboratory release is here.

The spreadsheet model used for the analysis can be downloaded here: https://zenodo.org/record/3668743#.XmF-Gi2ZPWZ

The full reference is

Masanet, Eric, Arman Shehabi, Nuoa Lei, Sarah Smith, and Jonathan Koomey. 2020. “Recalibrating global data center energy-use estimates.” Science. vol. 367, no. 6481. pp. 984. [http://science.sciencemag.org/content/367/6481/984.abstract]

Our analysis of supercomputer efficiency over time

Sam Naffziger of AMD and I just published our report on the efficiency of supercomputers over time.

Here’s the abstract:

The energy efficiency of computing devices is a topic of ongoing research and public interest. While trends in the efficiency of laptops and desktops have been well studied, there has been surprisingly little attention paid to trends in the efficiency of high-performance computing installations (known colloquially as “supercomputers”). This article analyzes data from the industry site Top 500 (http://www.top500.org) to assess how the efficiency of supercomputers has changed over the past decade. It also examines how the efficiency and performance of a recently announced supercomputer, scheduled to be completed in 2021, compare to a simple extrapolation of those historical trends. The maximum performance of the most powerful supercomputers doubled every 2.3 years in the past decade (representing a slowdown from doubling every year from 2002 to 2009), while the efficiency of those computers doubled every 2.1 years from 2009 to 2019.

The Top 500 data have some issues, but this effort is a reasonable attempt to glean some meaning from them. We focused on analyzing each supercomputer based on the year it started operation, so we could track meaningful technology trends. The Top 500 tracks the same supercomputers over time as they move down the list, so we eliminated all but the first instance of any particular installation’s listing in the Top 500.

We split the analysis of supercomputer performance into two periods, 2002 to 2009 and 2009 to 2019. The first period shows rapid growth (doubling every year or so), while the second period shows a much slower doubling time of about 2.3 years, as well as much greater variance in the data.

The efficiency data only start to become reliable around 2009, so that’s where we started the data analysis. Efficiency of supercomputers in the Top 500 data doubled every 2.1 years for the top performing machine, for the top 10% of machines by performance, and for the complete set of machines reported in the Top 500, which is pretty remarkable regularity. One caveat is that the R-squared of the linear regression drops substantially as we regress on the bigger data sets.
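For readers curious about the mechanics, a doubling time like “2.1 years” comes from fitting a straight line to the base-2 logarithm of the data against time; the inverse of the slope is years per doubling. Here is a minimal sketch with synthetic data (my code, not the code used for the report):

```python
import numpy as np

def doubling_time(years, values):
    """Regress log2(values) on years; invert the slope to get years per doubling."""
    slope, _intercept = np.polyfit(years, np.log2(values), 1)
    return 1.0 / slope

# Synthetic efficiency data constructed to double every 2.1 years:
years = np.array([2009, 2011, 2013, 2015, 2017, 2019])
efficiency = 500.0 * 2.0 ** ((years - 2009) / 2.1)
print(doubling_time(years, efficiency))  # ~2.1
```

With noisy real-world data, the R-squared of that fit tells you how much confidence to place in the estimated doubling time, which is exactly the caveat above.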

We then focused on the trend for the top performing machine so we could extrapolate that trend and compare it to an upcoming supercomputer (Frontier) built using Cray and AMD technology. The performance trend data show that Frontier is significantly above the trend line when it’s expected to start operation in 2021.

The story is the same for efficiency, although Frontier’s height above the trend line is less dramatic than for performance.

The details of how Frontier is expected to achieve these results are not yet public, but the article discusses some of the most promising areas for efficiency improvements and highlights the need for future work, especially in the area of co-design of hardware and software.

Koomey, Jonathan, Zachary Schmidt, and Samuel Naffziger. 2019. Supercomputing Performance And Efficiency: An Exploration Of Recent History And Near-Term Projections. Burlingame, CA: Koomey Analytics.  [https://www.amd.com/en/system/files?file=documents/Supercomputing-Performance-Efficiency.pdf]

Go ahead and watch a movie on Netflix!

In October 2019, many news outlets (including Phys.org) reported that watching half an hour of Netflix would emit the same amount of carbon dioxide (1.6 kg) as driving four miles. This appears to be yet another amazing “factoid” about information technology’s environmental footprint that has little relationship to reality.

I dug into the calculations, at the prompting of the BBC, and figured out the real story. Half an hour of Netflix emits less than 20 grams of carbon dioxide, probably much less. The BBC interviewed me last week and did a nice story about it.
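To see how a number under 20 grams can arise, here is a rough sanity check using round numbers of my own choosing (assumptions for illustration, not figures from the BBC story or my detailed calculations):

```python
# Assumed round numbers, for illustration only:
kwh_per_hour_of_streaming = 0.08  # device + network + data center, combined
kg_co2_per_kwh = 0.5              # rough world-average grid intensity

half_hour_kg = 0.5 * kwh_per_hour_of_streaming * kg_co2_per_kwh
print(f"{half_hour_kg * 1000:.0f} g CO2")  # ~20 g, vs. the claimed 1,600 g
```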

Listen to the episode here (it is the first story in the 28-minute show).

Download a podcast version here (grab the one from Friday January 24, 2020, titled “Netflix and Chill”).

I discovered a hidden gem in Palo Alto today

The Museum of American Heritage is a fantastic independent museum in Palo Alto, California. We needed a short activity to pass the time (we were in Palo Alto for coding lessons for one of our boys) and I discovered this place online. What a find it is!

The museum is a collection of artifacts from the 1800s and early 1900s, mostly gadgets of various sorts. It has a “general store” that uses the artifacts from one of the founders (whose parents owned a general store in the area until 1965). Some familiar brands are there if you look closely.

Our boys had a go at dialing my phone number on an old rotary phone (they needed a hint).

They had a real ice box! The big block of ice went in the upper left hand compartment and a bucket to catch melting water was in the lower right. Food went into the right hand compartment. Note the thickness of the doors. Well insulated!

They also had an early 1900s fridge. Apparently it used Freon and needed that big condenser on top. The compartment wasn’t very big, maybe 1.5 feet by 3 feet by 1 foot deep, if that. Also note the tiny freezer.

For the kid set, the best features were the erector sets (not pictured, but a source of endless fun) and the working old-style pinball machine.

We also saw a cool bacon cooker! The fat drips off the rounded metal into the platter below. Someone should make a modern version of this gadget.

Finally, we showed up on the same day as the Palo Alto “Repair Cafe,” in which experts with tools help people fix up the old appliances they bring in. It was quite a scene. It happens quarterly.

This little museum vastly exceeded our expectations. If you are in the area, by all means give it a go.

Here are the details:

Museum of American Heritage

http://www.moah.org

351 Homer Avenue

Palo Alto, California 94301

+1-650-321-1004

Open from 11am to 4pm on Fridays, Saturdays, and Sundays.
Free general admission (donations gratefully accepted).

For more details on the Repair Cafe, click here.
