1. My visit to Bletchley Park and The UK Computing History Museum in November 2017

    Regular readers know that I’ve studied the history of computing for a very long time. About four years ago (on November 17, 2017) I had the good fortune to visit Bletchley Park and the UK’s National Museum of Computing, outside of London. The two museums share the same site, so it was easy to visit both, and the trip was well worth it. I’ve been meaning to write up a brief account since the visit, and finally made the time.

    Both of these museums highlight the role of mathematics and computing in the UK war effort of the late 1930s and ’40s, which was only made public in the 1990s. Code breaking featured prominently, as did Alan Turing. At Bletchley Park they’ve kept some of the offices just as they were, so it’s wonderful to be in that space and imagine what it was like to work there.

    Here’s a picture of Alan Turing’s office as it looks now (and looked then):

    image

    Here’s a wonderful sculpture of Turing:

    image

    This funky 1990s-era website has a lot of juicy historical detail, so if you’re feeling adventurous, check it out.

    I found the recreated Colossus computer to be the highlight of the trip to the National Museum of Computing. When British Telecom (BT) started decommissioning its vacuum tube equipment in the 1980s and 1990s, some clever folks realized they could use the original design schematics for Colossus to rebuild it using the tubes from BT. The original machine is long gone, but they made an exact replica, and it works!

    It’s a special-purpose computer in the purest sense. Its sole purpose was to break the German Lorenz cipher. There is no clock as we understand it now; the machine is driven by a paper tape that runs in a loop. Each character is composed of 5 bits, and the machine could process 5,000 characters per second. It has 2,500 tubes, some argon filled, the rest vacuum tubes. Total power draw in operation is 8 kW.
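    For a rough sense of scale, the stated tape speed translates directly into a raw data rate. A quick back-of-the-envelope calculation (my own arithmetic, not a figure from the museum):

    ```python
    # Back-of-the-envelope data rate for Colossus's paper-tape input.
    chars_per_second = 5_000   # stated tape reading speed
    bits_per_char = 5          # 5-bit teleprinter code, as described above

    bits_per_second = chars_per_second * bits_per_char
    print(f"Raw tape data rate: {bits_per_second} bit/s "
          f"(~{bits_per_second / 1000:.0f} kbit/s)")  # 25,000 bit/s, ~25 kbit/s
    ```

    That’s about 25 kilobits per second of raw input, sustained mechanically by a paper tape loop.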

    I met Phil Hayes, the Chief Colossus Engineer, and asked him if there was any way to convert the 5,000 characters per second into something comparable to “instructions per second” or another more modern unit of performance. Phil was pretty sure that wasn’t possible, due to the specialized nature of the tasks performed by this computer.

    Here’s a photo of me with Phil in front of Colossus:

    image

    Click on the link below to download a video of Colossus in operation (the sounds are great!). It’s a big file (46 MB) but worth the download:

    Download video of Colossus in operation

    If you are interested in the history of computing and are in and around London, by all means take the trip to Milton Keynes and check out these two world class museums.

  2. Our new article published today in Joule: “Does not compute: Avoiding pitfalls in assessing the Internet’s energy and carbon impacts”

    image

    Professor Eric Masanet of UC Santa Barbara and I have a new commentary article out today in the refereed journal Joule. It explores four common pitfalls that cause researchers and commentators to exaggerate information technology electricity use and emissions, and suggests four ways industry and researchers can avoid spreading such misconceptions in the future.

    It’s a short article, so I won’t spoil it by giving too much away, but the figure above summarizes one key lesson from our review: Growth in data traffic in either the short term or the long term does not necessarily imply growth in energy use. It depends on how fast efficiency improves!

    I summarize the conclusions in my June 1, 2021 keynote for the iTherm conference:

    image

    Here’s the reference:

    Koomey, Jonathan, and Eric Masanet. 2021. “Does not compute: Avoiding pitfalls in assessing the Internet’s energy and carbon impacts.” Joule. June 24. [https://doi.org/10.1016/j.joule.2021.05.007]

  3. I gave a virtual keynote today for the iTherm 2021 technical conference, focusing on misconceptions about electricity use and emissions associated with computing

    image

    My keynote talk today for the iTherm 2021 technical conference is an expansion of points made in a commentary article by me and Professor Eric Masanet, UCSB, which is “in press” at Joule right now (more when that’s published). I presented nine different high-profile misconceptions about electricity use and emissions associated with computing, explored four pitfalls that lead to such misconceptions, and suggested four ways we can do better in the future.

    Here is the conclusions slide:

    image

    You can download a PDF of the slides (which include three pages of references) HERE.

  4. I gave a virtual talk today for the Organization for Security and Co-operation in Europe and the World Energy Council on the role of ICT in the energy transition

    image

    My talk was titled “Information and communications technology (ICT) and the energy/climate transition,” and I presented it today (November 24, 2020) at the 3rd Vienna Energy Strategy Dialogue on the Implications of the Global Energy Transition, in Vienna, Austria.

    The key points:

    • Direct electricity used by ICT is modest and hasn’t grown much if at all in recent years.

    • Nobody can credibly project ICT electricity use more than a few years ahead, and exaggerations of ICT electricity use abound in the literature.

    • ICT is a powerful source of emissions reductions throughout the economy, which is why I call ICT our “ace in the hole” when it comes to facing the climate challenge.

    To download a PDF version of the talk, click here.

  5. An Update On Trends In US Primary Energy, Electricity, And Inflation-Adjusted GDP Through 2019

    Back in 2015, Professor Richard Hirsh (Virginia Tech) and I published the following article in The Electricity Journal, documenting trends in US primary energy, electricity, and real (inflation-adjusted) Gross Domestic Product (GDP) through 2014:

    Hirsh, Richard F., and Jonathan G. Koomey. 2015. “Electricity Consumption and Economic Growth: A New Relationship with Significant Consequences?” The Electricity Journal. vol. 28, no. 9. November. pp. 72-84. [http://www.sciencedirect.com/science/article/pii/S1040619015002067]

    Every year since, my colleague Zach Schmidt and I have updated the trend numbers for the US using the latest energy and electricity data from the US Energy Information Administration (EIA). This short blog post gives the three key graphs from that study updated to 2019, and makes a few observations.

    Figure 1 shows GDP, primary energy, and electricity consumption through 2019, expressed as an index with 1973 values equaling 1.0. From 2017 to 2018, GDP grew a little more slowly and primary energy and electricity grew a little more rapidly than in recent years, but primary energy and electricity consumption both dropped in 2019 relative to the year before. GDP continued to show modest growth consistent with recent historical rates (all bets are off for 2020, though, given the likely effects of COVID-19).

    image

    The overall picture really hasn’t changed that much. Electricity consumption has been flat for about a decade, and primary energy consumption for about two decades.

    Figure 2 shows the ratio of primary energy and electricity consumption to GDP, normalized to 1973 = 1.0. The trends there are pretty clear as well. Primary energy use per unit of GDP has been declining since the early 1970s, while the ratio of electricity use to GDP has been declining since the mid-1990s. Before the 1970s, electricity intensity of economic activity was increasing, and from the early 1970s to the mid-1990s, it was roughly constant.
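    The 1973 = 1.0 indexing used in these figures is a simple normalization. A minimal sketch of the calculation, using made-up illustrative numbers (the real series come from EIA and BEA data):

    ```python
    # Normalize a time series to an index with a chosen base year = 1.0.
    def to_index(series: dict, base_year: int) -> dict:
        base = series[base_year]
        return {year: value / base for year, value in series.items()}

    # Hypothetical electricity consumption values (TWh) -- illustrative only,
    # not the actual EIA data behind the figures.
    electricity = {1973: 1700.0, 1995: 3000.0, 2019: 3900.0}
    indexed = to_index(electricity, base_year=1973)
    print(indexed)  # 1973 maps to 1.0; later years show growth relative to 1973
    ```

    The same transformation applied to GDP and primary energy lets all three series share one axis, which is what makes the divergence in growth rates easy to see.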

    image

    Figure 3 (which was Figure 4 in the Hirsh and Koomey article) shows the annual change in electricity consumption going back to 1950. Growth in total US electricity consumption has just about stopped in the past decade, but there’s significant year-to-year variation. The decline in 2019 electricity use almost offset the growth from 2017 to 2018 (and this decline predates the effects of COVID-19 on economic activity).

    image

    Flat or declining consumption poses big challenges to utilities, whose business models depend on continued growth to increase profits (unless they are in states like California, where the regulators have decoupled electricity use from profits). If the US embarks on a sustained effort to #electrifyeverything, then these trends can be reversed, but that will take time, and in the meantime, the long-running efforts on efficiency standards and labeling continue to have substantial effects on electricity consumption in developed nations.

    Email me at jon@koomey.com if you’d like a copy of the 2015 article or the latest spreadsheet with graphs. If you want to use these graphs, you are free to do so as long as you don’t change the data and you credit the work as follows:

    This graph is an updated version of one that appeared in Hirsh and Koomey (2015), using data from the US Energy Information Administration and the US Bureau of Economic Analysis.

    Hirsh, Richard F., and Jonathan G. Koomey. 2015. “Electricity Consumption and Economic Growth: A New Relationship with Significant Consequences?“ The Electricity Journal. vol. 28, no. 9. November. pp. 72-84. [http://www.sciencedirect.com/science/article/pii/S1040619015002067]

  6. A fun science project: A simple cloud chamber!

    Safety warning: This project involves dry ice, which can really damage your skin if you make direct contact with it. If you attempt this activity, use appropriate safety precautions (like oven mitts and tongs to move the dry ice).

    When I was a kid I always wanted to make a cloud chamber, which makes vapor trails of atomic particles visible to the naked eye. I first learned about it from reading a book by C. L. Stong titled “The Amateur Scientist”, which was a compilation of Stong’s columns in Scientific American. It’s an amazing book, and if you love tinkering as much as I do, it’s a terrific source of inspiration. It was published in 1960 (yes, I’m old) and I still have my copy (yes, I’m a bit of a packrat). 

    image

    You can still order a used copy on Amazon for almost $60, but for the DIY science geek it’s well worth it (even today). The chapters include “A homemade atom smasher”, “The Millikan oil-drop experiment”, “A simple magnetic resonance spectrometer”, “Homemade electrostatic generators”, “A low-speed wind tunnel”, “An electronic seismograph”, “A transistorized drive for telescopes”, and many other fun projects across many fields of science.

    The chapter on cloud chambers is very thorough, explaining many different designs and even showing how you can use magnetic fields to detect curvature in the particle tracks and determine exactly which types of charged particles they might be.

    Back in those days I didn’t have access to dry ice so never did the experiment, but now it’s available in every supermarket. When one of our boys needed a science project, I suggested this one, and he jumped at it.

    Nowadays there are many resources available online, and one of the best is the one by Science Friday, but I want to describe some things we learned from doing it using that book from 1960 in case you want to try this yourself.

    The basic idea is to take a glass jar with a metal screw top, stuff a sponge in the bottom of the jar, pour some 90+% rubbing alcohol on the sponge, screw on the lid, invert it, place it on some dry ice, shine a flashlight from the side, and see what happens. When it works, you first see what looks like a tiny drizzle of alcohol droplets, then every so often (a few times a minute for us) you see a trail of condensed droplets that appears and then falls at the same rate as the alcohol “rain”. Those are atomic particles making their way through the alcohol clouds (see the Science Friday link above for examples of how these look).

    It’s not as simple as I made it sound in the previous paragraph. The inside of the jar lid needs to be black, for contrast. The light needs to be just so. Your container needs to be clear enough for visibility.

    It’s important to choose the right container. Our first attempt used a pickle jar (the one we happened to have) that didn’t have super clear glass (it was a bit wavy). Once we got a better jar it worked great, so check the visibility through the glass before choosing a jar. We also tried this with a short jar (about 3″ tall), and that didn’t work as well because the glass frosted over from the cold too quickly. Get a taller one (more like 6-8″ high).

    Some websites advocate using permanent marker on the inside of the jar lid to make it black, but we found that the alcohol removed the marker so this didn’t work so well. Based on advice from the Stong book, we ended up buying some velvet (about half a yard) from the fabric store and cutting a piece that was about 1.5 feet square. We placed this over the open jar and then screwed the top on (velvet side was inside the jar).

    image

    When you flip that over, it looks like this.

    image

    The nice thing about this setup is that you can cover the block of dry ice (ours was about 10″ square and 1.5″ high) with the velvet, and the cold conducts through the velvet to the metal top. Stong recommended adding a little alcohol to the velvet as well (in addition to charging the sponge with it), and that seemed to work for us. The cloth also covers the dry ice and prevents dry ice “steam” from interfering with viewing. It also prevents direct contact with the dry ice, as a safety measure.

    We then needed to create a light, and we improvised using a headlamp and a can of beans.

    image

    We put this to the side of the jar, with the dry ice underneath.

    Here’s how it looked inside after we put the jar with velvet and the lighting source inside an Amazon pantry box, with the whole thing on a cookie sheet for ease of carrying. We also put a doubled up towel underneath the dry ice to insulate it.

    image

    Here’s how it looked inside the box with the light on.

    image

    You’ll need to play with the lighting a bit. We used the rest of the velvet to make curtains so you can put your head inside the box for best viewing.

    image

    Soon after the lid cools down you can see tiny droplets falling, like alcohol rain. You have to watch intently for a while before you see this, but once you recognize this effect, you know it’s working. Every 15-30 seconds you’ll see a trail, which is a line of droplets that condensed around a particle of some kind. These trails fall at the same rate as the alcohol rain, so they disappear quickly. We’ve seen a handful of really visible ones, but it’s not like a giant rainstorm of particles, just an occasional one.

    Timing is important for this. Within 5-10 minutes after you place the jar lid on the dry ice, it should be cold enough for the alcohol rain to start. After about 45-50 minutes the jar starts freezing up, so it’s best to get your viewing in relatively soon after you’ve identified the alcohol rain.

    The Stong book mentions finding the little bit of radioactive material that exists in some old smoke alarms, which can in some cases produce many more tracks if you place it near the chamber, but we didn’t have an old smoke alarm and so couldn’t try it.

    Because this was for a science fair where other kids got to see the project, my son made a safety sign:

    image

    Kids will need to be careful not to touch the dry ice or the velvet. That’s the only big hazard here. Adults (or high school age kids) should also be the ones to pour the alcohol onto the sponge and velvet.

    My son Nicholas made a movie (big file, about 57 MB, MP4 format) about our efforts. It starts with a discussion of making the cloud chamber from a metal coffee can, an effort we abandoned because we ran out of time, but then it moves to the design on which we finally settled (we had two designs going at once, just in case). It might help you when making your own. Please forgive the “home video” nature of it, and our messy garage. It even shows the alcohol “rain” (but we didn’t capture any particle trails on the video).

    If you give this project a try, please email me to let me know how it worked out!

  7. An old (2012) story with lessons that are still important today

    image

    I had at some point bookmarked this 2012 article containing a story from BP about reducing greenhouse gas (GHG) emissions and saving money. I’m posting it here now because the lesson it teaches is still important and relevant. BP thought its efforts to reduce GHG emissions would cost money, but instead those efforts generated a positive return.

    Here’s the key paragraph:

    “How could there be that much value available that was only uncovered after the initiative to cut greenhouse gases, in effect to use energy more effectively, and reduce emissions of gases such as methane and halons?  Simply put, almost everyone was busy with other things, and not looking for these savings.  And perhaps more to the point, people had accepted a certain way of doing things that was not optimal, but was the way they had been done for a very long time.  When you reset the context for the operation, which is what the greenhouse gas target setting did, smart operators find a more attractive solution.”

    I wrote about this general lesson in Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs back in 2012, talking about the power of the general approach of “working forward toward a goal”. In BP’s case, the goal was modest GHG emissions reductions of 10%, and setting that goal helped the institution realize possibilities it hadn’t seen before. This approach “frees you from the constraints embodied in your underlying assumptions and worldview” and prompts you to assess ideas that wouldn’t normally come up in the course of normal operations.

    Another insight is that the opportunities that arise from this approach are a renewable resource:

    When I asked my friend Tim Desmond at Dupont whether his Six Sigma team (which is responsible for ferreting out new cost-saving opportunities across some of Dupont’s divisions) would ever run out of opportunities, he said “No way!” Changes in technology, prices, and institutional arrangements create opportunities for cost, energy, and emissions savings that just keep on coming.

    Just because companies operate in a certain way doesn’t make it “optimal” for the current situation. There are always ways to improve operations, cut costs, and reduce emissions. We just need to look.

    Finally, it’s important to set such goals in the context of whole systems integrated design, in which we start from scratch to re-evaluate tried and true ways of performing tasks. Rocky Mountain Institute has for years championed the power of “Factor Ten Engineering”, which allows us to create new ways of accomplishing the same tasks with substantial improvements in efficiency and emissions.

    For more on how to combine “working forward toward a goal” with “whole systems integrated design”, see Chapter 6 of Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs. Email me if you’d like a PDF copy of that chapter.

  8. Our article on changes in data center electricity use from 2010 to 2018, out in Science Magazine today

    image

    Our article on global data center electricity use is out today (February 28, 2020) in Science Magazine as a Policy Forum article. 

    The intro of the article gives context:

    Data centers represent the information backbone of an increasingly digitalized world. Demand for their services has been rising rapidly (1), and data-intensive technologies such as artificial intelligence, smart and connected energy systems, distributed manufacturing systems, and autonomous vehicles promise to increase demand further (2). Given that data centers are energy-intensive enterprises, estimated to account for around 1% of worldwide electricity use, these trends have clear implications for global energy demand and must be analyzed rigorously. Several oft-cited yet simplistic analyses claim that the energy used by the world’s data centers has doubled over the past decade and that their energy use will triple or even quadruple within the next decade (3–5). Such estimates contribute to a conventional wisdom (5, 6) that as demand for data center services rises rapidly, so too must their global energy use. But such extrapolations based on recent service demand growth indicators overlook strong countervailing energy efficiency trends that have occurred in parallel (see the first figure). Here, we integrate new data from different sources that have emerged recently and suggest more modest growth in global data center energy use (see the second figure). This provides policy-makers and energy analysts a recalibrated understanding of global data center energy use, its drivers, and near-term efficiency potential.

    Key findings: 

    • Total global data center electricity use increased by only 6% from 2010 to 2018, even as the number of data center compute instances (i.e. virtual machines running on physical hardware) rose to 6.5 times its 2010 level by 2018 (compute instances are a measure of computing output as defined by Cisco). 

    • Data center electricity use rose from 194 TWh in 2010 to 205 TWh in 2018, representing about 1% of the world’s electricity use in 2018. 

    • Computing service demand rose rapidly from 2010 to 2018. Installed storage capacity rose 26-fold, data center IP traffic rose 11-fold, workloads and compute instances rose six-fold, and the installed base of physical servers rose 30%.

    • Computing efficiency rapidly increased, mostly offsetting growth in computing service demand: PUE dropped by 25% from 2010 to 2018, server energy intensity dropped by a factor of 4, the average number of servers per workload dropped by a factor of 5, and average storage drive energy use per TB dropped by almost a factor of 10. 

    • Expressed as energy use per compute instance, the energy intensity of the global data center industry dropped by around 20% per year between 2010 and 2018.  This efficiency improvement rate is much greater than rates observed in other key sectors of the global economy over the same period. 

    • We also showed that current efficiency potentials are enough to keep electricity demand roughly constant for the next doubling of computing service demand after 2018, if policy makers and industry keep pushing efficiency in their facilities, hardware, and software. 

    • We offered three primary areas for policy action: (1) extend current efficiency trends by stressing efficiency standards, best practice dissemination, and financial incentives; (2) increase RD&D investments in next generation computing, storage, and heat removal technologies to deliver efficiency gains when current trends approach their limits, while incentivizing renewable power in parallel; and (3) invest in robust data collection, modeling, and monitoring.
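    The “around 20% per year” decline in energy per compute instance follows directly from the totals reported above; a quick check of the arithmetic:

    ```python
    # Check the implied annual decline in energy use per compute instance,
    # using the totals reported in the article.
    energy_2010, energy_2018 = 194.0, 205.0   # TWh
    instance_growth = 6.5                      # compute instances, 2018 vs. 2010
    years = 2018 - 2010

    # Energy per instance in 2018 relative to 2010, then annualized:
    intensity_ratio = (energy_2018 / energy_2010) / instance_growth
    annual_change = intensity_ratio ** (1 / years) - 1
    print(f"Average annual change in energy per instance: {annual_change:.1%}")
    ```

    This works out to roughly a 20% decline per year, consistent with the bullet above.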

    Articles summarizing the work appeared yesterday in The New York Times, Bloomberg, USA Today, Data Center Dynamics, Wired, Quartz, IFL Science, New Scientist, and One Zero, among other outlets. Google also did a blog post describing their progress in improving data center efficiency over time.

    The Northwestern University news release is here.

    The UCSB news release is here.

    The Lawrence Berkeley National Laboratory release is here.

    The spreadsheet model used for the analysis can be downloaded here: https://zenodo.org/record/3668743#.XmF-Gi2ZPWZ

    The full reference is

    Masanet, Eric, Arman Shehabi, Nuoa Lei, Sarah Smith, and Jonathan Koomey. 2020. “Recalibrating global data center energy-use estimates.” Science. vol. 367, no. 6481. pp. 984. [http://science.sciencemag.org/content/367/6481/984.abstract]

  9. Our analysis of supercomputer efficiency over time

    Sam Naffziger of AMD and I just published our report on the efficiency of supercomputers over time.

    Here’s the abstract:

    The energy efficiency of computing devices is a topic of ongoing research and public interest. While trends in the efficiency of laptops and desktops have been well studied, there has been surprisingly little attention paid to trends in the efficiency of high-performance computing installations (known colloquially as “supercomputers”). This article analyzes data from the industry site Top 500 (http://www.top500.org) to assess how the efficiency of supercomputers has changed over the past decade. It also compares the efficiency and performance of a recently announced supercomputer, scheduled to be completed in 2021, to a simple extrapolation of those historical trends. The maximum performance of the most powerful supercomputers doubled every 2.3 years in the past decade (representing a slowdown from doubling every year from 2002 to 2009), while the efficiency of those computers doubled every 2.1 years from 2009 to 2019.

    The Top 500 data have some issues, but this effort is a reasonable attempt to glean some meaning from them. We focused on analyzing each supercomputer based on the year that it started operation, so we could track meaningful technology trends.  The Top 500 tracks the same supercomputers over time as they move down the list of top 500 machines, so we eliminated all but the first instance of any particular installation’s listing in the Top 500.

    We split the analysis of supercomputer performance into two periods, 2002 to 2009 and 2009 to 2019. The first period shows rapid growth (doubling every year or so), while the second period shows a much slower doubling time of about 2.3 years, as well as much greater variance in the data.

    image

    The efficiency data only start to become reliable around 2009, so that’s where we started the data analysis. Efficiency of supercomputers in the Top 500 data doubled every 2.1 years for the top performing machine, the top 10% of the top performing machines, and for the complete set of machines reported in the Top 500, which is pretty remarkable regularity. One caveat is that the R-squared of the linear regression goes down a lot as we regress on the bigger data sets.
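    Doubling times like these convert directly into equivalent annual growth rates via rate = 2^(1/T) − 1. A quick sketch using the doubling times reported above:

    ```python
    # Convert a doubling time (in years) to an equivalent annual growth rate.
    def annual_growth_from_doubling(doubling_years: float) -> float:
        return 2 ** (1 / doubling_years) - 1

    # Doubling times from the report: performance every 2.3 years (2009-2019),
    # efficiency every 2.1 years.
    print(f"Performance: {annual_growth_from_doubling(2.3):.0%}/year")
    print(f"Efficiency:  {annual_growth_from_doubling(2.1):.0%}/year")
    ```

    That corresponds to roughly 35% and 39% annual growth, respectively.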

    image

    We then focused on the trend for the top performing machine so we could extrapolate that trend and compare it to an upcoming supercomputer (Frontier) built using Cray and AMD technology. The performance trend data show that Frontier will be significantly above the trend line when it starts operation, which is expected in 2021.

    image

    The story is the same for efficiency, although Frontier’s height above the trend line is less dramatic than for performance.

    image

    The details of how Frontier is expected to achieve these results are not yet public, but the article discusses some of the most promising areas for efficiency improvements and highlights the need for future work, especially in the area of co-design of hardware and software.

    Koomey, Jonathan, Zachary Schmidt, and Samuel Naffziger. 2019. Supercomputing Performance And Efficiency: An Exploration Of Recent History And Near-Term Projections. Burlingame, CA: Koomey Analytics.  [https://www.amd.com/en/system/files?file=documents/Supercomputing-Performance-Efficiency.pdf]

  10. Go ahead and watch a movie on Netflix!

    image

    In October 2019, many news outlets (including Phys.org) reported that watching half an hour of Netflix would emit the same amount of carbon dioxide (1.6 kg) as driving four miles. This appears to be yet another amazing “factoid” about information technology’s environmental footprint that has little relationship to reality. 

    I dug into the calculations, at the prompting of the BBC, and figured out the real story. Half an hour of Netflix emits less than 20 grams of carbon dioxide, probably much less. The BBC interviewed me last week and did a nice story about it.
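    The gap between the viral claim and the corrected estimate is easy to quantify from the figures above:

    ```python
    # Compare the viral claim to the corrected estimate for half an hour of Netflix.
    claimed_g = 1600.0   # "1.6 kg of CO2" from the widely reported claim, in grams
    corrected_g = 20.0   # upper bound of the corrected estimate, in grams

    overstatement = claimed_g / corrected_g
    print(f"The viral claim overstates emissions by at least {overstatement:.0f}x")
    ```

    Since 20 grams is an upper bound (“probably much less”), the original claim was off by a factor of at least 80.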

    Listen to the episode here (it is the first story in the 28 minute show).

    Download a podcast version here (grab the one from Friday January 24, 2020, titled “Netflix and Chill”).
