Why "drain America first" is not a jobs policy

Michael Levi of the Council on Foreign Relations did a useful analysis last fall of the jobs the US could expect from drastically reducing oversight and expanding oil and gas drilling, both on and offshore.  There are a few important insights from this work, the main one being that advocates for this approach vastly overestimate the number of jobs such a policy would create.  This overestimate results mainly from false assumptions, a big one being the belief that the Obama Administration has somehow strangled oil and gas development in the US.  Climate Progress, referring to a Wall Street Journal article (subscription required), summarized the situation as follows:

America’s Oil Production Grew Faster Than Any Other Country in Last Three Years

Federal forecasters are expected to confirm on Monday what the energy industry already knows: Oil production is surging in the U.S.
The U.S. Energy Information Administration is likely to raise by a substantial amount its existing estimate that U.S. oil production will grow by 550,000 barrels per day by 2020, to just over six million barrels daily.
The forecast will include new production data from developing oil fields, including the Bakken shale area in North Dakota, which could hold as much as 4.3 billion barrels of recoverable oil. North Dakota’s output of oil and related liquids topped 500,000 barrels per day in November, meaning that the state pumped more oil than Ecuador. In fact, U.S. oil production grew faster than in any other country over the last three years and will continue to surge as drillers move away from natural gas due to a growing gas glut, experts say. The glut has sent natural-gas prices to a 10-year low.

Scientists slowly adopting new web tools to promote rapid innovation

The NY Times had an article Monday 16 January 2012 discussing how new technologies are challenging the role of traditional journals in science.  Having free and open access to scientific results is one way to accelerate technological change.  The old model of closed journals, which people and institutions pay to access, will eventually give way to the “Open Access” model (where authors pay a fee to publish but the article is then freely available to everyone), and that’s a good thing for anyone who thinks we need more rapid technical innovation to help fix the problems humanity faces (I also believe we need innovation in our values, behaviors, and institutional arrangements, not just in science and technology, but that’s a separate discussion).

The article also talks about new ways to use collaborative web technologies to accelerate innovative scientific thinking, and we’re just at the beginning of learning how to tap these new technologies for this purpose.  More, please!

The article refers to a recent book by Michael Nielsen that discusses these trends more thoroughly, titled Reinventing Discovery:  The New Era of Networked Science.

The false tradeoff between economy and environment

Climate Progress today once again summarizes the reasons why the alleged tradeoff between economic growth and environmental protection is really a false choice.  This is a story that cannot be told too often, given how plausible the tradeoff sounds and how pervasive the mistaken belief in it is.

Here’s the beginning few paragraphs of the Climate Progress story:

“A top GE executive is calling the political battle between economy and environment “nonsense.”

In a video interview (featured below) at an international clean energy investment conference last week, Mark Vachon, vice president of GE’s successful Ecomagination program, hailed “environmental performance” as a key driver for business.

“There’s this theory that you have to pick one: economics or environmental performance. That’s nonsense. Innovation is the way you can have both,” said Vachon.”

I like to talk about it this way:  We’re going to get our energy services one way or another. Either we’ll get them from conventional fossil fuels (like oil, gas, or coal) or we’ll deliver them with some combination of energy efficiency and non-fossil alternative sources. There will be jobs and economic growth generated either way, so the real question is, do we want jobs and economic activity that threaten our climate and cost us dearly in other kinds of pollution, or jobs and economic activity that don’t?  I choose the latter!

Of course, the question of cost comes up when framing the issue this way, but this concern is easily addressed by focusing on total societal cost, including externalities. For climate change it’s impossible to assess those risks precisely (even though we know they are real and substantial), but the external costs of other pollutants from fossil fuels are well established, and they are large enough to make many renewables already economic from society’s perspective even without counting climate risks.  For recent peer-reviewed treatments of externalities, see Epstein, Paul R., Jonathan J. Buonocore, Kevin Eckerle, Michael Hendryx, Benjamin M. Stout III, Richard Heinberg, Richard W. Clapp, Beverly May, Nancy L. Reinhart, Melissa M. Ahern, Samir K. Doshi, and Leslie Glustrom. 2011. “Full cost accounting for the life cycle of coal.”  Annals of the New York Academy of Sciences.  vol. 1219, no. 1. February 17. pp. 73-98. [http://dx.doi.org/10.1111/j.1749-6632.2010.05890.x], and Muller, Nicholas Z., Robert Mendelsohn, and William Nordhaus. 2011. “Environmental Accounting for Pollution in the United States Economy.”  American Economic Review vol. 101, no. 5. August. pp. 1649–1675.
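The arithmetic behind this framing is simple, and worth making explicit.  Here’s a minimal sketch in Python (the numbers are hypothetical placeholders for illustration only, not figures from the studies cited above) showing how including externalities can flip the ranking of two generation options:

```python
# Social cost = private cost + external (pollution and health) cost.
# All cost figures below are hypothetical, chosen only to illustrate
# how the ranking of options can reverse once externalities count.

def social_cost(private_cents_per_kwh, external_cents_per_kwh):
    """Total cost to society per kWh, in cents."""
    return private_cents_per_kwh + external_cents_per_kwh

coal = social_cost(private_cents_per_kwh=6.0, external_cents_per_kwh=9.0)
wind = social_cost(private_cents_per_kwh=9.0, external_cents_per_kwh=0.5)

# Coal looks cheaper on private cost alone (6.0 vs 9.0 cents/kWh),
# but from society's perspective the ranking reverses:
# coal -> 15.0 cents/kWh, wind -> 9.5 cents/kWh.
```

The point isn’t the particular numbers; it’s that any option whose external costs are a large fraction of its private costs can be “cheap” only by ignoring what society actually pays.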

I wrote more about the issue of externalities in my post about the EPA’s recently announced rules on mercury and other pollutants from power plants.

For those interested in a more detailed treatment of the issue of tradeoffs, see Goodstein, Eban. 1999. The Trade-Off Myth: Fact and Fiction About Jobs and the Environment. Washington, DC: Island Press.

My post giving "Four reasons why cloud computing is efficient" was #3 on the GigaOM list of top ten green stories of 2011

Back on Dec 23rd, 2011, Katie Fehrenbacher of GigaOM wrote about their top ten green stories of 2011 (as measured by number of clicks) and my post giving “Four reasons why cloud computing is efficient” was #3, which ain’t bad.   The original post was published on July 24, 2011.

Brilliant 1 minute video explaining the difference between weather and climate

Climate Progress points to a wonderful video showing the difference between climate and weather.  Watch:

Interesting use of video games for educating the public about climate science

Climate Progress reports on Al Gore’s recent work promoting the use of video games to educate the public about climate science.  This is yet another example of how information technology can be a game changer, and we’re only at the beginning of learning how to use it for this purpose.  Interactive learning is powerful and effective, and I’m hopeful we’ll get a whole lot more clever at using it to help people understand science better.

Steve Lohr's NY Times blog highlighted our work on trends in computing efficiency today

I chatted with Steve Lohr of the NY Times yesterday about the implications of the last six decades of progress in computing efficiency, and his blog today reflects our conversation nicely.  He also talked about my new book, Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs (to be released on February 15, 2012), which gives some examples of why those trends are so powerful and important.

Here’s the intro to the blog post:

A New ‘Law’ for the Mobile Computing Era

The new gadgetry at the International Consumer Electronics Show this week owes a lot to the crisp articulation of ever-increasing computer performance known as Moore’s Law. First proclaimed in 1965 by Intel’s co-founder Gordon Moore, it says that the number of transistors that can be put on a microchip doubles about every two years.

But a new descriptive formulation that focuses on energy use seems especially apt these days. So much of the excitement and product innovation today centers on battery-powered, mobile computing — smartphones, tablets, and a host of devices based on digital sensors, like personal health monitors that track vital signs and calorie-burn rates. And the impact of low-power sensor-based computing is evident well beyond the consumer market.

The trend in energy efficiency that has opened the door to the increasing spread of mobile computing is being called Koomey’s Law. It states that the amount of power needed to perform a computing task will fall by half every one and a half years.

The description of improving energy efficiency was the conclusion of an analysis published last year in the IEEE Annals of the History of Computing, with the title “Implications of Historical Trends in the Electrical Efficiency of Computing.” (An early draft [PDF] of the paper is here.) Jonathan G. Koomey, a consulting professor at Stanford University, was the lead author. His collaborators were three other scientists — Stephen Berard of Microsoft, Maria Sanchez of Carnegie Mellon University, and Henry Wong of Intel. (Mr. Koomey did not use the term “Koomey’s Law,” but others have.)

Like Moore’s Law, the significance of Koomey’s Law is more as an influential observation than a scientific discovery. Both are concepts that credibly measure what has happened and what is possible with investment and effort.

My talk at Stanford on long-term trends in the efficiency of computing

Last Halloween (October 31, 2011) I gave a talk on the long-term trends in the efficiency of computing at Stanford, and I’m finally getting around to posting the link.

On a related note, the trends I discuss in the Stanford talk were listed as #5 in the Popular Mechanics list of the top 10 tech concepts you need to know for 2012.

EPA announces mercury rules for power plants!

The EPA today announced stricter rules on mercury emissions from power plants, which is an important development for those interested in greenhouse gas emissions.  That’s because many of the older coal plants have no pollution controls and have social costs much higher than the value of the electricity they generate.  It’s long past time for these plants to retire.  And it turns out that there’s plenty of spare natural gas-fired generation capacity to pick up the slack, so CO2 emissions from these plants will go down a lot.

Here’s what I wrote in Chapter 5 of my forthcoming book, Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs:

“About 15% of existing US coal plants (about 50 GW out of 300 GW total) are old, inefficient, polluting plants that were grandfathered under the Clean Air Act, so they have few or no pollution controls.[1]  More than half of US coal plants are 35 years of age or older.[2] The total social cost of running many of these plants is higher than the cost of alternative ways of supplying that electricity (even without counting the damages from greenhouse gas emissions),[3] so they represent an obsolete capital stock from society’s perspective.  The most effective action we as a society can take would be to enforce existing environmental regulations, develop new ones (as the US EPA is now considering for mercury, mining, and other environmental issues), and charge these plants the full social cost of the damages they inflict upon us, which would double the cost per kWh of existing coal-fired plants even using low estimates of pollution costs.  This will force lots of old polluting coal plants to retire, many others to reduce their hours of operation, generate lots of economic benefits in reduced health costs, give a boost to coal’s competitors, and reduce greenhouse gas emissions, so it’s a win all the way around.”


[1] Celebi, Metin, Frank C. Graves, Gunjan Bathla, and Lucas Bressan. 2010. Potential Coal Plant Retirements Under Emerging Environmental Regulations. The Brattle Group, Inc.  December 8. [http://www.brattle.com/documents/uploadlibrary/upload898.pdf]

[2] See Figure 5-6 in Lovins, Amory B., Mathias Bell, Lionel Bony, Albert Chan, Stephen Doig, Nathan J. Glasgow, Lena Hansen, Virginia Lacy, Eric Maurer, Jesse Morris, James Newcomb, Greg Rucks, and Caroline Traube. 2011. Reinventing Fire:  Bold Business Solutions for the New Energy Era. White River Junction, VT: Chelsea Green Publishing, p. 175.

[3] For details, see Muller, Nicholas Z., Robert Mendelsohn, and William Nordhaus. 2011. “Environmental Accounting for Pollution in the United States Economy.”  American Economic Review vol. 101, no. 5. August. pp. 1649–1675, and Epstein, Paul R., Jonathan J. Buonocore, Kevin Eckerle, Michael Hendryx, Benjamin M. Stout III, Richard Heinberg, Richard W. Clapp, Beverly May, Nancy L. Reinhart, Melissa M. Ahern, Samir K. Doshi, and Leslie Glustrom. 2011. “Full cost accounting for the life cycle of coal.”  Annals of the New York Academy of Sciences.  vol. 1219, no. 1. February 17. pp. 73-98. [http://dx.doi.org/10.1111/j.1749-6632.2010.05890.x].

More on efficiency trends in computing, from my forthcoming book

My book, Cold Cash, Cool Climate:  Science-based Advice for Ecological Entrepreneurs, will be released on February 15, 2012.   In Chapter 6, I discuss the power of mobile information and communication technology, and I reproduce that section below.

The Power of Mobile ICT

The performance of electronic computers has shown remarkable and steady growth over the past 60 years, a finding that is not surprising to anyone with even a passing familiarity with computing technology. What most folks don’t know, however, is that the electrical efficiency of computing (the number of computations that can be completed per kilowatt-hour of electricity) has doubled about every one and a half years since the dawn of the computer age (See Figure 6-1).[1]  The existence of laptop computers, cellular phones, and personal digital assistants was enabled by these trends, which presage continuing rapid reductions in the power consumed by battery-powered computing devices, accompanied by new and varied applications for mobile computing, sensors, wireless communications and controls.

The most important future effect of these trends is that the power needed to perform a task requiring a fixed number of computations will fall by half every 1.5 years, enabling mobile devices performing such tasks to become smaller and less power consuming, and making many more mobile computing applications feasible.  Alternatively, the performance of some mobile devices will continue to double every 1.5 years while maintaining the same battery life (assuming battery capacity doesn’t improve).  These two scenarios define the range of possibilities.  Some applications (like laptop computers) will likely tend towards the latter scenario, while others (like mobile sensors and controls) will take advantage of increased efficiency to become less power hungry and more ubiquitous.

These technologies will allow us to better match energy services demanded with energy services supplied, and vastly increase our ability to collect and use data in real time.  They will also help us minimize the energy use and emissions from accomplishing human goals, a technical capability that we sorely need if we are to combat climate change in any serious way.  The future environmental implications of these trends are profound and only just now beginning to be understood.[2]

As one of many examples of what is becoming possible using ultra low power computing, consider the wireless no-battery sensors created by Joshua R. Smith of Intel and the University of Washington.[3]  These sensors scavenge energy from stray television and radio signals, and they use so little power (60 microwatts in this example) that they don’t need any other power source.  Stray light, motion, or heat can also be converted to meet slightly higher power needs, perhaps measured in milliwatts.

The contours of this exciting design space are only beginning to be explored.  Imagine wireless temperature, humidity, or pollution sensors that are powered by ambient energy flows, send information over wireless networks, and are so cheap and small that thousands can be installed where needed.   Imagine sensors scattered throughout a factory so pollutant or materials leaks can be pinpointed rapidly and precisely. Imagine sensors spread over vast areas of glacial ice, measuring motion, temperature, and ambient solar insolation at very fine geographical resolution.  Imagine tiny sensors inside products that tell consumers if temperatures while in transport and storage have been within a safe range.  Imagine a solar powered outdoor trash can/compactor that notifies the dispatcher when it is full, thus saving truck trips (no need to imagine this one, it’s real[4]). In short, these trends in computing will help us lower greenhouse gas emissions and allow vastly more efficient use of resources.

Figure 6-1:  Computations per kWh over time (trends in computations per kWh since 1946)

“Graph of computations/kWh from 1946 to 2009” by Jonathan Koomey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.  Permissions beyond the scope of this license may be available at http://www.koomey.com.


[1] Koomey, Jonathan G., Stephen Berard, Marla Sanchez, and Henry Wong. 2011. “Implications of Historical Trends in The Electrical Efficiency of Computing."  IEEE Annals of the History of Computing.  vol. 33, no. 3. July-September. pp. 46-54. [http://doi.ieeecomputersociety.org/10.1109/MAHC.2010.28]

[2] Greene, Kate. 2011. "A New and Improved Moore’s Law.” In Technology Review. September 12. [http://www.technologyreview.com/computing/38548/?p1=A1]

“A deeper law than Moore’s?” In The Economist. October 10, 2011. [http://www.economist.com/blogs/dailychart/2011/10/computing-power]

[3] Eisenberg, Anne. 2010. “Bye-Bye Batteries: Radio Waves as a Low-Power Source.” The New York Times.  New York, NY.  July 18. p. BU3. [http://www.nytimes.com/2010/07/18/business/18novel.html]

[4] [http://bigbellysolar.com/]

NYT today explores the implications of efficiency improvements in computing, even though the article doesn't mention computing efficiency once!

Steve Lohr wrote a great article for the NY Times today titled “The Internet Gets Physical”, where he explores what he thinks is the next big thing (and I think he’s right).  The article states:

“…the protean Internet technologies of computing and communications are rapidly spreading beyond the lucrative consumer bailiwick. Low-cost sensors, clever software and advancing computer firepower are opening the door to new uses in energy conservation, transportation, health care and food distribution. The consumer Internet can be seen as the warm-up act for these technologies.”

Internet watchers are just now waking up to this new potential, which is driven by trends in the efficiency of computing that we identified in our recent paper in the IEEE Annals of the History of Computing (Koomey et al. 2011).  The electrical efficiency of computing (the number of computations that can be completed per kilowatt-hour of electricity) has doubled about every one and a half years since the dawn of the computer age, so that the power needed to perform a task requiring a fixed number of computations will fall by half every 1.5 years.  Devices performing such tasks can thus become smaller and less power consuming, making many more mobile computing applications feasible.
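The doubling math is easy to put in code.  Here’s a quick sketch (assuming a clean 1.5-year doubling time, which is of course an idealization of the historical trend):

```python
# Trend in computing efficiency (approx.): computations per kWh doubles
# every ~1.5 years, so the power needed for a fixed workload halves on
# the same schedule.

DOUBLING_TIME_YEARS = 1.5

def efficiency_multiple(years):
    """Factor by which computing efficiency improves over `years`."""
    return 2 ** (years / DOUBLING_TIME_YEARS)

def power_fraction(years):
    """Fraction of today's power a fixed workload needs after `years`."""
    return 1 / efficiency_multiple(years)

# After a decade, a fixed task needs roughly 1% of the power it needs
# today: efficiency_multiple(10) is about 102.
```

That factor-of-100-per-decade improvement is what makes the sensor applications below plausible: a task that needs a watt today needs about ten milliwatts ten years from now.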

These technologies will allow us to better match energy services demanded with energy services supplied, and vastly increase our ability to collect and use data in real time.  They will also help us minimize the energy use and emissions from accomplishing human goals, a technical capability that we sorely need if we are to combat climate change in any serious way.  The future environmental implications of these trends are profound and only just now beginning to be understood (Greene 2011, The Economist 2011).

If you know of specific examples of innovations in low power computing, sensors, and controls, I’m eager to hear about them, as I’m starting to think about how to describe these trends for a broader audience.  So send me email!

For more background, check out my recent radio interviews on this topic.  Also see the talk I gave at Microsoft in December 2010.

References

Greene, Kate. 2011. “A New and Improved Moore’s Law.” In Technology Review. September 12.

“A deeper law than Moore’s?” In The Economist. October 10, 2011.

Koomey, Jonathan G., Stephen Berard, Marla Sanchez, and Henry Wong. 2011. “Implications of Historical Trends in The Electrical Efficiency of Computing."  IEEE Annals of the History of Computing.  vol. 33, no. 3. July-September. pp. 46-54.

An example of increasing returns to scale in photovoltaic adoption

Climate Progress points to a Yale University study on adoption of photovoltaics (PVs) in residences, in which the time lag between installations falls as the number of installations increases.  The authors call this a “peer” effect, in which a greater concentration of PV panels makes it even more likely that neighbors will also install PVs.

This is a specific example of what in the economics literature is called “increasing returns to scale”.  There are many different forms of this effect, including economies of scale, network externalities, learning by doing, and zero marginal costs for reproducing information (by using information technology).  For those interested in carbon mitigation opportunities, this effect is critical, but it is omitted by assumption from virtually all computable general equilibrium models, because including it would result in path dependence and multiple possible end-points for a given starting point.  The real world is full of such effects, and that’s one reason why conventional economic assessments of the costs of reducing carbon emissions almost invariably overestimate the costs of taking action.  I will have a lot more to say about increasing returns in upcoming posts.
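To see how a peer effect produces increasing returns, consider a toy adoption model (my own illustrative sketch, not the Yale study’s model; the parameter values are made up): each period, a household’s chance of installing PV rises with the number of neighbors who already have panels.

```python
def simulate_adoption(population=1000, p_base=0.001, p_peer=0.0004,
                      periods=25):
    """Expected PV adopters over time under a simple peer effect.

    Each period, every non-adopter installs with probability
    p_base + p_peer * (current adopters), capped at 1.
    """
    adopters = 1.0
    history = [adopters]
    for _ in range(periods):
        p = min(1.0, p_base + p_peer * adopters)
        adopters = min(population, adopters + (population - adopters) * p)
        history.append(adopters)
    return history

h = simulate_adoption()
new_per_period = [b - a for a, b in zip(h, h[1:])]
# Early on, installations per period grow as the installed base grows
# (the increasing-returns phase), so the average time lag between
# successive installations keeps shrinking; growth slows only as the
# market saturates.
```

Note the path dependence this implies: run the same model in two otherwise identical neighborhoods, seed one with a few extra early adopters, and the two end up on very different trajectories for a long time. That sensitivity to starting conditions is exactly what equilibrium models assume away.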

NY Times.com article yesterday on the deluge of data from DNA sequencing

The NY Times.com article yesterday on the deluge of data from DNA sequencing raised a couple of interesting issues for me.

Here’s one important item I noticed:

“The cost of sequencing a human genome — all three billion bases of DNA in a set of human chromosomes — plunged to $10,500 last July from $8.9 million in July 2007, according to the National Human Genome Research Institute.

That is a decline by a factor of more than 800 over four years. By contrast, computing costs would have dropped by perhaps a factor of four in that time span.”

This example highlights an important point:  the cost to perform computations is driven by more than Moore’s law.  It’s also a function of our cleverness in designing efficient algorithms and characterizing problems in the most effective ways, and that kind of cleverness can lead to much more rapid improvements in our ability to do useful computations than just the trends in raw computing horsepower would indicate.
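It’s worth checking the arithmetic in that quote.  A quick calculation, using the figures as quoted:

```python
import math

# Figures as quoted from the NYT piece (NHGRI data):
cost_2007 = 8.9e6    # dollars per genome, July 2007
cost_2011 = 10_500   # dollars per genome, July 2011
years = 4

factor = cost_2007 / cost_2011            # ~848, i.e. "more than 800"
halving_time = years / math.log2(factor)  # ~0.41 years per cost halving

# For comparison, computing costs halving every ~2 years over the same
# span gives a factor of only 2 ** (4 / 2) = 4.
computing_factor = 2 ** (years / 2)
```

In other words, sequencing costs were halving roughly every five months over that period, an order of magnitude faster than the underlying computing trend, which is the gap that algorithmic and methodological cleverness has to explain.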

Now on to my second point.  The big constraint in DNA research is fast becoming our ability to make sense of the voluminous data generated by the new sequencing machines, and that takes human thinking; it’s not just a computational task.  Just as in many other areas likely to see an explosion in data generation (driven by the revolution in ultra low power mobile information technology), there will be big opportunities for those who can combine careful critical thinking with information technology to sort through vast piles of data and help people generate actionable information.  This is also one of the conclusions of the recently released ebook by Brynjolfsson and McAfee titled “Race Against the Machine”, which I highly recommend.

A new radio interview on trends in the energy efficiency of computing

The Canadian Broadcasting Corporation just posted my interview for their “Spark” radio show, which is “an ongoing conversation about technology and culture, hosted by Nora Young”.  It talks about our work on trends in the energy efficiency of computing over the past six decades, which I wrote about here and here.

Unfortunately, the tagline from the announcer at the end incorrectly indicates that I called the long-term trend (a doubling of energy efficiency every year and a half) “Koomey’s law,” when in fact it was MIT’s Technology Review that popularized the term in their recent article.  And the first person to use the term publicly was Max Henrion of Lumina Decision Systems, at a talk he gave at the Uptime Institute Symposium in 2010.  I’ve asked the producer to correct that in the web version.  Ah well.

My interview on Colorado Public Radio about data centers just aired

This interview followed a news piece reporting on why many companies are considering building data centers in Colorado Springs.  The interviewer asked some good questions and the discussion illuminates some important aspects of data centers and electricity use. It’s a good non-technical introduction to these issues.  Listen to it here.

Jonathan Koomey

Koomey researches, writes, and lectures about climate solutions, critical thinking skills, and the environmental effects of information technology.
