“… [N]ew renewable energy capacity could quadruple that of fossil fuels over next three years”

This is utility-scale capacity only. See the footnote from the original post repeated at the bottom. Also, given uncertainties about data availability at federal Web sites during the partial federal shutdown, I have copied the cited report and placed it in a safe, publicly available location.

Quoting:

Washington DC – According to an analysis by the SUN DAY Campaign of the latest data released by the Federal Energy Regulatory Commission (FERC), natural gas dominated new electrical generating capacity in 2018. However, renewable energy sources (i.e., biomass, geothermal, hydropower, solar, wind) may be poised to swamp fossil fuels as new generating capacity is added over the next three years.

FERC’s “Energy Infrastructure Update” report (with data through November 30, 2018) notes that new natural gas generation placed in service during the first 11 months of 2018 totaled 16,687 MW or 68.46% of the total (24,376 MW). Renewable sources accounted for only 30.12%, led by wind (3,772 MW) and solar (3,449 MW).(*)

However, the same report indicates that proposed generation and retirements by December 2021 include net capacity additions by renewable sources of 169,914 MW. That is 4.3 times greater than the net new additions listed for coal, oil, and natural gas combined (39,414 MW).

Net proposed generation additions from wind alone total 90,268 MW while those from solar are 64,066 MW — each greater than that listed for natural gas (56,881 MW). FERC lists only a single new 17-MW coal unit for the three-year period but 16,122 MW in retirements. Oil will also decline by 1,362 MW while nuclear power is depicted as remaining largely unchanged (i.e., a net increase of 69 MW).

FERC’s data also reveal that renewable sources now account for 20.8% of total available installed U.S. generating capacity.(**) Utility-scale solar is nearly 3% (i.e., 2.94%) while hydropower and wind account for 8.42% and 7.77% respectively.

(*) FERC only reports data for utility-scale facilities (i.e., those rated 1-MW or greater) and therefore its data does not reflect the capacity of distributed renewables, notably rooftop solar PV which accounts for approximately 30% of the nation’s installed solar capacity.

(**) Capacity is not the same as actual generation. Capacity factors for nuclear power and fossil fuels tend to be higher than those for most renewables. For the first ten months of 2018, the U.S. Energy Information Administration reports that renewables accounted for 17.6% of the nation’s total electrical generation – that is, a bit less than their share of installed generating capacity (20.8%).

Source:

FERC’s 6-page “Energy Infrastructure Update for November 2018” was released in early January 2019. In a seeming departure from its norm, FERC did not announce the release of this report on its web page and a specific release date does not appear on the report itself. However, it is assumed the report was issued within the past week. It can be found at: https://www.ferc.gov/legal/staff-reports/2018/nov-energy-infrastructure.pdf. For the information cited in this update, see the tables entitled “New Generation In-Service (New Build and Expansion),” “Total Available Installed Generating Capacity,” and “Proposed Generation Additions and Retirements by October 2021.”


A look at an electricity consumption series using SNCDs for clustering

(Slightly amended with code and data link, 12th January 2019.)

Prediction of electrical load demand, in other words of electrical energy consumption, is important for the proper operation of electrical grids at all scales. RTOs and ISOs forecast demand based upon historical trends, and use these forecasts to assure adequate supply is available.

This is particularly important when supply is intermittent, such as solar PV generation or wind generation, but, to some degree, all generation is intermittent and can be unreliable.

Such prediction is particularly difficult at small and medium scales. At large scale, relative errors are easier to control, because a large number of units drawing upon or producing electrical energy are aggregated together. At the very smallest of scales, it may be possible to anticipate the usage of single institutions or households based upon historical trends and living patterns. This has only partly been achieved in devices like the Sense monitor, and prediction is still far away.

Presumably, techniques which apply to the very small could be scaled to deal with small and moderate size subgrids, although the moderate sized subgrids will probably be adaptations of the techniques used at large scale.

There is some evidence that patterns of electrical consumption directly follow the behavior of a building’s or home’s occupants that day, modulated by outside temperatures and the occurrence of notable or special events. Accordingly, being able to identify the pattern of behavior early in a day can offer powerful prior information for the consumption pattern that will hold later in the day.

There is independent evidence that occupants do, in a sense, select their days from a palette of available behaviors. This has been observed in Internet Web traffic, as well as in secondary signals in emissions from transportation centers. Discovering that palette of behaviors is a challenge.

This post reports on an effort to do such discovery using a time series of electricity consumption, spanning 366 days, from a local high school. Consumption is sampled every 15 minutes.
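
Since the analysis below is done in R, here is a minimal sketch of the data layout assumed in what follows; the vector name kwh and the 1st September start date are illustrative assumptions, with the date range inferred from the cluster listings below.

# A minimal sketch of the assumed layout: a year of 15-minute readings
# reshaped so each row is one day of 96 intervals. `kwh` is a hypothetical
# numeric vector of 366*96 consumption readings in time order.
readings.per.day <- 24 * 60 / 15    # 96 intervals per day
SERIES <- matrix(kwh, ncol = readings.per.day, byrow = TRUE)
rownames(SERIES) <- as.character(seq(from = as.Date("2007-09-01"),
                                     by = "day", length.out = 366))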

Here is a portion of this series, with some annotations:

The segmentation is done automatically with a regime-switching detector. The portion below shows these data atop a short-time Fourier transform (STFT) spectrum of the same:
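
The post does not name the STFT implementation used. As a hedged illustration, the specgram function from the signal package produces such a spectrogram; treating the sampling rate as 96 readings per day puts the frequency axis in cycles per day.

library(signal)   # for specgram; an assumed choice, not necessarily the one used

# Short-time Fourier view of the hypothetical consumption vector `kwh` above.
sg <- specgram(x = kwh, n = 96, Fs = 96, overlap = 48)
plot(sg)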

The point of this exercise is to cluster days together in a principled way, so as to attempt to derive a kind of palette. One “color” of such a palette would be a cluster. Accordingly, if a day is identified, from the preliminary trace of its electricity consumption, as being a member of a cluster, the bet is that the remainder of the day’s consumption will follow the patterns of other series seen in the cluster. If more than one cluster fits, then some kind of model average across clusters can be taken as predictive, obviously with greater uncertainty.


Each of the 366 days of the 2007-2008 academic year was separated out, and pairwise dissimilarities for all days were calculated using the Symmetrized Normalized Compression Divergence (SNCD) described previously. The dissimilarity matrix was used with the default hierarchical clustering function, hclust, in R with its “ward.D2” method. That clustering produced the following dendrogram:

The facilities of the dynamicTreeCut package of R were used to find a place to cut the dendrogram and thus identify clusters. The cutreeDynamic function was called on the result of hierarchical clustering, using the hybrid method, and a minimum cluster size setting of one, to give the cluster chooser free rein.
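
A minimal sketch of that pipeline, assuming D already holds the 366-by-366 SNCD dissimilarity matrix, one row and column per day:

library(dynamicTreeCut)   # for cutreeDynamic

# Hierarchical clustering on the SNCD dissimilarities, then a dynamic cut.
hc <- hclust(as.dist(D), method = "ward.D2")
clusters <- cutreeDynamic(dendro = hc, distM = as.matrix(D),
                          method = "hybrid", minClusterSize = 1)
table(clusters)                       # cluster sizes
split(rownames(SERIES), clusters)     # dates by cluster, as listed below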

There were 5 clusters found. Here they are, presented in several ways.

First, the dates and their weekdays:


$`1`
 2007-09-06  2007-09-07  2007-09-10  2007-09-14  2007-09-17  2007-09-18  2007-09-21  2007-09-25  2007-09-27  2007-10-01  2007-10-02  2007-10-03  2007-10-04  2007-10-09 
 "Thursday"    "Friday"    "Monday"    "Friday"    "Monday"   "Tuesday"    "Friday"   "Tuesday"  "Thursday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" 
 2007-10-10  2007-10-22  2007-10-23  2007-10-29  2007-10-31  2007-11-02  2007-11-05  2007-11-06  2007-11-13  2007-11-21  2007-11-28  2007-12-03  2007-12-04  2007-12-05 
"Wednesday"    "Monday"   "Tuesday"    "Monday" "Wednesday"    "Friday"    "Monday"   "Tuesday"   "Tuesday" "Wednesday" "Wednesday"    "Monday"   "Tuesday" "Wednesday" 
 2007-12-06  2007-12-11  2007-12-12  2007-12-14  2007-12-17  2007-12-18  2007-12-19  2008-01-03  2008-01-04  2008-01-11  2008-01-15  2008-01-16  2008-01-17  2008-01-18 
 "Thursday"   "Tuesday" "Wednesday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" 
 2008-01-22  2008-01-23  2008-01-24  2008-01-29  2008-01-30  2008-01-31  2008-02-05  2008-02-06  2008-02-07  2008-02-11  2008-02-12  2008-02-13  2008-02-25  2008-02-27 
  "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"   "Tuesday" "Wednesday"    "Monday" "Wednesday" 
 2008-02-28  2008-03-10  2008-03-12  2008-03-13  2008-03-14  2008-03-19  2008-03-24  2008-03-25  2008-04-01  2008-04-02  2008-04-03  2008-04-04  2008-04-11  2008-04-23 
 "Thursday"    "Monday" "Wednesday"  "Thursday"    "Friday" "Wednesday"    "Monday"   "Tuesday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"    "Friday" "Wednesday" 
 2008-04-28  2008-04-30  2008-05-05  2008-05-07  2008-05-09  2008-05-12  2008-05-19  2008-05-22  2008-05-27  2008-05-28  2008-06-01  2008-06-02  2008-06-04  2008-06-05 
   "Monday" "Wednesday"    "Monday" "Wednesday"    "Friday"    "Monday"    "Monday"  "Thursday"   "Tuesday" "Wednesday"    "Sunday"    "Monday" "Wednesday"  "Thursday" 
 2008-06-07  2008-06-10  2008-06-13  2008-06-17  2008-06-18  2008-06-19  2008-06-23  2008-06-24  2008-06-27  2008-07-01  2008-07-02  2008-07-05  2008-08-11  2008-08-18 
 "Saturday"   "Tuesday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"   "Tuesday"    "Friday"   "Tuesday" "Wednesday"  "Saturday"    "Monday"    "Monday" 
 2008-08-27 
"Wednesday" 

$`2`
 2007-09-03  2007-09-04  2007-09-08  2007-09-12  2007-09-13  2007-09-15  2007-09-20  2007-09-24  2007-09-29  2007-10-06  2007-10-07  2007-10-08  2007-10-12  2007-10-15 
   "Monday"   "Tuesday"  "Saturday" "Wednesday"  "Thursday"  "Saturday"  "Thursday"    "Monday"  "Saturday"  "Saturday"    "Sunday"    "Monday"    "Friday"    "Monday" 
 2007-10-20  2007-10-27  2007-10-28  2007-10-30  2007-11-03  2007-11-22  2007-11-23  2007-11-26  2007-12-01  2007-12-13  2007-12-24  2007-12-26  2007-12-28  2007-12-31 
 "Saturday"  "Saturday"    "Sunday"   "Tuesday"  "Saturday"  "Thursday"    "Friday"    "Monday"  "Saturday"  "Thursday"    "Monday" "Wednesday"    "Friday"    "Monday" 
 2008-01-05  2008-01-14  2008-01-21  2008-01-25  2008-02-02  2008-02-04  2008-02-09  2008-02-10  2008-02-15  2008-02-18  2008-02-19  2008-02-20  2008-02-21  2008-02-22 
 "Saturday"    "Monday"    "Monday"    "Friday"  "Saturday"    "Monday"  "Saturday"    "Sunday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" 
 2008-03-04  2008-03-06  2008-03-15  2008-03-18  2008-03-23  2008-03-28  2008-03-29  2008-04-05  2008-04-10  2008-04-16  2008-04-17  2008-04-18  2008-04-21  2008-04-22 
  "Tuesday"  "Thursday"  "Saturday"   "Tuesday"    "Sunday"    "Friday"  "Saturday"  "Saturday"  "Thursday" "Wednesday"  "Thursday"    "Friday"    "Monday"   "Tuesday" 
 2008-04-25  2008-05-01  2008-05-02  2008-05-08  2008-05-21  2008-05-24  2008-05-29  2008-06-08  2008-06-12  2008-06-21  2008-06-25  2008-06-26  2008-07-04  2008-07-06 
   "Friday"  "Thursday"    "Friday"  "Thursday" "Wednesday"  "Saturday"  "Thursday"    "Sunday"  "Thursday"  "Saturday" "Wednesday"  "Thursday"    "Friday"    "Sunday" 
 2008-07-07  2008-07-13  2008-07-18  2008-07-21  2008-07-22  2008-07-23  2008-07-24  2008-07-29  2008-07-30  2008-08-01  2008-08-02  2008-08-05  2008-08-06  2008-08-08 
   "Monday"    "Sunday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"    "Friday"  "Saturday"   "Tuesday" "Wednesday"    "Friday" 
 2008-08-09  2008-08-10  2008-08-12  2008-08-13  2008-08-15  2008-08-16  2008-08-20  2008-08-28 
 "Saturday"    "Sunday"   "Tuesday" "Wednesday"    "Friday"  "Saturday" "Wednesday"  "Thursday" 

$`3`
 2007-09-05  2007-09-11  2007-09-19  2007-09-26  2007-09-28  2007-10-05  2007-10-11  2007-10-16  2007-10-17  2007-10-18  2007-10-19  2007-10-24  2007-10-25  2007-10-26 
"Wednesday"   "Tuesday" "Wednesday" "Wednesday"    "Friday"    "Friday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" "Wednesday"  "Thursday"    "Friday" 
 2007-11-01  2007-11-07  2007-11-08  2007-11-09  2007-11-14  2007-11-15  2007-11-16  2007-11-19  2007-11-20  2007-11-27  2007-11-29  2007-11-30  2007-12-07  2007-12-10 
 "Thursday" "Wednesday"  "Thursday"    "Friday" "Wednesday"  "Thursday"    "Friday"    "Monday"   "Tuesday"   "Tuesday"  "Thursday"    "Friday"    "Friday"    "Monday" 
 2007-12-20  2007-12-21  2007-12-27  2008-01-02  2008-01-07  2008-01-08  2008-01-09  2008-01-10  2008-01-28  2008-02-01  2008-02-08  2008-02-14  2008-02-26  2008-02-29 
 "Thursday"    "Friday"  "Thursday" "Wednesday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"    "Friday"    "Friday"  "Thursday"   "Tuesday"    "Friday" 
 2008-03-03  2008-03-05  2008-03-07  2008-03-08  2008-03-11  2008-03-17  2008-03-26  2008-03-27  2008-03-31  2008-04-07  2008-04-08  2008-04-09  2008-04-14  2008-04-15 
   "Monday" "Wednesday"    "Friday"  "Saturday"   "Tuesday"    "Monday" "Wednesday"  "Thursday"    "Monday"    "Monday"   "Tuesday" "Wednesday"    "Monday"   "Tuesday" 
 2008-04-24  2008-04-29  2008-05-06  2008-05-13  2008-05-14  2008-05-15  2008-05-16  2008-05-20  2008-05-23  2008-05-30  2008-06-03  2008-06-06  2008-06-09  2008-06-11 
 "Thursday"   "Tuesday"   "Tuesday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"   "Tuesday"    "Friday"    "Friday"   "Tuesday"    "Friday"    "Monday" "Wednesday" 
 2008-06-14  2008-06-16  2008-06-22  2008-07-14  2008-07-25  2008-08-19  2008-08-26 
 "Saturday"    "Monday"    "Sunday"    "Monday"    "Friday"   "Tuesday"   "Tuesday" 

$`4`
2007-09-01 2007-09-02 2007-09-09 2007-09-16 2007-09-22 2007-09-23 2007-09-30 2007-10-13 2007-10-14 2007-10-21 2007-11-04 2007-11-10 2007-11-11 2007-11-12 2007-11-17 2007-11-18 
"Saturday"   "Sunday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Monday" "Saturday"   "Sunday" 
2007-11-24 2007-11-25 2007-12-02 2007-12-08 2007-12-09 2007-12-15 2007-12-16 2007-12-22 2007-12-23 2007-12-25 2007-12-29 2007-12-30 2008-01-01 2008-01-06 2008-01-12 2008-01-13 
"Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"  "Tuesday" "Saturday"   "Sunday"  "Tuesday"   "Sunday" "Saturday"   "Sunday" 
2008-01-19 2008-01-20 2008-01-26 2008-01-27 2008-02-03 2008-02-16 2008-02-17 2008-02-23 2008-02-24 2008-03-01 2008-03-02 2008-03-09 2008-03-16 2008-03-21 2008-03-22 2008-03-30 
"Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday"   "Friday" "Saturday"   "Sunday" 
2008-04-06 2008-04-12 2008-04-13 2008-04-19 2008-04-20 2008-04-26 2008-04-27 2008-05-03 2008-05-04 2008-05-10 2008-05-11 2008-05-17 2008-05-18 2008-05-25 2008-05-31 2008-06-15 
  "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" 
2008-06-29 2008-07-12 2008-07-19 2008-07-20 2008-07-26 2008-07-27 2008-08-03 2008-08-17 2008-08-24 2008-08-31 
  "Sunday" "Saturday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday"   "Sunday"   "Sunday" 

$`5`
 2008-03-20  2008-05-26  2008-06-20  2008-06-28  2008-06-30  2008-07-03  2008-07-08  2008-07-09  2008-07-10  2008-07-11  2008-07-15  2008-07-16  2008-07-17  2008-07-28 
 "Thursday"    "Monday"    "Friday"  "Saturday"    "Monday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Monday" 
 2008-07-31  2008-08-04  2008-08-07  2008-08-14  2008-08-21  2008-08-22  2008-08-23  2008-08-25  2008-08-29  2008-08-30 
 "Thursday"    "Monday"  "Thursday"  "Thursday"  "Thursday"    "Friday"  "Saturday"    "Monday"    "Friday"  "Saturday" 

Note that most of the weekend days are in cluster 4 along with a Christmas Tuesday (25 December 2007) and Veterans Day (observed) on a Monday, 12 November 2007, and a Good Friday, 21 March 2008. Assigning meanings to the other clusters depends upon having events to mark them with. It’s known, for example, that the last day of school in 2008 was 20th June 2008. Unfortunately, the academic calendars for 2007-2008 have apparently been discarded. (I was able to find a copy of the 2008 Westwood High School yearbook, but it is not informative about dates, consisting primarily of photographs.) Accordingly, it’s necessary to look for internal consistency.

There is a visual way of representing these findings. The figure below, a reproduction of the one at the head of the blog post, traces energy consumption for the high school during each day. The abscissa shows hours of the day, broken up into 96 15-minute intervals. For each of 366 days, the energy consumption recorded is plotted, and the lines connected. Each line is plotted in a different color depending upon the day of the week. The colors are faded by adjusting their alpha value so they can be seen through.

Note how days with flat energy consumption tend to be in a single color. These are apparently weekend days.

Atop each of the lines describing energy consumption, a black numeral has been printed which gives the cluster number to which the day was assigned. These are printed at the highest points of their associated curves, but jittered so they don’t stack atop one another and become hard to distinguish.
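
A hedged sketch of that figure, reusing the SERIES matrix and clusters vector from above:

# One translucent line per day, colored by weekday; jittered cluster
# numerals printed at each curve's peak.
wd   <- factor(weekdays(as.Date(rownames(SERIES))))
cols <- adjustcolor(rainbow(7)[as.integer(wd)], alpha.f = 0.3)
matplot(t(SERIES), type = "l", lty = 1, col = cols,
        xlab = "15-minute interval of the day", ylab = "energy consumption")
text(x = jitter(apply(SERIES, 1, which.max), amount = 2),
     y = apply(SERIES, 1, max), labels = clusters, cex = 0.6)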


The clusters go along with consumption characteristics. A proactive energy management approach would entail examining the activities done on the days in each of the clusters. Of special interest would be clusters, such as clusters 1 and 3, which have very high energy usage.

Code and data

The code and data reviewed here are available in my Google Drive replacement for a Git repository.

Future work

I am next planning to apply this clustering technique to long-neglected time series of streamflow in Sharon, MA and on the South Shore.


On plastic bag bans, and the failure to realize economic growth cannot be green

(Updated 2019-01-12.)

Despite the surge of interest in plastic bag bans, the environmental sustainability numbers haven’t been run. For example, it makes no sense to substitute paper bags for plastic ones, even if the paper is recycled, because paper is a nasty product to make, and more emissions are involved in shipping paper bags than plastic ones. Paper bags are heavier, get wet, and cost towns and residents more to recycle or dispose of.

The City of Cambridge, Massachusetts, put fees on all retail bags, but did that after studying the matter for seven years. Reports on their study are available at the City of Cambridge Web site.

Even reusable bags have an environmental impact of their own and, if adopted, must be reused one or two hundred times to offset their upstream environmental impacts in comparison with plastic bags, downstream impacts and all. The biggest problem people have with reusable bags is remembering to bring them along.

We don’t really know what happens to plastic bags in oceans, apart from anecdotal evidence of harm to macroscale creatures. Cigarette filters and microplastics seem to persist.

See the podcast from BBC’s “Costing the Earth” for some of the complexities.

Wishful environmentalism can be damaging: It consumes policy good will and energy on the part of activists, and misses addressing substantial problems, like expansive development, which cause far greater harm to the natural world. And, worse, the “feel good” of not using plastic bags, or of helping to ban them, tends to justify personal behaviors which are more damaging, such as taking another aircraft flight for fun that hasn’t been properly offset in its emissions (*). Air travel is a huge contributor and has, thus far, never been successfully penalized for its contributions to human emissions. The last round on that was fought during the Obama administration, which fiercely negotiated with Europe so that U.S. carriers would not have to pay extra fees for landing in EU airports.

The hard fact is economic growth cannot be green. Quoting:

Study after study shows the same thing. Scientists are beginning to realize that there are physical limits to how efficiently we can use resources. Sure, we might be able to produce cars and iPhones and skyscrapers more efficiently, but we can’t produce them out of thin air. We might shift the economy to services such as education and yoga, but even universities and workout studios require material inputs. Once we reach the limits of efficiency, pursuing any degree of economic growth drives resource use back up.

These problems throw the entire concept of green growth into doubt and necessitate some radical rethinking. Remember that each of the three studies used highly optimistic assumptions. We are nowhere near imposing a global carbon tax today, much less one of nearly $600 per metric ton, and resource efficiency is currently getting worse, not better. Yet the studies suggest that even if we do everything right, decoupling economic growth with resource use will remain elusive and our environmental problems will continue to worsen.

This sounds discouraging, but I am not discouraged. The natural world has repeatedly dealt with species which were resource hogs. That it ends poorly for the species who do is a salutary lesson for those which can observe it, assuming they learn.

Claire bought me a wonderful book for the holidays. It’s Theory-based Ecology by Pásztor, Botta-Dukát, Magyar, Czárán, and Meszéna, and I got it for my Kindle Oasis. It has a number of themes but two major ones are (1) exponential growth of unstructured populations, and (2) the inevitability of population regulation. By the latter they mean organism deaths due to insufficient resources, or, in other words, growth beyond the carrying capacity.
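
For concreteness, those two themes can be contrasted in a few lines of R; this is an illustrative sketch with made-up parameters, not anything taken from the book.

t  <- seq(0, 10, by = 0.1)
r  <- 1      # intrinsic growth rate
K  <- 100    # carrying capacity
N0 <- 1      # initial population
N.exponential <- N0 * exp(r * t)                        # unregulated growth
N.logistic    <- K / (1 + ((K - N0)/N0) * exp(-r * t))  # regulated at K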

In our case, that kind of collapse or growth is mediated by an economic system, one which suffers its own periodic collapses. Accordingly, the choice is whether to keep hands off and allow such a collapse, via a Minsky moment, to occur on its own or, instead, to intervene and have a controlled descent. We are not as self-sustaining as we collectively think, and developed countries, although wealthier and replete with resources, also have a greater cross-section for impact and harm.

Our choice.

Update, 2019-01-12

From The Hill, “Will a market crash get the action we need on climate change?”:

So, what’s the good news? The end of denial by financial markets and government leaders is nearly at hand. For most investors, the risks of climate change loom beyond their investment horizon. It’s been easy for investors to operate in a speculative carbon bubble, acting as though there are no impending costs to earnings-per-share or to liabilities in their portfolios from the buildup of carbon in the atmosphere. But these costs may increasingly look real, and when investors start taking these costs into account, markets will revalue: not just oil and gas stock, but all stocks.

Companies have facilities that will be flooded or be without needed water for production; supply chains will need to be rebuilt; costs of transportation will increase. What about the costs to financial institutions as communities need to be abandoned because of flood or drought? What are the fiscal consequences to governments of rebuilding airports, roads and other critical infrastructure? What will happen to consumer spending?

There will be winners and losers in this revaluation, but as past speculative bubbles have shown us, when they burst, markets move very quickly.

Government leaders have likewise largely operated in a bubble. It is the rare leader who can spend political (or taxpayer) capital on addressing an over-the-horizon problem. When the bubble bursts, government leaders will need to address the real concerns of rebuilding infrastructure, food and water security, and public health threats that will be seen by voters as imminent.


(*) This is actually pretty straightforward to do. Here’s our formula.

There is something called the New England Wind Fund. Essentially, contributions are used to buy WRECs, and one WREC prevents 842 pounds of CO2 emissions on the electric grid. The site carbonfootprint.com offers a CO2 travel calculator. It tells how much CO2-equivalent is produced by a flight. (They offer calculators for other modes of travel, too.) They also offer a vehicle for offsetting right on the spot, but I do not recommend using it. They do, however, make available a check box for additional forcing effects, which I always check. This is because emissions at typical aircraft altitudes are worse than at sea level or on the ground.

The result is in metric tonnes, where 1 metric tonne is 1000 kilograms. There are 2.2 lbs per kilogram. So 1 WREC prevents about 383 kilograms, or 0.383 metric tonnes, of CO2 emissions.

For a trip, calculate the emissions you will make in units of WRECs, and then go to the New England Wind Fund site and contribute US$40 for each WREC.
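
As a worked example of that arithmetic, with a hypothetical flight:

lbs.per.kg  <- 2.2
wrec.tonnes <- 842 / lbs.per.kg / 1000   # about 0.383 tonnes CO2e per WREC
flight.tonnes <- 1.5                     # hypothetical calculator result
wrecs.needed  <- ceiling(flight.tonnes / wrec.tonnes)   # 4 WRECs here
cost.usd      <- 40 * wrecs.needed                      # US$160 contribution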

Done.

I don’t recommend using the carbonfootprint.com offset because, while they could be fine, carbon offsetting programs need constant auditing and checking, and there are some unscrupulous operators out there who use these for greenwashing purposes only. I know New England Wind, though, and these contributions really do get converted into WRECs.


Hogwarts Hymn


My most political post yet … yeah, but it’s me, and Bill Maher is, most of the time, what I’m down with.

Sorry, but there are distinctions to be made.


International climate negotiations, the performance: `Angry and upset’

Climate Adam, who you should follow:


Love your home. The place we call home needs love. But love means nothing, without action.


Series, symmetrized Normalized Compressed Divergences and their logit transforms

(Major update on 11th January 2019. Minor update on 16th January 2019.)

On comparing things

The idea of calculating a distance between series for various purposes has received scholarly attention for quite some time. The most common application is to time series but, particularly with the advent of means to rapidly ascertain nucleotide and ligand series, distances between these are increasingly of inferential interest.

Vitányi’s idea(**)

When considering the divergence between arbitrary sequences, P. M. B. Vitányi has been active since at least 1998, with his work with colleagues Bennett, Gács, Li, and Zurek, per “Information distance”, which appeared in IEEE Transactions on Information Theory in that year. Since 1998, he has published papers with these and other colleagues. The paper which concerns the present post is R. Cilibrasi, P. M. B. Vitányi, “Clustering by compression“, which appeared in IEEE Transactions on Information Theory in 2005. That paper and A. R. Cohen, P. M. B. Vitányi, “Normalized Compression Distance of multisets with applications“, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(8), 1602-1614, explain why this way of comparing sequences is attractive. For example, from Cohen and Vitányi,

Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free, similarity metric based on compression … The way in which objects are alike is commonly called similarity. This similarity is expressed on a scale of 0 to 1 where 0 means identical and 1 means completely different … To define the information in a single finite object one uses the Kolmogorov complexity [15] of that object (finiteness is taken as understood in the sequel). Information distance [2] is the information required to transform one in the other, or vice versa, among a pair of objects … Here we are more concerned with normalizing it to obtain the so-called similarity metric and subsequently approximating the Kolmogorov complexity through real-world compressors [19]. This leads to the normalized compression distance (NCD) which is theoretically analyzed and applied to general hierarchical clustering in [4]. The NCD is parameter-free, feature-free, and alignment-free, and has found many applications in pattern recognition, phylogeny, clustering, and classification ….

This is exciting because it offers a way of, if you will, doing really non-parametric statistics: Not only do inferential procedures based upon these not care about the statistical distributions which the units of study exhibit, they are also opaque to many features which might sidetrack inference with outlying characteristics. These sometimes arise from simple mistakes in measurement or recording. It’s to be expected, I think, that use of such techniques will result in some loss of statistical power in comparison to inferences based upon good parametric models for a given dataset. On the other hand, it’s almost impossible to make specification errors, or to form likelihood functions improperly. Aspects of models which cause these just are not seen.

Definition and some properties

The basic idea of determining how far apart two sequences \textbf{x} and \textbf{y} are begins by positing a practical compressor: an operator, R(\textbf{s}), which takes a sequence \textbf{s} into a compressed version of \textbf{s}, or \textbf{s}_{C}. Then define Z(\textbf{s}) = \rho(R(\textbf{s})), where \rho(\cdot) is a length measure on \textbf{s}_{C}, perhaps the length of the resulting compression in bits or nats. Then

K_{\text{vc}}(\textbf{x}, \textbf{y}) = \frac{Z(\textbf{x}||\textbf{y}) - \min{(Z(\textbf{x}), Z(\textbf{y}))}}{\max{(Z(\textbf{x}), Z(\textbf{y}))}}.

where \textbf{x}||\textbf{y} denotes the concatenation of the sequence \textbf{x} with the sequence \textbf{y}, is interpreted as the normalized compressed divergence between \textbf{x} and \textbf{y}. If

K_{\text{sym}}(\textbf{x}, \textbf{y}) = \frac{K_{\text{vc}}(\textbf{x}, \textbf{y}) + K_{\text{vc}}(\textbf{y}, \textbf{x})}{2}.

is calculated instead, a pseudo-distance is obtained. It is at least symmetric. In general,

K_{\text{vc}}(\textbf{x}, \textbf{y}) \ne K_{\text{vc}}(\textbf{y}, \textbf{x}).

In other words, K_{\text{vc}}(\textbf{x}, \textbf{y}) is not a metric distance. K_{\text{sym}}(\textbf{x}, \textbf{y}) is not one either, but it is at least symmetric. This is why, in the title, I refer to it as a symmetrized divergence and, to differentiate it from the standard Cilibrasi and Vitányi version, call it the SNCD and refer to it as a divergence(*).

In fact, the terminology can be confusing. Both fail to be distances in part because they don’t extend across the non-negative Reals. Nevertheless, it is possible to cluster objects using either. It’s difficult to do inference with these, but defining

\mathcal{L}_{\text{vc}}(\textbf{x}, \textbf{y}) \triangleq \text{logit}(K_{\text{sym}}(\textbf{x}, \textbf{y})).

gets to an index which does extend across the Reals and can be readily used for statistical inference. The logit, \text{logit}(p) = \log{\frac{p}{1-p}}, is a well-known transform for mapping probabilities onto the Real line.

However, the Triangle Inequality is not respected, so the geometry is non-Euclidean.

The point of the post

In addition to introducing Normalized Compressed Divergence to the readership, something which I’ll be using in future posts, I constructed several series which show similarities to one another. The pseudo-distances between related pairs of these were calculated, as were their logits.

Below I show the series, and then I present a table showing the results. Hopefully, this gives some idea of which series are considered similar or not.

The cases

ypredict0, ypredict1

These are two similar instances of a curve and dataset taken from Sivia and Skilling. Both divergences between these curves and intermediate ones are calculated in the results below.

y.trig0, y.trig1, y.trig2, y.trig3

These are instances of a basic sine series and three modulations of it; a sketch generating them follows the list.

  • y.trig0 shows 4 waves from a sine function with frequency one.
  • y.trig1 shows 4 waves from a related function, one with an amplitude modulation of 1 + \frac{x}{2\pi}.
  • y.trig2 shows 4 waves from a related function, one shifted in phase by an eighth of a wavelength, that is \sin{(x + \frac{\pi}{4})}.
  • y.trig3 shows 4 waves from a related function, one chirped in frequency, as \sin{(x (1 + 4\epsilon))}, where \epsilon steps in 0 \le \epsilon \le 8 \pi in thousandths.
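
Here is the sketch promised above. The range of x, four full waves over 0 to 8\pi in steps of thousandths, and the exact chirp ramp for y.trig3, are assumptions reconstructed from the descriptions.

x       <- seq(0, 8*pi, by = 8*pi/1000)
y.trig0 <- sin(x)                     # base sine, frequency one
y.trig1 <- (1 + x/(2*pi)) * sin(x)    # amplitude modulated
y.trig2 <- sin(x + pi/4)              # phase shifted an eighth of a wavelength
epsilon <- seq(0, 1, length.out = length(x))
y.trig3 <- sin(x * (1 + 4*epsilon))   # chirped in frequency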

Some practicalities

When calculating a compressed version of a signal, practical compressors generally demand a string version of the signal. I have chosen to use the xz compressor with its “-9e” preset, as provided by the R internal memCompress function. This means a compressed length \rho(R(\textbf{s})) cannot be zero but, nevertheless, the divergence can be.
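
That nonzero floor is easy to check directly; the code at the end of this post subtracts it out as the quantity zero.

length(memCompress("", type = "xz"))   # small but positive: container overhead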

Also, numeric signals need to be converted to characters. There are many ways that could be done. I originally used the technique in the TSclust package of R, since that was the only package which overtly claimed to offer an NCD dissimilarity measure. But it turns out that has problems, at least for numerical series. There is also a TSdist package, which simply imports the corresponding dissimilarity measures from TSclust.

A problem in TSclust

The TSclust package of R has a dissimilarity calculation. Consulting the source, it’s clear this is not symmetrized, and is patterned literally after Cilibrasi and Vitányi:


###################################################################################
####### Clustering by Compression (2005), Cilibrasi, R., Vitanyi, P.M.B.,  ########
######## Normalized Compression Distance ##########################################
###################################################################################
diss.NCD <- function(x,y, type="min") {
    .ts.sanity.check(x,y)
    comp <- .compression.lengths(x,y, type)  
    (comp$cxy - min(comp$cx,comp$cy)) / max(comp$cx, comp$cy)
}

#common part of compression methods,
#calculate the sizes of the compressed series and of their concatenation
.compression.lengths <- function(x, y, type) {      
    methods <- type
    type = match.arg(type, c("gzip", "bzip2", "xz", "min"))
    if (type == "min") { #choose the best compression method of the three 
        methods <- c("gzip", "bzip2", "xz")
    }
    xy <- as.character(c(x,y))
    x <- as.character(x)
    y <- as.character(y)
    cxym <- sapply( methods, function(m) { length( memCompress(xy, type=m) )})
    cxy <- min(cxym)
    cx <- min(sapply( methods, function(m) { length( memCompress(x, type=m) )}))
    cy <- min(sapply( methods, function(m) { length( memCompress(y, type=m) )}))
    list(cx=cx, cy=cy, cxy=cxy)
}

Apart from the fact that .ts.sanity.check doesn’t work, note that the numeric series are simply converted to character strings in .compression.lengths before being subjected to compression and, subsequently, calculation of the NCD. This cannot be correct.

Consider what would happen if there were two time series, \mathbf{A} and \mathbf{B}, each of length 100. Suppose \mathbf{A} consists of a sequence of 50 copies of the string “3.1” followed by a sequence of 50 copies of the string “3.2”. Suppose \mathbf{B} consists of a sequence of 50 copies of the string “3.2” followed by a sequence of 50 copies of the string “3.1”. Using the dissVSTR function from below, which operates purely on strings, not on statistical series, the SNCD is 0.7.

Now consider two other series, like these but slightly modified, \mathbf{C} and \mathbf{D}, also each of length 100. Suppose \mathbf{C} consists of a sequence of 50 copies of the string “3.01” followed by a sequence of 50 copies of the string “3.02”, and \mathbf{D} consists of a sequence of 50 copies of the string “3.02” followed by a sequence of 50 copies of the string “3.01”. If these were statistical series, the values would be closer to one another than before. But since they are strings, and have an additional numeral zero, dissVSTR actually shows their SNCD is larger, about 0.73, implying they are farther apart.
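
The four string cases can be reproduced with the dissVSTR function given at the end of this post; one way to realize them, concatenating copies without separators, is:

A <- paste0(c(rep("3.1", 50),  rep("3.2", 50)),  collapse = "")
B <- paste0(c(rep("3.2", 50),  rep("3.1", 50)),  collapse = "")
C <- paste0(c(rep("3.01", 50), rep("3.02", 50)), collapse = "")
D <- paste0(c(rep("3.02", 50), rep("3.01", 50)), collapse = "")
dissVSTR(c(A = A, B = B), period = 0)   # SNCD about 0.70
dissVSTR(c(C = C, D = D), period = 0)   # SNCD about 0.73: "farther" apart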

I first tried to fix this in an earlier version of this post by precalculating a combined set of bins for the pooled values of both series, based upon the standard binning logic in the sm package, and then quantiles using the hdquantile function of the Hmisc package. I then assigned a character to each of the resulting bins and used the R cut function to determine the bin to which each element of the individual series belonged, and coded that. This was a bit better, but I still think it was wrong: Bins corresponding to bigger values hadn’t any more mass than smaller bins so, for example, a distance of one bin apart among the large bins was ranked in information distance exactly the same as a distance of one bin apart at the low end.

Remedy

The remedy I’ve chosen is to code differently depending upon whether the SNCD of a pair of objects is calculated on strings (or files) or on numerical series. For strings, there’s no problem going right ahead and calculating the compressed versions of the strings. But for numerical statistical series, as in the last suggestion above, quantiles of the pooled values from the two series (without duplicates removed) are calculated using the hdquantile function from Hmisc. The number of quantile points is one more than the rounded version of 1 + \log_{2}{(n)}, where n is the length of the longer of the two series, in case they are of different lengths. So


nb <- round(log(max(n.x, n.y))/log(2) + 1)
hdquantile(x=c(x, y), probs=seq(0, 1, length.out=1+nb), names=FALSE)

calculates the pooled quantiles.

Next, assign a unique string character to each of the resulting bins. But instead of just using that character by itself, replicate it into a string with the same number of repetitions as the bin number. Thus, if there are m bins, the bin containing the smallest values has its character just of length unity, but the last bin, bin m, has its corresponding character replicated m times.

Finally, run over the two series, and code them by emitting the repeated bin labels corresponding to the bins in which their values fall. This is the result submitted for compression comparison.

There is an additional thing done to accommodate NA values in the series, but the reader can check the code below for that.

The results

There are 6 cases which serve as end members in various capacities, as shown above. The divergences between ypredict0 and ypredict1 are shown, as are divergences between y.trig0 and y.trig1, y.trig0 and y.trig2, y.trig0 and y.trig3, y.trig1 and y.trig2, y.trig1 and y.trig3, and finally y.trig2 and y.trig3.

Also shown are intermediate morphings between end members of these pairs. If \mathbf{y}_{1} is one end member and \mathbf{y}_{2} is a second end member, then

\mathbf{y}_{\epsilon} = (1 - \epsilon) \mathbf{y}_{1} + \epsilon \mathbf{y}_{2}.

is the \epsilon intermediate morphing between the two.
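
In R, that morphing is a one-line helper (names illustrative):

morph <- function(y1, y2, epsilon) (1 - epsilon)*y1 + epsilon*y2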

Using the divergence between ypredict0 and ypredict1 as a baseline, the modulation of the sine case, y.trig0, into its phase-shifted counterpart, y.trig2, is seen as the least change. The amplitude-modulated version of y.trig0, called y.trig1, has a substantial divergence, but not as much as the chirped version, y.trig3. The intermediate versions of these behave predictably. It is a little surprising that once ypredict0 is transformed 0.7 of the way to ypredict1, the divergence doesn’t worsen. Also, in the case of the sinusoids, divergences as the curves approach the other end point do not change monotonically. That isn’t surprising, really, because there’s a lot going on with those sinusoids.

The code producing the results

Intermediate datasets and R code for the above results are available from a directory in my Google Drive replacement for Git.

New versions of the codes have been uploaded to the Google drive. The old versions are still there.

The key code for producing the SNCDs from numerical statistical series is:


library(Matrix) # Data structure for large divergence matrices
library(random) # Source of the random function
library(Hmisc)  # Source of the hdquantile function
library(gtools) # Source of the logit function

numericToStringForCompression<- function(x, y)
{
  n.x<- length(x)
  n.y<- length(y)
# (This is the default number of bins for binning from the sm package, but there are
#  two vectors here, and they need a common number of bins.)
  nb<-  max(round(log(n.x)/log(2)+1), round(log(n.y)/log(2)+1))
  Q<- hdquantile(c(x,y), probs=seq(0,1,length.out=1+nb), names=FALSE)
  alphaBet<- c(letters, LETTERS, sapply(X=0:9, FUN=function(n) sprintf("-%.0f", n)))
  m<- length(Q) - 1
  stopifnot( (1 < m) && (m <= length(alphaBet)) )
  codes<- c("!", mapply(A=rev(alphaBet[1:m]), K=(1:m), 
                 FUN=function(A,K) Reduce(f=function(a,b) paste0(a,b,collapse=NULL), x=rep(A, (1+K)), init="", right=FALSE, accumulate=FALSE)))
  cx<- 1+unclass(cut(x, Q, labels=FALSE))
  cx[which(is.na(cx))]<- 1
  cy<- 1+unclass(cut(y, Q, labels=FALSE))
  cy[which(is.na(cy))]<- 1
  chx<- codes[cx]
  chy<- codes[cy]
  return(list(x=chx, y=chy))
}

compression.lengths<- function(xGiven, yGiven, type="xz")
{
  if (is.numeric(xGiven))
  {
    coding<- numericToStringForCompression(x=xGiven, y=yGiven)
    x<- coding$x
    y<- coding$y
  } else
  {
    stopifnot( is.character(xGiven) )
    stopifnot( is.character(yGiven) )
    x<- xGiven
    y<- yGiven
  }
  #
  xx<- c(x,x)
  yy<- c(y,y)
  xy<- c(x,y)
  yx<- c(y,x)
  stopifnot( is.character(xx) )
  stopifnot( is.character(yy) )
  stopifnot( is.character(xy) )
  stopifnot( is.character(yx) )
  zero<- length(memCompress("", type=type))
  cx<- length(memCompress(x, type=type)) - zero
  cy<- length(memCompress(y, type=type)) - zero
  cxx<- length(memCompress(xx, type=type)) - zero
  cyy<- length(memCompress(yy, type=type)) - zero
  cxy<- length(memCompress(xy, type=type)) - zero
  cyx<- length(memCompress(yx, type=type)) - zero
  return(list(cx=cx, cy=cy, cxx=cxx, cyy=cyy, cxy=cxy, cyx=cyx, csymmetric=(cxy+cyx)/2))
}


divc.NCD <- function(xGiven, yGiven, trans=function(x) x) 
{
  typCompr<- "xz"
  if (is.numeric(xGiven))
  {
    coding<- numericToStringForCompression(x=xGiven, y=yGiven)
    x<- coding$x
    y<- coding$y
  } else
  {
    stopifnot( is.character(xGiven) )
    stopifnot( is.character(yGiven) )
    x<- xGiven
    y<- yGiven
  }
  #
  xy<- c(x,y)
  yx<- c(y,x)
  zero<- length(memCompress("", type=typCompr))
  cx<- length(memCompress(x, type=typCompr)) - zero
  cy<- length(memCompress(y, type=typCompr)) - zero
  cxy<- length(memCompress(xy, type=typCompr)) - zero
  cyx<- length(memCompress(yx, type=typCompr)) - zero
  #
  # Symmetrized NCD of the above.
  mnxy<- min(cx, cy)
  mxxy<- max(cx, cy)
  ncd<- max(0, min(1, ( (cxy - mnxy) + (cyx - mnxy) ) / (2*mxxy) ) )
  #
  return(trans(ncd))
}

divs<- function(SERIES, period=25)
{
  stopifnot( is.data.frame(SERIES) ) 
  N<- ncol(SERIES)
  divergences<- Matrix(0, N, N, dimnames=list(NULL, NULL))
  # Since logits are so common in inference, calculate those, too.
  logit.divergences<- Matrix(-Inf, N, N, dimnames=list(NULL, NULL))
  N1<- N-1
  for (i in (1:N1))
  {
    for (j in ((1+i):N))
    {
      d<- divc.NCD(xGiven=SERIES[,i], yGiven=SERIES[,j], trans=function(x) x)
      divergences[i,j]<- d
      divergences[j,i]<- d
      ld<- logit(d)
      logit.divergences[i,j]<- ld
      logit.divergences[j,i]<- ld
    }
    if ((0 < period) && (0 == (i%%period)))
    {
      cat(sprintf("... did %.0f\n", i))
    }
  }
  stopifnot( !is.null(colnames(SERIES)) )
  colnames(divergences)<- colnames(SERIES)
  rownames(divergences)<- colnames(SERIES)
  colnames(logit.divergences)<- colnames(SERIES)
  rownames(logit.divergences)<- colnames(SERIES)
  #
  # Return Matrix objects, leaving conversion to a matrix, a distance matrix, or a
  # data frame to the consumer of the output. Can't anticipate that here.
  return(list(divergences=divergences, logit.divergences=logit.divergences))
}

dissVSTR<- function(VSTR, period=25, logitp=FALSE)
{
  stopifnot( is.vector(VSTR) ) 
  N<- length(VSTR)
  zero<- length(memCompress(""))
  ncdf<- function(cx, cy, cxy, cyx) { mnxy<- min(cx,cy) ; mxxy<- max(cx,cy) ; return( max(0, min(1, (cxy + cyx - 2*mnxy)/(2*mxxy) ))) }
  #
  # Precompute compressed lengths of each string; these are the cx and cy used below.
  CV<- sapply(X=VSTR, FUN=function(s) length(memCompress(s)) - zero)
  if ((200 < N) & (0 < period))
  {
    cat(sprintf("Preconditioning of %.0f items completed.\n", N))
  }
  #
  if (logitp)
  {
    dInitial<- -Inf
    trans<- logit
  } else
  {
    dInitial<- 0
    trans<- function(x) x
  }
  #
  divergences<- Matrix(dInitial, N, N, dimnames=list(NULL, NULL))
  #
  N1<- N-1
  for (i in (1:N1))
  {
    sx<- VSTR[i]
    cx<- CV[i]
    for (j in ((1+i):N))
    {
      sy<- VSTR[j]
      cy<- CV[j]
      sxy<- sprintf("%s%s", sx, sy)
      syx<- sprintf("%s%s", sy, sx)
      cxy<- length(memCompress(sxy)) - zero
      cyx<- length(memCompress(syx)) - zero
      d<- trans(ncdf(cx, cy, cxy, cyx))
      if (is.nan(d))
      {
        cat("NANs within VSTR. Inspection:\n")
        browser()
      }
      divergences[i,j]<- d
      divergences[j,i]<- d
    }
    if ((0 < period) && (200 < N) && (0 == (i%%period)))
    {
      cat(sprintf("... did %.0f\n", i))
    }
  }
  colnames(divergences)<- names(VSTR)
  rownames(divergences)<- names(VSTR)
  # Return a Matrix object, leaving conversion to a matrix, a distance matrix, or a
  # data frame to the consumer of the output. Can't anticipate that here.
  return(divergences)
}

You are welcome to use this, but please acknowledge its source:

Jan Galkowski from 667-per-cm.net.

Thanks.

(*) L. Pardo, Statistical Inference Based on Divergence Measures, Chapman & Hall/CRC, 2006.

(**) Others have done important work in this area, including I. J. Taneja (2013) in “Generalized symmetric divergence measures and the probability of error“, Communications in Statistics – Theory and Methods, 42(9), 1654-1672, and J.-F. Coeurjolly, R. Drouilhet, and J.-F. Robineau (2007) in “Normalized information-based divergences“, Problems of Information Transmission, 43(3), 167-189.


Winter composting: How to make friends with microbes and defy weather (podcast, too)

(This blog post is accompanied by an explanatory podcast. See below.)

Many people compost. It can be easy or hard, depending upon your tolerance for turning and work, and upon the Wild Thing who wants a free meal.

I can imagine, and have known, dogs getting into bins, and raccoons supposedly get into them. There’s plenty of photo evidence online of raccoon footprints near compost piles and bins, and I have seen muddy footprints of raccoons on the outside of ours, but I haven’t found a single online photo of a raccoon in a compost bin. Raccoons might be hardy, but my wife, Claire, once had a granddog that got into some partly cooked compost and soon developed seizures, because the partly digested compost had neurotoxins from microbial activity. This did require expensive hospitalization.

So it’s a good idea to keep your compost bins protected from stray mammals.

That’s a shot of our twin compost bins. The active one is above. The other is “cooking down” over the winter, and should be ready for garden and yard use by late Spring. We have a New England-style Blue Bin which collects rainwater from our roof and gutter system. We use this almost exclusively to add water to the compost in late Spring, Summer, and Autumn, and wash out pails and implements.

I took out some compost today and, with the handle of the small pitchfork I use, an essential tool for composting, I was able to knock ice away from the interior of the Blue Bin spigot and get free running water:

It is 1st January, after all, in greater Boston, Massachusetts.

This brings me to what I want to primarily write and talk about: How to do winter composting. Happy to share.

First, let’s look at our composting setup:

When Claire and I remodelled our kitchen, we had a stainless steel compost bin installed flush with the quartz rock counter. Uncapped, and accompanied by a pail of compost from First Parish Needham, Unitarian Universalist, which we collect from coffee hour there and compost, sharing the chore with our friend, Susan McGarvey, it looks like:

Now, there are several sources on composting which claim meats and cheese are unsuitable. I can imagine that, in the case of open compost piles, or extreme quantities, it might be a good idea to keep these out of your compost. But Claire and I are vegetarians, for the most part, we like our cheese, and we have three cats. That means we have scraps of cat food.

That’s Darla, by the way.

For the most part, we never trash our bio-organic waste. The exception is the litter from the cat litter boxes.

During the Summer and warm months, particularly in drought, keeping the compost piles moist is a key goal. When they start developing flies and lots of insects instead of predominantly worms, that means they are too dry. The typical procedure in non-Winter months is

  1. Keep the moisture in the compost bucket inside to a minimum.
  2. Stir the existing compost well.
  3. Add in the new material, stirring lightly.
  4. Add water. For us, that’s a full compost pail of water.
  5. Stir the compost briskly, mixing in the water and the new material throughout the pile.

In addition, except on really hot days, you should still see worms and, often, steam rising from the pile when you first stir it.

Winter is quite different.

No additional water gets added to the compost. However, the pail is rather wet, as I’ll dump excess coffee and soy milk into it, so the compost can, at times, be floating in ambient organic stuff. (I do the dishes, by the way, so I can control this.)

So here are the steps I do. I explain why later and in the podcast. That’s inserted below.

Get organized.

You saw the two compost bins above. The active one always has this extra-heavy plastic sheet on it, weighed down by rocks. This is a critter deterrent, and it has worked for years. The bins themselves are heavy plastic which resists clawing and chewing. They are actually two balls which could, in principle, be rolled. But I discovered that that means they can’t be loaded more than halfway, because they are too heavy. So they sit next to one another, in place, and I stir them instead.

So Winter composting is really quite different from Summer composting. In fact, we time things to swap bins when the leaves come down off the trees. The key point is to keep the compost pile from freezing solid. The only reasonable heat source is the exothermic activity of the microbes cracking and consuming the foodstuffs and organics themselves. In fact, everything about Winter composting is designed to maximize that.

The compost pile begins as an empty container which is filled with whole leaves. These are compacted a couple of times and topped off. Then, an indentation is made in the center of the leaves with the pitchfork, carving out a hole in the middle which will hold the compost. It doesn’t extend to the ground. At least initially, the compost should float on top of the leaves, supported by them.

Initially, new compost is added into the hole, roughly 50-50 with compost from an old, existing pile, to inoculate the new pile with microbes. Leaves are spread atop, and the bin is closed up. And you’re on your way.

Then, it is critical that adding additional new material be done carefully and gently.

Leaves are gently removed from the top of the compost, and pushed to the side. If it is really cold out, you should endeavor to get the task done quickly.

Then, with the pitchfork, plunge into the bolus of compost, and twist it, trying to disturb the compost side to side as little as possible. The idea is you don’t want to break up the clumps of microbes working on that bolus. You want to aerate it, sure. Make sure you reach all parts of the pile, but concentrate on the center. Don’t attempt to stir it aggressively as you would in the Summer. That’ll make the effort a failure, because it’ll freeze.

In fact, when done correctly, the temperature of the compost in the leaves will exceed 60°C. In all likelihood, this will kill off the worms. For Winter composting, that’s fine. We’re interested in the microbes. There’s no danger of fire, because that demands temperatures well above boiling.

Next, add the compost. Here there are two sources. One came from coffee hour at First Parish, and consists primarily of relatively dry coffee grounds and paper. Add that in first.

Obviously, the plastic bag containing the coffee grounds is left out. As a matter of fact, don’t even attempt to put supposedly compostable plastic bags in a home composting setup, especially in Winter time. These generally don’t break down except in really high temperature industrial composters. And, even then, I wonder.

Next I added in our sloppily wet home compost. The idea is to saturate the relatively dry material with this. Do not add additional water. When it gets cold, it’ll freeze into a block of uselessness which will stink like crazy during the Spring thaw.

A critically important step: Take the pitchfork, but do not stir, not even as gently as above. Instead, simply push the forks down through the new material two to three dozen times. The idea is to drag up old composting material and microbes through the new material, so as to get it well underway before it freezes. The microbes will give off heat, and keep that from happening.

So here’s what it looks like. I’ll sometimes fish a bit of old compost out with the pitchfork and spread atop the new for good measure.

The leaves are scraped back over the compost bolus, and I check to make sure it remains insulated in all directions.

The next step is to close it up, and clean up.

However, today, even though there are no worms in the pile, we have a friend visiting: a slug which, apparently arriving with the oak leaves, has decided that the compost pile is a nice place to hang out during the cold Winter.

I didn’t disturb it and placed it back into the leaves.

I put the two lids atop, the rocks back, and then clean up.

That’s rainwater from our barrel. On days when the barrel is frozen up, I need to plan ahead and take out a pail of preferably warm-to-hot water to help with the cleanup. By the way, here’s a tip on the barrel, intended to prevent something we encountered last Winter. We failed to drain the barrel enough ahead of the deep cold, possibly because of excess rains. And when it froze, it expanded, and rotated by itself on its vertical axis, making it difficult to direct rainwater and such into it.

Our solution was to deliberately drain it deeply after the new compost bin setup, even if this meant we were wasting rainwater.

What to do with the washed-out compost bin’s water, whatever its source?

Claire has a row of chipped, composting leaves off to one side of the back yard. I pour the excess water, with its slurry of food, on top of these. In addition to filtering, this helps the microbes in these side composters, yet isn’t enough to attract any animal who really cares.

I’ve seen wintering and migrating Robins, though, poking around these piles. Perhaps they have some worms and other critters, like the slug shown above.

That’s it.


A technical summary: The biochemistry and microbiology of home composting is still largely unexplored. I have been able to find one comprehensive paper:

E. Ermolaev, C. Sundberg, M. Pell, H. Jönsson, “Greenhouse gas emissions from home composting in practice“, Bioresource Technology, 2014, 151, 174-182.

While this paper’s title suggests its interest is principally greenhouse gas emissions, the authors actually did a serious dive into what was going on. There are several other papers which emphasize the greenhouse gas emissions of composting.

There is also this 2003 paper by Kathryn Parent and the American Chemical Society:

K. E. Parent, “Chemistry and Compost“, 2003, American Chemical Society.

And a recent paper was indicated:

M.A. Vázquez, M. Soto, “The efficiency of home composting programmes and compost quality“, Waste Management, June 2017, 64, 39-50.

Posted in agroecology, argoecology, Botany, Carbon Cycle, composting, ecological services, Ecological Society of America, ecology, engineering, environment, fermentation, First Parish Needham, karma, local self reliance, Nature, science, solid waste management, sustainability, sustainable landscaping, Unitarian Universalism, UU, UU Humanists, UU Needham, water as a resource | Leave a comment

Gov Jerry Brown on Meet the Press, a parting comment on 2018 at Bill Gates’ Notes, and the best climate blog post of 2018

Segment One

Outgoing Governor Jerry Brown of California on NBC’s Meet the Press this morning:

I’ll miss him there, but I don’t think Gov Jerry is going anywhere soon.

Segment Two

Bill Gates Notes offered an end of year summary remark to which I posted a comment today, 30th December 2018 at 12:33 EST (no direct link available, sorry), reproduced below:

Thanks, Bill, for your year end insights, documenting where we are, and your continued leadership.

As someone who grew up with computers (FORTRAN in 6th grade on IBM 1620), and was often dreaming of a technological future, I must say that the only part of that dream which came true was computing, and the Internet. It’s a great part, don’t get me wrong, but I wish we had more in the direction of sustainable economies and living. That said, and at age 66, I remain part of the computing industry, and I continue to be excited by the phenomenon which Marc Andreessen described in 2011, that “Software is eating the world”. Everywhere anyone turns, traditional devices which used to use mechanical connections and actuators are being displaced by general purpose computers, often embedded, and things like electric motors controlled by pulse-modulated signals. These are cheaper, lighter, less power hungry, and offer finer, smarter controls. This extends to analog applications of all kinds, from control boards for music systems and video, to automobile controls. I await the day when they make their long anticipated debut as part of civil engineering projects.

On nuclear, I recently studied the field, and believe that the long lamented negative learning curve it exhibits is due solely to the failure of that industry to create modestly sized modular units which can be produced like commodities. Instead, nuclear power has been a cost-plus business and they build bigger and more elaborate all the time, which means these inevitably overshoot schedules and cost targets. We need something like 1 MW reactors which can be lashed together to obtain both arbitrary sizing and greater reliability. (If I lose one server in a farm of ten thousand, like, who cares?) It would be good if they were portable. It would be especially good if they had design safeguards so the materials could not be diverted for nasty purposes, especially dirty bombs. I believe that’s possible, but I also believe that this will require a triumph of imagination, and I can’t see existing players — any more than IBM in its day — coming to that on their own. I wish hope and purpose in this direction for your efforts. No doubt nuclear power was incentivized in the direction it pursued, but the path may have also depended upon contingencies which no one really chose.

By the way, “Software is eating the world” is the motto of Andreessen Horowitz, the VC company.

Segment Three

And my vote for the best single climate-related blog post of the year is Eli Rabett‘s Heat has no hair. It begins:

Among physicists and chemists, well at least the theoretical side of the latter it is well known that electrons have no hair by which is meant that a bunny can’t tell one electron from another. This has serious consequences in quantum mechanics because in a multi-electron system you have to allow for each electron to be anywhere any electron is and it gets quite complicated. True, when an atom is ionized you can trace the electron as it is expelled from the atom, but you can’t say WHICH electron it was. Same for electron capture. You could identify an atom before it is captured, but once it was captured you can not identify it from any of the others in the atomic system.

The same thing is true of heat. Heat in an object, perhaps better thermal energy, is random motion of atoms and molecules, translation, vibration, whatever. You can say where heat entering an object came from (say radiation from the sun), but if there is more than one source (trivial case). once it is randomized and in the object you can’t say where it came from.

Which brings Eli to the evergreen claim of those who deny the greenhouse effect, that radiation is not important compared to convection.

Read more at the original link. As I wrote in a related comment:

All the best for your continued explanations and wish you happiness, health, and continued good spirits. Your writing is a joy.


Happy New Year, everyone. Let’s hope the Angry Beast continues to be kind, and we learn some respect. To understand how far we have yet to go, and how long we have known, it is worth taking a look at a publication from 2003, an issue of Wild Earth called Facing the Serpent. Although they did not mean The Angry Beast, that’s where we are. As Dr Kate Marvel remarked this year, it will take courage, not hope.

Posted in American Association for the Advancement of Science, American Chemical Society, American Meteorological Association, an ignorant American public, Anthropocene, anti-science, astronomy, atmosphere, attribution, being carbon dioxide, Berkeley Earth Surface Temperature project, Bill Gates, Blackbody radiation, bridge to somewhere, California, carbon dioxide, cement production, climate, climate change, climate zombies, development as anti-ecology, ecological services, economics, Eli Rabett, energy flux, environment, evidence, friends and colleagues, global warming, Grant Foster, greenhouse gases, Hyper Anthropocene, investment in wind and solar energy, Jerry Brown, Lawrence Berkeley National Laboratory, leaving fossil fuels in the ground, meteorology, nuclear power, oceanography, oceans, Principles of Planetary Climate, quantum mechanics, science, sea level rise, solar democracy, solar energy, solar power, sustainability, the energy of the people, the green century, the tragedy of our present civilization, tragedy of the horizon, University of California, University of California Berkeley, water as a resource, wind energy, wind power, wishful environmentalism, zero carbon | Leave a comment

Mark Carney is aligned with the geo-biological-physical everything

Bank of England Governor Mark Carney might not be popular for all his pronouncements, but he’s the most comprehensively educated on the matter of climate risk of anyone in the international discussion groups of the OECD.

Some people will be a climate [change] denier … or take a view that the speed with which domestic policy will change will lag international agreements. People can be on the other side of the spectrum as well. That’s called a market, but the market needs information.

He is, of course, completely right about how markets work.

This has actually moved onwards.

This is definitely worth a look, even if you need to pay for it.

Posted in Anthropocene, capitalism, climate change, economic trade, economics, global warming | Leave a comment

administrative note

Two users — actually one masquerading as two — have been banned from commenting because, for the most part, they have contributed very little to the discussion here, have of late been arrogant and personally insulting, both here and at other blogs where I sometimes comment, and have repeatedly violated the clearly stated Rules of the House. This was done despite an earlier caution.

The offending party claimed, in part, that I banned them from the site, which was not correct. In fact, they were able to post a comment today. I did close commenting on two posts which had some comments and which I did not find productive.

There are no advertisements here, and I pay for this site annually, out of pocket. I do not receive subsidies or reimbursements, and I do not post anything here on behalf of any organization.

Accordingly, I continue to reserve moderation. This is not Speakers Corner.

I do encourage comments. Indeed, I’ve had over 600 since the blog began in 2012, and there have been a bit under 1100 posts. I think it is more in the spirit of the blog when comments are documented heavily, with links to pertinent papers and quotes from pertinent sections or paragraphs. Figures are encouraged. Discussion is fine, but I would like it to be based upon evidence and reasoned argument, not opinion. If that’s too much work, please don’t bother.

Posted in science

Why Americans and Britons work such long hours

Why Americans and Britons work such long hours.


[Images: Gosnold Textile Mills, Orchard Street, New Bedford; a slide on causes of the Industrial Revolution by 1800]

Posted in business, economics, labor, statistics | Leave a comment

Negative Nuclear Power

[Figure: the negative learning curve for nuclear power]

This post was originally a little too concise. (I posted it from my Google Pixel 2.) The referenced papers are Grubler (2010), Boccard (2014), and Escobar-Rangel and Lévêque, as well as a slide presentation by them.

In addition, there is new and important research about why nuclear plants have negative learning curves.

First, there is a rebuttal to a critique of the 2010 learning curve costs work, which of course also cites the critique, so you can find it there.

Second, the work by Escobar-Rangel and Lévêque is especially important, because it is based upon an update of their data, and their working paper and presentation offer a statistically justified explanation of why the learning curve is negative. In short, there are two explanations. First, because procuring utilities are rarely the engineering firms that build a nuclear power plant, the firms that do build them do so on cost-plus terms. This means each plant is different. Second, to improve margins and take advantage of lessons learned, the engineering firms that build reactors have proposed bigger reactors over time, with more safety and other features. More complex jobs are inherently more risky in terms of cost and completion time.

The implicit criticism is that nuclear power reactor procurement should have pursued developing a modular product which could be replicated, achieving scale by adding a number of units together. To the degree they did not do this, the lessons of the learning curve were squandered early in design rather than being realized for end customers. Moreover, it is possible that any procurement with a high price tag suffers this phenomenon: it was seen in the B-2 bomber procurement and is seen in the large nuclear submarines with missile-launching capabilities. While these are supposed to be identical vehicles, they are not, because of their staggered delivery and the shortcuts taken to meet delivery deadlines.
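To make “negative learning” concrete, here is a minimal sketch of Wright’s-law learning curves in Python. The two learning rates are illustrative choices of mine, not estimates from the papers cited above:

```python
# Wright's law: each doubling of cumulative units multiplies unit cost
# by (1 - learning_rate). A negative rate makes costs rise instead.
import math

def unit_cost(first_cost, n, learning_rate):
    b = -math.log2(1.0 - learning_rate)  # progress exponent
    return first_cost * n ** (-b)

for n in (1, 2, 4, 8, 16):
    commodity = unit_cost(100.0, n, 0.20)   # solar-like: 20% cheaper per doubling
    bespoke = unit_cost(100.0, n, -0.20)    # nuclear-like: 20% dearer per doubling
    print(f"unit {n:2d}: commodity {commodity:6.1f}, bespoke {bespoke:6.1f}")
```

Replicated commodity units ride the curve down; one-off cost-plus projects can ride it up.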

Consequently, the conclusion is that the reason nuclear power does not see the advantages seen especially with renewables is that the units for renewables are essentially commodities, and are replicated in large numbers. This appears to be true of some fossil-fuel-based plants as well:

[Figure: decreasing capital costs in energy, from Junginger, Sark, et al., 2010]

It is critically important for units of production to be produced in cookie-cutter fashion.

[Figure: Cour des Comptes data, 2013]

Image | Posted on by | Leave a comment

“Electric Dreams”, from BBC Radio 4’s “Costing the Earth”

I just listened to the podcast from Peter Gibbs of BBC Radio 4’s program, “Costing the Earth”. He recounts the experience of owning and driving a nominally 200-mile-range EV:

Is the time finally right to buy an electric car? Peter Gibbs has just taken the plunge. We join him on his first road trip to see if Britain really is ready to wave goodbye to diesel and petrol.

Gibbs interviews Emission Analytics‘ Nick Molden and Roads Minister Jesse Norman.

Produced by Alasdair Cross.


(Click on figure to see a larger image, and use browser Back Button to return to blog.)
From K. Palmer, J. E. Tate, Z. Wadud, J. Nellthorp, “Total cost of ownership and market share for hybrid and electric vehicles in the UK, US and Japan“, Applied Energy, 2018, 209, 108-119.
Posted in Anthropocene, bridge to somewhere, clean disruption, CleanTechnica, disruption, electric vehicles, electricity, electricity markets, EVs | Leave a comment

Towards the end of 2018, Newtonmas, and on commenting standards that have excelled

I have published 1,036 posts at my blog; the very first appeared on 29th November 2012. It concerned the dangers of indiscriminately using clustering algorithms, such as K-means. For example, K-means cannot successfully recognize many clusters which are not convex.
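For anyone who wants to see that failure directly, here is a minimal sketch using scikit-learn’s two-moons generator. The dataset and parameter choices are mine, for illustration, not from the original post:

```python
# K-means draws straight (Voronoi) boundaries between clusters, so it
# cannot recover two interleaved, non-convex half-moon clusters.
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons

X, _ = make_moons(n_samples=500, noise=0.05, random_state=7)
labels = KMeans(n_clusters=2, n_init=10, random_state=7).fit_predict(X)

# Each moon gets sliced in half rather than recovered whole.
plt.scatter(X[:, 0], X[:, 1], c=labels, s=10)
plt.title("K-means mislabels non-convex clusters")
plt.show()
```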

In case there’s any doubt, and also: I am hardly retired. I am fully employed as a data scientist, statistician, and quantitative engineer, and I intend to be for several more years, at least. There are lots of pretty problems out there in the Internet space, and people can check out some of my more recent adventures.

Also, WordPress reports that for the 1,036 posts, I have received 602 comments which met the commenting standards I have clearly specified. That’s about 5 comments for every 8 posts, but, of course, they are bursty. I have no idea what sets some posts up for comments and not others.

With a few exceptions, which I treasure, most comments have come in on policy posts not technical posts.

Although he doesn’t know it, my mentor and introduction to the world of technical blogging is the great P Z Myers with his blog, Pharyngula. The way he dealt with creationists and the rabidly religious was and is wonderful. BTW, I am an atheist and physical materialist, too. I celebrate Newtonmas.

By After Godfrey Kneller, http://www.newton.cam.ac.uk/art/portrait.html, Public Domain.

Oh, and by the way, the above is an expanded but polite rebuttal to a malicious, slanderous screed made elsewhere in a comment about me. I do not know what hair got up that model train track lovin’ guy’s ass, but he’s no longer welcome to comment here. I don’t know what he’s about. I’m going to dismiss him as a denier-in-sheep’s-clothing whose purpose is to tie up people who want climate action and who have better things to do.

Posted in blog, P Z Myers, Pharyngula, Wordpress | 1 Comment

Earth’s energy imbalance: Rise in ocean heat content is accelerating

L. Cheng, K. E. Trenberth, J. Fasullo, T. Boyer, J. Abraham, J. Zhu, “Improved estimates of ocean heat content from 1960 to 2015“, Science Advances, 10 March 2017, 3(3), e1601545.

Abstract

Earth’s energy imbalance (EEI) drives the ongoing global warming and can best be assessed across the historical record (that is, since 1960) from ocean heat content (OHC) changes. An accurate assessment of OHC is a challenge, mainly because of insufficient and irregular data coverage. We provide updated OHC estimates with the goal of minimizing associated sampling error. We performed a subsample test, in which subsets of data during the data-rich Argo era are colocated with locations of earlier ocean observations, to quantify this error. Our results provide a new OHC estimate with an unbiased mean sampling error and with variability on decadal and multidecadal time scales (signal) that can be reliably distinguished from sampling error (noise) with signal-to-noise ratios higher than 3. The inferred integrated EEI is greater than that reported in previous assessments and is consistent with a reconstruction of the radiative imbalance at the top of atmosphere starting in 1985. We found that changes in OHC are relatively small before about 1980; since then, OHC has increased fairly steadily and, since 1990, has increasingly involved deeper layers of the ocean. In addition, OHC changes in six major oceans are reliable on decadal time scales. All ocean basins examined have experienced significant warming since 1998, with the greatest warming in the southern oceans, the tropical/subtropical Pacific Ocean, and the tropical/subtropical Atlantic Ocean. This new look at OHC and EEI changes over time provides greater confidence than previously possible, and the data sets produced are a valuable resource for further study.

Posted in Anthropocene, climate change, geophysics, global warming, Hyper Anthropocene, oceanography, oceans, science

People ask me, When you want to feel optimistic, or be optimistic, what do you do?

Updated, 1st January 2019

Simple. I bring up the latest from Professor Tony Seba of Stanford University, and listen.

I am also a devoted enthusiast for the energy programs of the Institute for Local Self-Reliance, per John Farrell’s excellent work.

Update: 2019-01-01

China’s big push for solar energy:


There’s a detailed report, too.

Bloomberg New Energy Finance has estimated that, as of 2017, there are US$237 billion of coal assets at risk of being stranded, and the figure is growing. The picture is unhappy in another way: there were 56 TWh of wind and solar generation curtailed in 2016, awaiting interconnection of the rich renewables resources of China’s north with its principal consumption in the southeast. Reflecting this, few if any of the coal-rich provinces of the north have been authorized to build any more power generating plants or to expand or open coal mines.

Posted in Ørsted, Bloomberg New Energy Finance, CleanTechnica, energy, energy utilities, Tony Seba

“You say you love your children above all else, and yet you are stealing their future in front of their very eyes.”

Not much else needs to be said here. Hat tip to WaPo.

Thunberg accused leaders of speaking only about “green eternal economic growth because you are too scared of being unpopular.”

“You only talk about moving forward with the same bad ideas that got us into this mess even when the only sensible thing to do is pull the emergency brake,” she said. “You are not mature enough to tell it like it is.”


Posted in Anthropocene, bridge to nowhere, Carbon Worshipers, climate change, climate disruption, Hyper Anthropocene | Leave a comment

Plastics in the oceans!

From Woods Hole Oceanographic Institution:

(Click on image to see a larger figure, and use browser Back Button to return to blog.)

(Click on image to see a larger figure, and use browser Back Button to return to blog.)
Posted in marine debris, microplastics, oceanic eddies, oceanography, oceans, plastics, WHOI, Woods Hole Oceanographic Institution | Leave a comment

Cold Massachusetts Thanksgiving powered by Massachusetts solar

Update, 2018-12-15

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

Thanksgiving in 2018 was cold, but it was also sunny. That means the 150,000 solar installations in Massachusetts could deliver on their combined 2.7 GW promise, and they actually delivered 1.5 GW towards the 15.5 GW needed. Of course, production varied during the day. With additional solar and wind, and additional storage, generation could have been banked to offset the “duck curve” seen.

What’s striking is that, despite the availability of wind and solar resources, wind generation is, at times, curtailed by direction and order, since neither ISO-NE nor utilities have a way of routing excess power or storing it when they don’t need it. This has been identified as the surest sign of a grid which is falling behind the needs of modern distributed generation. Called Do-Not-Exceed orders, these are a vestige of an attitude of central command-and-control grid management, rather than planning and moving to a design where the grid more-or-less manages itself, based upon technical window-ahead signals and predictions. Note ISO-NE has championed a market system to achieve this. That is a form of feedback control system, but one having appreciable lags in response. If the system errs in its predictions, whether on demand or on supply, energy assets are wasted or there are shortfalls in provisioning. Such errors are why conventional grid managers put so much emphasis upon “dispatchable resources”.
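A toy illustration of the point about lags, with made-up numbers (this is a sketch of the general control problem, not a model of ISO-NE’s actual market):

```python
# Provisioning against a stale demand signal: a delay in the feedback
# loop converts even perfect matching into waste plus shortfall.
LAG = 5  # steps of delay between observing demand and dispatching supply

demand = [10.0 + (2.0 if 20 <= t < 40 else 0.0) for t in range(60)]  # GW
supply = [demand[max(t - LAG, 0)] for t in range(60)]  # match what was seen

shortfall = sum(max(d - s, 0.0) for d, s in zip(demand, supply))
waste = sum(max(s - d, 0.0) for d, s in zip(demand, supply))
print(f"Unmet demand: {shortfall:.0f} GW-steps; overprovision: {waste:.0f} GW-steps")
```

The lagged controller underprovisions on the way up and overprovisions on the way down, which is exactly the waste-or-shortfall tradeoff described above.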

People who can move predominantly off the grid, whether now or in the next 10 years, can insulate themselves from this kind of administrative fragility. Some can’t, and need to live with it, at least until the Massachusetts Department of Public Utilities and their Governor see the need to modernize.

Thanksgiving wasn’t the only banner day. Here was 21st April 2018:

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

Our own solar generation on those days looked as depicted in the following figures. Note there are two PV arrays reported, a 10 kW one and a 3.4 kW one. They of course both feed our home and the neighborhood grid but, as they were installed at different times, they are monitored separately. The 3.4 kW array wasn’t online yet on 21 April 2018.

The difference in generation intensity is primarily due to the length of the solar day and tree shading at low sun angles in Autumn. On Thanksgiving, many of our trees still had leaves on them. I’ve included a look at 2nd August for contrast.

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

Update, 2018-12-15

As @dumboldguy pointed out in a comment, while the contribution of renewables to Massachusetts electrical energy looks impressive in these figures, the “Y axis of the first graph begins at” 9 GW, not zero. This shows how far Massachusetts needs to go to get any appreciable amount of renewables for electricity. And, moreover, it also shows, as I responded, how silly it is to claim that renewables are destabilizing the grid in Massachusetts: They are barely making a dent.

Also, and something which @dumboldguy did not say, but I insist upon saying again: the Massachusetts public’s dislike of onshore wind essentially means they are opting for natural gas as an electricity-generating source and this, necessarily, means they are opting for the new pipelines that come with it. That’s because:

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

The differential in price between offshore and onshore wind, together with the price of natural gas being only slightly higher than onshore wind’s, means that, kWh for kWh, offshore wind won’t compete with natural gas any time soon. Here’s the Lazard Levelized Cost of Energy (unsubsidized) analysis from 2018:

(Click on image to see larger figure, and use your browser Back Button to return to reading blog.)

I have annotated it to point out the price of onshore versus offshore wind, and various gas generation prices. Note that solar is expensive (see top) without subsidies. Note, too, that because the grid is antiquated in Massachusetts, using wind and solar means relying upon peaking gas generation plants, if only for part of the time. Note how expensive they are, ignoring greenhouse gas effects.

Posted in American Solar Energy Society, Ørsted, Bloomberg New Energy Finance, clean disruption, CleanTechnica, decentralized electric power generation, decentralized energy, disruption, distributed generation, electrical energy storage, electricity markets, energy utilities, explosive methane, green tech, Green Tech Media, grid defection, ILSR, investment in wind and solar energy, ISO-NE, local generation, local self reliance, marginal energy sources, Mark Jacobson, Massachusetts Clean Energy Center, Mathematics and Climate Research Network, pipelines, solar democracy, solar domination, solar energy, solar power, SolarPV.tv, Sonnen community, Spaceship Earth, the energy of the people, the green century, Tony Seba, utility company death spiral, wind energy, zero carbon | 4 Comments

Remember …

Updated, 2018-12-15, 14:10 ET

Remember who it was that told you it would be okay, just fine, to continue to emit CO2 as we have been, despite over 50 years of science, with scientists from physicists to chemists to engineers and biologists saying it won’t be, it can’t be.

The AAAS recounts this. So does the American Chemical Society, and do you seriously think they are going to push some agenda when their members are employed predominantly by industry? Even if you think, “How can it be?”, go and discover.

When your wealth depletes, and your kids and grandkids suffer and even die, remember who told you it would be okay and who kept this once great country from doing something about this.

Hold them accountable. This is more important than nearly anything else.

There are people who say that climate change does not cause big disasters. I find that highly disingenuous.

Would you continue to play a card game, any game of chance, if the odds of your losing increased steadily over time, because the cards are biased or the chance device is weighted against you? That’s weather today.
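Here is a small Monte Carlo sketch of that loaded game. The starting odds and the per-round drift are purely illustrative numbers of mine, not derived from any climate dataset:

```python
# A game where the per-round chance of a bad outcome creeps upward.
import random

random.seed(42)
ROUNDS, DRIFT, START = 50, 0.005, 0.10

def mean_losses(trials=20_000):
    total = 0
    for _ in range(trials):
        p_lose = START
        for _ in range(ROUNDS):
            if random.random() < p_lose:
                total += 1
            p_lose = min(1.0, p_lose + DRIFT)  # the dice get heavier
    return total / trials

print(f"Drifting odds: about {mean_losses():.1f} losses in {ROUNDS} rounds")
print(f"Fixed odds:    about {ROUNDS * START:.1f} losses in {ROUNDS} rounds")
```

A drift that looks negligible in any single round roughly doubles the losses over the run. No sensible player stays at that table.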

Update, 2018-12-11, 17:40 ET

“There is extraordinary frustration,” a U.S. intelligence official said. The CIA and other agencies continue to devote enormous “time, energy and resources” to ensuring that accurate intelligence is delivered to Trump, the official said, but his seeming imperviousness to such material often renders “all of that a waste.”

How do people think the United States and international scientific communities feel about government and public responses to their repeated warnings?

Update, 2018-12-15, 14:10 ET

Influential people working to implement greenhouse gas mitigation continue to indulge in Magical Thinking. Unfortunately, a modest fee on Carbon is worse than none. It needs to be stiff enough to hurt and change behavior, beginning at a couple of hundred dollars per metric tonne of CO2. Also, although it is more difficult and costly to enforce, it would be better to apply this at the consumption end than at the source end. In any case, a source-end fee of $200/tonne would increase the price of gasoline in the United States by about US$1.80 per gallon, and natural gas by about US$11 per thousand cubic feet.
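The arithmetic checks out against standard combustion emission factors. The factors below are my inputs, of the kind published by EPA and EIA, not figures from the post itself:

```python
# Pass-through of a US$200/tonne CO2 fee to retail fuel prices.
FEE = 200.0  # US$ per metric tonne of CO2

KG_CO2_PER_GALLON_GASOLINE = 8.89   # typical combustion factor
KG_CO2_PER_MCF_NATURAL_GAS = 54.87  # per thousand cubic feet

print(f"Gasoline:    +${FEE * KG_CO2_PER_GALLON_GASOLINE / 1000:.2f} per gallon")
print(f"Natural gas: +${FEE * KG_CO2_PER_MCF_NATURAL_GAS / 1000:.2f} per thousand ft^3")
```

This prints about $1.78 per gallon and about $10.97 per thousand cubic feet, matching the figures above.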

Posted in American Association for the Advancement of Science, American Chemical Society, Anthropocene, climate change, ecological disruption, global warming, stranded assets, the right to be and act stupid, the tragedy of our present civilization | Leave a comment

Codium fragile, for 5th December 2018

Less frequently than I originally intended, but here’s today’s:

(Click image to see a larger figure, and use browser Back Button to return to blog.)
Posted in Akamai Technologies, economics, entrpreneurs, Tom Leighton | Leave a comment

Edward Gorey, mischievous artist, droll mirror of Life

Updated, 2018-12-05

I’m a fan of Edward Gorey’s work and life story. There is, today, a profile of Edward Gorey and his life by Joan Acocella at The New Yorker.

I’m a member of the Edward Gorey House, and just visited it with Claire. See photos below.

The Gorey Store has a lot of exquisite items for sale for the holidays, sure to bring a smile.

And I think most of them are great for kids. See, for evidence, the Jones and Ponton Killing Monsters.

My relation with Gorey’s work began in sophomore year of high school, 1968, 50 years ago, when I came across a small work of drawings on one of those crowded shelves in a bookstore in Harvard Square. I got the notion of getting a copy for my teacher of literature and debating coach. I did. He seemed delighted. Mr Gorey’s book reminded me of him.


Update, 2018-12-04

I, of course, have no direct experience of Edward Gorey, even to make a first impression. Docents at the Gorey House suggest Mr Gorey was shy or, if not shy, someone who thought “Why would anyone want to know me?” For comparison to Dery’s biography, there is the slim volume by Mr Gorey’s good friend Alexander Theroux, The Strange Case of Edward Gorey. I have little reason to doubt Ms Acocella’s remarks about the inconsistencies of Dery’s analysis of Mr Gorey. A senior docent at the Gorey House, which sells Mr Dery’s book in its gift shop, implied there were shortcomings, but was nevertheless appreciative that there was, at last, some biography.

While Ms Acocella’s review of Dery’s attempt might show a fondness for Mr Gorey, shortcomings are present in her treatment, too. She doesn’t mention Theroux at all. She doesn’t mention the existence of the Gorey House in Yarmouth Port, much less give it a plug. And she’s incomplete in her assessment of Mr Gorey’s bequests to “animal-welfare societies”, choosing to highlight Bat Conservation International. (That was too cute.) It was in The New Yorker, and she’s a major writer for them, so I presume Ms Acocella did have room to mention the contrast between Mr Gorey’s animal welfare interests and cats, and his (early?) fondness for raccoon coats. (I know, “It was another time ….”) A list of such societies is presented in the photo below, from the House.

I also disagree with Ms Acocella’s ready agreement that Mr Gorey had somehow “lost his talent”. Gorey continued to produce, to struggle to find expression, to be himself. Note the remark he made upon ink and papers:

I think it remiss, too, to omit that, minor as it might be, Mr Gorey has a small, quiet, cultish following, of which I consider myself a part. No doubt he’ll eventually be mythologized, like Tolkien, as regalers of any life are bound to do. It’s inevitable, since most details of a life never make it into the record. But it’s a way to continue Mr Gorey’s joy.


Update, 2018-12-05

The curator of the Edward Gorey House kindly recommended to me another review of Dery’s biography, this one by Evan Kindley at The New Republic. (I would gladly credit the curator by name, but I haven’t asked permission, so I don’t want to presume.) I had a read.

Its author has offered many interesting columns. I was drawn to his profile of Kurt Vonnegut’s years at General Electric, during the time Vonnegut wrote science fiction. While I respect Mr Vonnegut’s books and ideas (but not, I think, as much as my wife, Claire, does), the important thing is to know who is writing a review of a biography. Mr Edward Gorey was, to me, a vastly more important artist than, say, Mr Vonnegut. I’ll say why before the end. Kindley picked, for his choice of review, another’s book on Mr Vonnegut’s time at GE. It’s clear he thought that connection both curious, even exotic, given Mr Vonnegut’s later views, and formative. Accordingly, there is a notion of some homuncular model at work in Kindley’s head, perhaps of a preformatory artist. That’s relevant.

I like the Kindley review. It feels more honest, committed in some ways, than the view-from-afar of Acocella. But:

You can feel him pushing the limits of his chosen medium — the illustrated book — just as Stein and Queneau pushed the novel, Beckett the play, or Duchamp the painting … He is at once essentially limited and infinitely ambitious.

I don’t buy it. That’s a major puzzle-solver being described. Mr Gorey, and again I am no expert, seems to me more the essence of the genius, which is the child forever at play, walking down a beach, picking up a shell and getting all excited about it. Then, in an hour, or a day, becoming bored with it, and moving on. I think he’s more someone who erects a frame, builds a building, and tears the frame away — and, incidentally, some of the building — leaving it stable, but barely so, and also leaving its admirers wondering: how does it stand up?

I think Mr Kindley’s analysis of the Dery Gorey-was-gay proposition is spot on. I see it as a statistician: How can someone legitimately infer Mr Gorey’s interests there by simple association of friends? I’d wager the circles he encountered had a higher-than-average propensity of declared same-gender-preferring people, and, so, if he picked friends at random, that’s what he’d get. Or bisexual. Or queer. I think Mr Gorey’s own characterization should suffice. What did he really have to gain by suppressing such?

There are also minor quibbles:

  • What’s this No. 37 Penpoint thing? Is it a metaphor? Mr Gorey reported what he used: Hunt #204.
  • And regarding “Gorey’s characters often strike balletic poses and tend to stand with their feet turned out, in ballet positions”, which is actually a quote from Mr Dery’s biography, but Kindley seems to heartily agree, well, it’s (a) a stable way to stand, and (b) it is arguably a more interesting pose for a viewer to see a character, including leaving the character’s body language more open.

Why is Mr Gorey an important artist to me?

One of the poets my sophomore literature teacher (the Gorey booklet recipient) introduced was one Wallace Stevens. This would be a life-changing introduction, and Mr Stevens has always coupled me into thought and feeling closer than nearly any formal religion. I was brought up Catholic, including a thorough Catholic education. I turned pantheist, then agnostic, then converted to Judaism. I dwelt there for years, raising two sons in the tradition. I was intrigued by Buddhism, practiced being a Jew-Bu, and then I blasted out to where I felt most at home, an atheist, nay, physical materialist. It’s not that, for instance, Catholicism or Judaism were “wrong”. It is a path. I’m (now) happily affiliated with the Unitarian Universalist congregation of Needham, Massachusetts. (You need to know who’s writing this, too.)

A singular excerpt from one of Mr Stevens’ poems (The Idea of Order at Key West) goes:

Oh! Blessed rage for order, pale Ramon,
The maker’s rage to order words of the sea,
Words of the fragrant portals, dimly-starred,
And of ourselves and of our origins,
In ghostlier demarcations, keener sounds.

Now those are words to live by. And, I think, at core, Kindley didn’t miss this about Gorey when he observed what readers and viewers might think:

He put all that work into this?

It’s true, my guidewords hew closer to those of Milton who, while being fully critical, not praising, wrote (Paradise Lost):

… or if they list to try
Conjecture, he his Fabric of the Heav’ns
Hath left to thir disputes, perhaps to move
His laughter at thir quaint Opinions wide
Hereafter, when they come to model Heav’n
And calculate the Starrs …

In one way or another, those words, and to some extent, Stevens’ Idea, are the story of my personal life.

So, Ms Acocella quotes Edward Gorey near the start of her review, and Mr Kindley underscores in his:

I’m beginning to feel that if you create something, you’re killing a lot of other things. And the way I write, since I do leave out most of the connections, and very little is pinned down, I feel that I am doing a minimum of damage to other possibilities that might arise in a reader’s mind.

And that’s it. Vonnegut is less a lover of ambiguity. He doesn’t let it flow. His stories have a point. That’s a problem.

In the end, ambiguity is all we have. Whether it’s what’s left out in a story, or what a scientific calculation implies but does not say, or whether Mr Stevens is a poet or an insurance company executive, or whether Mr Gorey was a goth or not, these are unanswerable. (Well, nearly so: Gorey referred to the gothic as a costume, like his raccoon coat, as quoted by Mr Kindley.) No, not that. They should not be answered. For, art is, if anything, as the comic Gilda Radner said in a famous quote:

I wanted a perfect ending. Now I’ve learned, the hard way, that some poems don’t rhyme, and some stories don’t have a clear beginning, middle, and end. Life is about not knowing, having to change, taking the moment and making the best of it, without knowing what’s going to happen next.

Delicious Ambiguity.

So Mr Gorey reminds us with every page of sketches, every attempt at play. And we badly need reminding. It is for me, at least, a refuge, and a source of meaning.



Posted in Edward Gorey, Yarmouth Port | Leave a comment

Quick program note: Abandoning Github for my own code

I’ve abandoned Github to store my own code and have, instead, opted to simply dump it into a shared read-only Google Drive folder.

Too much trouble, and I don’t really need deep source control, even though Google offers it with its Google Drive, for free.

What really set me onto this choice was the apparent bias Atlassian Bitbucket has for retaining code-like material rather than datasets, and their 2 GB ceiling. I was willing to live with that, but then fell into the difficulty of trying to expunge some largish datasets and having to prune them from the Github history, all from Windows Sourcetree or, rather, the hobbled command-line support it offers.

I still use Sourcetree to grab and keep up with others’ Github repositories.

Posted in science | Leave a comment

The Climate Crunch

(with the possibility of rapid 15-20 foot SLR out there)

David Suzuki aptly calls the corner we’ve painted ourselves into “the climate crunch”.

See his article.

Why a “crunch”?

Had we heeded early warnings and had political representatives done more than talk, we likely could have addressed the problem with minimal societal disruption. But the industry-funded denial machine, which continues today, has been effective. Concern about climate change and other environmental issues has diminished as the problems have intensified. Politicians continue to think in terms of brief election cycles, focusing on short-term gains from exploiting fossil fuels rather than long-term benefits of conserving energy and shifting to cleaner sources.

Meanwhile, greenhouse gas emissions continue to rise and carbon sinks like forests and wetlands are still being destroyed. Even if we stopped using fossil fuels tomorrow, we’ve emitted so much carbon dioxide and other greenhouse gases that we wouldn’t be able to avert worsening of the consequences already happening. But we still have time — albeit very little — to ensure the problem doesn’t become catastrophic. The Intergovernmental Panel on Climate Change, which is conservative in its estimates, gives us about 12 years to take decisive action.

The thing is, circumstances are so bad now that fixing this will take large, industrial-scale measures, and be triply costly: (a) to make a rapid transition away from fossil fuels, (b) to adapt to the impacts that are ever increasing and weren’t anticipated to come this quickly, and (c) to remove Carbon Dioxide from the climate system so as to limit further deterioration.

Even those who accept the science and the urgency are, in my opinion, pursuing pipe dreams. Some think we can jettison capitalism and solve this. Some think we need to make environmental justice our primary constraint. Some think we can solve this by pursuing marketplace measures for solar energy (which includes wind). Some think we can protect all ecosystems while rolling out the measures we need to take to fix the situation.

It’s too late.

We need to do this fast. We don’t have a lot of time. The kind of future I see is one where the world as an economy does Carbon Dioxide removal as the central economic activity, akin to what building the tombs of pharaohs was for ancient Egypt. Corporations can and must exist because, frankly, we don’t have the centuries or decades available to create an alternative structure. Government planning doesn’t work. (Look at the administrative nightmares that are the U.S. EPA or the Army Corps of Engineers as described in Mary Christina Wood’s Nature’s Trust.) We need global-scale engineering and technical skills. We need capital.

Excerpt of API President Frank Ikard’s 1965 speech on climate change and fossil fuels. API is the American Petroleum Institute.

Quick take from Professor Richard Alley:

Full interview with Professor Alley:

Posted in adaptation, American Association for the Advancement of Science, American Petroleum Institute, an ignorant American public, an uncaring American public, Anthropocene, being carbon dioxide, bridge to somewhere, carbon dioxide, carbon dioxide capture, carbon dioxide sequestration, Carbon Worshipers, cement production, clear air capture of carbon dioxide, climate, climate change, climate disruption, climate economics, Cult of Carbon, David Suzuki, emissions, geoengineering, global warming, Hyper Anthropocene, James Hansen, klaus lackner, Wally Broecker | Leave a comment

Media treatment of the 4th National Climate Assessment

Regarding media treatment of the 4th National Climate Assessment:

(Updated, 29 Nov 2018)

The Fourth National Climate Assessment (NCA4) fulfills that mandate in two volumes. This report, Volume II, draws on the foundational science described in Volume I, the Climate Science Special Report (CSSR). Volume II focuses on the human welfare, societal, and environmental elements of climate change and variability for 10 regions and 18 national topics, with particular attention paid to observed and projected risks, impacts, consideration of risk reduction, and implications under different mitigation pathways. Where possible, NCA4 Volume II provides examples of actions underway in communities across the United States to reduce the risks associated with climate change, increase resilience, and improve livelihoods.

This assessment was written to help inform decision-makers, utility and natural resource managers, public health officials, emergency planners, and other stakeholders by providing a thorough examination of the effects of climate change on the United States.

Considering the collective effort and review put into preparing this report, complete with a review by the National Academies, and a public comment period, you would think digital and visual media would spend more time on it. But no. Well, at least that’s what I thought. Actually, print and online media didn’t do too badly.

I was alerted to this by Peter Sinclair’s blog Climate Denial Crock of the Week.

In contrast:

Moreover, perhaps because of blowback or second thoughts, AC360 on CNN did carry an interview with Hayhoe.

The Washington Post only treated the report as part of their continuing conversation regarding President Trump.

Commonwealth Magazine offered two op-ed pieces, one by Craig Altemose on a “Green New Deal” for Massachusetts, and the other by Eric Wilkinson on how Boston needs to do more on climate change. Both are excellent, but neither alluded to the National Climate Assessment. Rather they cited Massachusetts own evaluations of needs and risks. The Magazine, on the other hand, carried a story written by Bruce Mohl featuring, once again, Gordon Van Welie of ISO-NE about the challenges of running a New England-wide power grid over the next several years, and Dan Dolan of New England Power Generators Association lamenting the “existential crisis” that faces New England wholesale markets for electricity. Unlike past articles, neither came out in favor of expanding the role of natural gas. That’s being done, in part, by the governments of Maine and New Hampshire. Were Altemose and Wilkinson “balanced reporting”?

Then again, Van Welie and Dolan aren’t exactly Bernard McNamee, nominee to be the FERC Chair:

And that’s fortunate.

The Economist carries quite a few articles regarding climate change, its impacts, and its mitigation, but these are primarily from an international perspective. They hardly mentioned NCA4. However, there was this.

The Financial Times carried a brief news piece on the report, saying less about its contents than about the reactions of 45, Senator Sheldon Whitehouse, and May Boeve, Executive Director of 350.org.

I have already commented on how FiveThirtyEight covered the NCA4. Their parent, ABCNews, mentioned the report but principally focussed upon its being delivered from an Executive whose head immediately disparaged its findings.

“Let’s keep moving. This is just a bunch of papers about climate change.” (from The New Yorker)

Posted in journalism, science | Leave a comment

Comment on “How Much Does Climate Science Matter In A World Run By Politics?” (from FiveThirtyEight.com)

It’s odd that 538 only accepts comments from people with Facebook accounts, despite being associated with ABCNews, which has its own user accounting system. So I’m commenting here instead. #fivethirtyeight

Anyway, per this post, a recent article and podcast at 538 demonstrate there is a poor understanding regarding global warming, climate change, its consequences, and these assessments, even by educated Democrats. Taking the last first, the latest National Climate Assessment is the 4th, and it’s authorized and required by an act of Congress, once every 4 years. However, there are basically two volumes produced: an updated assessment of climate science and then, in the next year, an updated assessment of impacts. These reports are hardly produced in isolation: in addition to being compiled and written by a large team of scientists, they are each independently reviewed by the National Academies of Sciences, Engineering, and Medicine. Moreover, there is a comment period where the public can comment on the reports. Comments by the Academies and by the public are addressed by the team from the U.S. Global Change Research Program producing the reports, and these are available at the site.

All that said, there is also a misunderstanding about the scope of climate change. CO2 is not like most other pollutants, in that it has a very long life. That means it accumulates, and not only is the USA a major producer of CO2, it owns a substantial chunk of the accumulated emissions. Moreover, because of CO2’s long life and other physical aspects, such as 90% of the excess heat going into oceans, the trouble is that if we collectively stop emitting, we’ll keep damage and change from getting worse, but it won’t reverse, not for at least centuries. Moreover, there is a lag between the forcings and causes of additional energy and their manifestations as effects. This is a very sluggish system and, if we stop, it will keep getting worse for a decade or more. Some systems on Earth, like ice sheets, respond even more slowly. It’s agreed by many glaciologists, for example, that the West Antarctic Ice Sheet (WAIS) is doomed to collapse, even if that will take a couple of centuries to be realized.
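A minimal sketch of that persistence, using an illustrative multi-exponential impulse response of the kind fitted to carbon-cycle models. The weights and time constants below are stand-in values of mine, not authoritative numbers:

```python
# Fraction of an emitted CO2 pulse still airborne t years later.
import math

WEIGHTS = [0.22, 0.28, 0.28, 0.22]       # reservoir weights (sum to 1)
TAUS = [float("inf"), 300.0, 30.0, 5.0]  # years; inf = effectively permanent

def airborne_fraction(t):
    return sum(w * (1.0 if math.isinf(tau) else math.exp(-t / tau))
               for w, tau in zip(WEIGHTS, TAUS))

for t in (0, 10, 50, 100, 500, 1000):
    print(f"{t:5d} years after the pulse: {airborne_fraction(t):4.0%} still airborne")
```

Even a millennium after a pulse, a fifth or so remains in the air, which is why stopping emissions halts the worsening but does not undo it.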

To the question of why warming is bad: the historical record which, by now, is much better established than it was for NCA2 or even NCA3, shows humanity has never lived in a time when temperatures overall were this extreme. And it isn’t just temperature: it’s energy available to weather systems and moisture aloft that matter, not to mention things like loss of ice.

(Hat tip to The Economist which reprinted the graph from The Lancet.)

Also, because of temperatures and oceanic acidification, while primary productivity of oceans and forests may increase for a time, ultimately these will be limited and reverse. Experiments show that plants get used to having an abundance of CO2 and aren’t as effective sinks. There are some controlled experiments which even suggest forests and plantings could be net CO2 sources, if plant respiration exceeds rate of CO2 consumption. A lot depends upon the microbial mix in soils where plants grow, and this is sensitive to temperature, CO2 concentration in atmosphere, and available moisture. For example, arid conditions aren’t conducive to CO2 take-up. It is believed, too, that enhanced growth is limited by available Nitrogen.

And there are other impacts anticipated by the science as well, such as changes in oceanic circulation, which could have major consequences for regional weather and the distribution of moisture. The trouble with these kinds of perturbations is that they are beyond direct experience by people, even if there is really solid evidence they’ve happened before.

I think the posture of the present administration that the NCA is a report produced by some fringe group really is at odds with the process and its depth. It hardly is a surprise. It’s produced on a regular schedule. It’s possible for anyone to engage with it. And the emergent understanding available on climate change and global warming is breathtaking in depth as well as breadth: It’s understood by ecologists and biologists as well as geophysicists. Even doctors and epidemiologists are seeing its effects.

FiveThirtyEight‘s political podcast on this report missed a lot of these aspects. In that respect, their journalism was disappointing here.

By the way, to the claim of 45 that the United States is among the cleanest of countries on emissions, it just ain’t:

And, since cumulative emissions are what matter, the United States has a lot it’s responsible for:

But this doesn’t prevent 45 or Forbes, for that matter, pointing their fingers elsewhere:

Posted in Accuweather, American Association for the Advancement of Science, American Meteorological Association, an ignorant American public, an uncaring American public, anti-science, climate, climate change, climate disruption, denial, FiveThirtyEight, global warming | Leave a comment

Bad Science kills. When quality is repeatedly sacrificed for quantity, we all pay.

https://www.bbc.co.uk/programmes/m0001b1k (from 28th November 2018)

An episode of Richard Dawkins‘ “Trust Me, I’m a Scientist”.

Posted in American Association for the Advancement of Science, atheism, Boston Ethical Society, evidence, obfuscating data, science, Science magazine | Leave a comment

Not just having bad ideas, but because of deliberate ignorance despite overwhelming evidence, necessarily bad people

I’m afraid I need to agree with Krugman’s conclusion:

While Donald Trump is a prime example of the depravity of climate denial, this is an issue on which his whole party went over to the dark side years ago. Republicans don’t just have bad ideas; at this point, they are, necessarily, bad people.

There can be no excusing a systematic denial of reality, or of our single best means of understanding it, Science, no matter what the perceived economic consequences.

Understand, of course, I have no uncritical love for Democrats either, because they are not actual climate champions, and because they have simply assumed climate hawks, like myself, have no other choice than to support them, given the travesty that’s the Republican Party. Even Senator Elizabeth Warren supports paying people to live in high risk coastal areas and opposes properly assessing risk of re-flooding and damage.

Am I supposed to support her?

This is called denial. It’s a psychological condition.

And a providential warning …

Update, 2018-11-27

More reaction from Climate Denial Crock of the Week.

Tamino weighs in as well.

Posted in American Association for the Advancement of Science, an ignorant American public, an uncaring American public, Anthropocene, anti-science, Carl Sagan, climate, climate change, climate disruption, global warming, Hyper Anthropocene | 3 Comments