Tit-For-Tat in Repeated Prisoner’s Dilemma: President Donald Trump creates the Green New Deal

Jonathan Zasloff at Legal Planet offers “Donald Trump creates the Green New Deal”. The closing excerpt:

But what goes around comes around. A President Harris, or Warren, or Booker, etc. etc. can just as easily declare a National Emergency on Climate Change — one that would have a far better factual predicate than Trump’s patently false border emergency — and he or she will have a lot more money to move around. After all, a lot of the climate crisis is about infrastructure, and if the relevant statute allows the President to move money from one project to another, then it is very easy to do that. Or the $100 billion that DOD has for national security emergencies: given that both the Pentagon and the heads of the national intelligence agencies have already said that climate represents a serious national security challenge, it’s not a hard legal lift (assuming intellectually honest and consistent judges, which of course we cannot assume). This fund must be for a military purpose, and a smarter, more energy efficient energy grid could do the trick.

It’s no way to run a democracy. But Trump and the GOP have made it clear that they do not believe in democracy, and as Robert Axelrod demonstrated years ago in his classic book The Evolution of Cooperation, the best strategy in repeat-player games to facilitate cooperation is playing Tit-For-Tat.

See also Generous Tit-For-Tat.
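For readers who want to see the mechanics of the strategy Axelrod studied, here is a minimal sketch in R of an iterated Prisoner’s Dilemma with Tit-For-Tat, a generous variant, and an unconditional defector. The payoff values and the 10% forgiveness probability are illustrative assumptions only, not taken from Axelrod’s tournaments.

# Minimal iterated Prisoner's Dilemma sketch.
# Payoffs and the forgiveness probability are illustrative assumptions only.
payoffTo<- function(a, b)   # payoff to the player who chose `a`
{
  # "C" = cooperate, "D" = defect; the usual ordering T > R > P > S
  if (a == "C" && b == "C") 3
  else if (a == "C" && b == "D") 0
  else if (a == "D" && b == "C") 5
  else 1
}

titForTat<- function(opponentLast) if (is.na(opponentLast)) "C" else opponentLast

generousTitForTat<- function(opponentLast, forgive=0.1)
{
  if (is.na(opponentLast) || (opponentLast == "C")) "C"
  else if (runif(1) < forgive) "C"
  else "D"
}

alwaysDefect<- function(opponentLast) "D"

playMatch<- function(strategyA, strategyB, rounds=200)
{
  lastA<- NA ; lastB<- NA ; scoreA<- 0 ; scoreB<- 0
  for (k in seq_len(rounds))
  {
    a<- strategyA(lastB)
    b<- strategyB(lastA)
    scoreA<- scoreA + payoffTo(a, b)
    scoreB<- scoreB + payoffTo(b, a)
    lastA<- a
    lastB<- b
  }
  return(c(A=scoreA, B=scoreB))
}

playMatch(titForTat, alwaysDefect)       # Tit-For-Tat is exploited at most once
playMatch(generousTitForTat, titForTat)  # mutual cooperation throughout

The point of the sketch is only that retaliation is immediate and forgiveness is cheap, which is why the strategy sustains cooperation among repeat players.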

Update, 2019-02-18

Dan Farber writes on “National Security, Climate Change, and Emergency Declarations” at Legal Planet that:

If the Supreme Court upholds Trump, it will have to uphold an emergency declaration for climate change.

One reason why it would be hard for the Supreme Court to overturn a climate change declaration is that some attributes of climate change and immigration are similar. Both issues involve the country’s relations with the outside world, an area where presidential powers are strong. But it isn’t as if we suddenly found out about border crossings or climate change. Given these similarities, it would be very difficult for the conservative majority to explain why it was deferring to the President in one case but not the other.

The only major difference actually cuts strongly in favor of an emergency declaration for climate change: The U.S. government has already classified climate change as a serious threat to national security, and it is a threat that is getting stronger daily. Recent science indicates that climate action is even more urgent than we thought.

Trump’s stated justification in his proclamation is that “the problem of large-scale unlawful migration through the southern border is long-standing, and despite the executive branch’s exercise of existing statutory authorities, the situation has worsened in certain respects in recent years.” Climate change, too, is a “longstanding problem,” and it certainly has gotten worse despite the effort of the executive branch (Obama) to address the problem. Federal agencies, as well as Congress, have made it clear that climate is a serious threat to our nation.

Posted in climate change, game theory, global warming, Green New Deal | Leave a comment

“What’s new with recycling”

South Shore Recycling Cooperative Director Claire Galkowski spoke in Norwell, at the South Shore Natural Science Center, a couple of weeks ago:

Posted in Amory Lovins, Anthropocene, biofuels, Carbon Cycle, Claire Galkowski, coastal communities, Commonwealth of Massachusetts, EBC-NE, ecomodernism, ecopragmatist, education, extended producer responsibility, extended supply chains, green tech, greenhouse gases, local self reliance, Massachusetts, microplastics, paper, plastics, public health, quantitative ecology, recycling, science, solid waste, South Shore Recycling Cooperative, sustainability | Tagged | 1 Comment

“Is the Green New Deal’s ambition smart policy?”

Ann Carlson is the Shirley Shapiro Professor of Environmental Law and the co-Faculty Director of the Emmett Institute on Climate Change and the Environment at UCLA School of Law. Writing at Legal Planet, she takes on assessing the Green New Deal, admitting she is “conflicted about a proposal that seems untethered to what is actually achievable.” She begins:

At the heart of the Green New Deal — which demands slashing U.S. carbon emissions by 2030 by shifting to 100 percent clean energy — is a major conundrum. Even the most enthusiastic proponents of ambitious climate policy don’t believe the goals are achievable, technologically let alone politically. Stanford Professor Mark Z. Jacobson, for example, among the most ardent advocates for decarbonizing the electricity grid completely, believes that we can achieve 100 percent renewable energy by 2050, three decades after the Green New Deal’s target date. Ernie Moniz, the former Secretary of Energy under President Obama, laments that he “cannot see how we could possibly go to zero carbon in a 10-year time frame.” A number of columnists have noted that the Green New Deal will never become law because of its expense, its political impracticability and its technological infeasibility. And yet, the Green New Deal has attracted huge public support, the endorsement of all of the 2020 Democratic candidates for President, and a large number of Senators and members of Congress. It promises to mobilize a generation of young activists to work to solve the existential crisis of their lives.

Read on. She’s more optimistic than it sounds, and, I think, Professor Carlson is realistic.

I remarked in a comment:

I wish the GND proponents well, too, although I worry about a couple of things.

First, the comparison with other environmental programs, while inspiring, is a little inappropriate. There has never been a problem of this scale, nor one whose amplification is so thoroughly integrated with the daily comforts of affluent humans. Fossil fuels do have high energy densities, and that can be convenient.

Also, related to this, benefits do not accrue if we simply cease emitting. We have a timetable, but Nature will not scrub the harmful materials on any reasonable human timetable, and conditions at the moment we succeed at achieving zero emissions will persist for centuries. The alternative, artificial removal of atmospheric CO2, is horrifically expensive (multiples of 2014 Gross World Product at present prices), and pursuit of the technology has been explicitly rejected by GND proponents. (They’ve ruled out advanced nuclear technologies, too.)

Second, without policy which is “tethered to what is actually achievable”, GND suggests the bar is lower than it actually is and could, in itself, both present a moral hazard and make people think climate change is not being mitigated purely because of politics and greed. (This criticism is in bounds because GND proponents reject negative emissions technology precisely on the grounds that it, too, could be a kind of moral hazard.) Sure, politics and greed are involved, but it is also true that people don’t like the things a GND-style solution, or a Professor Mark Z Jacobson solution, entails. In my opinion, their choice is silly, but people are people.

Third, aspirational, engineering-free solutions to big, big problems are likely to founder, because they won’t assess and contain their own complications, particularly if they are rushed. Uncoordinated rollout of zero Carbon energy won’t only trash pieces of the grid, with repercussions for the less well off and people of color, but could also exacerbate climate conditions and regional weather. Large scale plantings, for example, of Jatropha curcas, thought to be a way of doing rapid CO2 drawdown while producing biodiesel oils, could change albedo in the wrong direction for the arid regions it loves, and, indeed, could do itself in if those same regions transform into tropics. Uncoordinated rollouts of wind farms will affect weather system energies. That’s no reason not to do it, but it needs to be studied and thought through.

Fourth, there is (still) a substantial education component needed, one done in a manner that avoids the impression climate change-fixing proponents are pulling their punches. For if byproducts of climate change are severe enough to move people into action, and get them to accept the sacrifices needed to act, then they probably will expect to see improvements once these changes are made. The science says that expectation is unreasonable, because of the inertia of the climate system and because the human emissions impact is a perturbation on a geological scale in a geological moment. The political ramifications of this realization are difficult to assess but could be damaging to the long term health of the collective project.

I did not mention other things, such as the intrinsic greenhouse gas emissions from agriculture, even if planting, harvesting, fertilization, transport, and processing are all decarbonized. Cement production is a big piece of emissions, too. The troubling thing is that GND doesn’t mention these: It focuses almost exclusively upon energy.

Update, 2019-02-11, 23:45 ET

Encouragement.

Posted in Anthropocene, anti-intellectualism, bollocks, bridge to somewhere, cement production, clear air capture of carbon dioxide, climate business, climate change, climate disruption, climate economics, climate education, global warming, Green New Deal, greenhouse gases, negative emissions, zero carbon | 1 Comment

From the YEARS Project: How Climate Impacts Mental Health (#climatefacts)

Dr Kate Marvel: “We need courage, not hope, to face Climate Change“.

Also the magnificent “We should never have called it Earth“, also from Dr Marvel.

In “Hope, despair and transformation: Climate change and the promotion of mental health and wellbeing“, Fritze, Blashki, Burke, and Wiseman [International Journal of Mental Health Systems, 2008, 2(13)] note in a section titled “Emotional distress arising from awareness of climate change as global environmental threat”:

The question that McKibben raises is how psychologically, emotionally and politically should we as human beings respond to this fundamental change in the relationship between the human species and the world we inhabit?
.
.
.
For many people, the resulting emotions are commonly distress and anxiety. People may feel scared, sad, depressed, numb, helpless and hopeless, frustrated or angry. Sometimes, if the information is too unsettling, and the solutions seem too difficult, people can cope by minimising or denying that there is a problem, or avoiding thinking about the problems. They may become desensitised, resigned, cynical, skeptical or fed up with the topic. The caution expressed by climate change skeptics could be a form of denial, where it involves minimising the weight of scientific evidence/consensus on the subject. Alternatively, it could indicate that they perceive the risks of change to be greater than the risks of not changing, for themselves or their interests …
.
.
.
Notwithstanding the enormity of the climate change challenge, we know what many of the solutions are, and there are many actions that citizens can take individually and collectively to make a difference at household, local, national and global level. When people have something to do to solve a problem, they are better able to move from despair and hopelessness to a sense of empowerment.

Fritze, et al. include a table from the Australian Psychological Society about how individuals can respond to the stress of being aware of climate change and its impacts:

Finally, there is the tongue-in-cheek yet serious work by Nye and Schwarzenegger:

Posted in American Association for the Advancement of Science, Arnold Schwarzennegger, attribution, Bill Nye, climate change, climate grief, global warming | 1 Comment

Alright! I’m tired of all this serious shtuff … It’s time for some CLIMATE ADAM!

Posted in Anthropocene, carbon dioxide, climate change, glaciers, global warming, Hyper Anthropocene, ice sheet dynamics, oceans, sea level rise | Leave a comment

Status of Solar PV in Massachusetts

From PV Magazine‘s John Weaver:

At Solar Power Northeast, the DOER of Massachusetts noted that with the mandated 400 MW of qualified projects program review upcoming, and heavy volume deployed in National Grid territory, there is strong consideration to expand and evolve the SMART program.

Posted in Amory Lovins, Bloomberg New Energy Finance, clean disruption, CleanTechnica, Commonwealth of Massachusetts, decentralized electric power generation, decentralized energy, distributed generation, investment in wind and solar energy, ISO-NE, Massachusetts, Massachusetts Clean Energy Center, solar democracy, solar domination, solar energy, solar power, sustainability, the energy of the people | Leave a comment

“Applications of Deep Learning to ocean data inference and subgrid parameterization”

This is another nail in the coffin of the claim I heard at last year’s Lorenz-Charney Symposium at MIT that machine learning methods would not make a serious contribution to advancements in the geophysical sciences.

T. Bolton, L. Zanna, “Applications of Deep Learning to ocean data inference and subgrid parameterization“, Journal of Advances in Modeling Earth Systems, 2019, 11.

Posted in American Meteorological Association, American Statistical Association, artificial intelligence, Azimuth Project, deep learning, deep recurrent neural networks, dynamical systems, geophysics, machine learning, Mathematics and Climate Research Network, National Center for Atmospheric Research, oceanography, oceans, science, stochastic algorithms | Leave a comment

The shelf-break front, fisheries, climate change, and finding things out

From Woods Hole Oceanographic Institution.

Support them.

Claire and I do.

Posted in biology, climate change, climate disruption, ecological disruption, ecological services, ecology, global warming, oceanography, oceans, quantitative biology, quantitative ecology, WHOI, Woods Hole Oceanographic Institution | Leave a comment

Wake up, Massachusetts! Especially, Green Massachusetts!

I’ve been looking over the set of bills proposed for the current Massachusetts legislative session. There are many of them, all dealing with aspects of greening energy supply and transport. And Governor Baker’s S.10 is very welcome. (By the way, I don’t see any counter-proposals from those who don’t like the Governor politically, so, I’d say, they have no right to complain.) Adaptation to climate in Massachusetts is a serious thing:

and there will be many uncomfortable choices we’ll be facing soon, both pocketbook choices and choices of social equity. Indeed, many of the bills have environmental justice and social justice aspects. I’m all for that, as long as these are put in perspective.

It’s 2019. While Massachusetts has a Global Warming Solutions Act, it’s far from perfect, setting a target as distant as 2050 and, even then, deliberately excluding whole classes of emissions, such as waste-to-energy facilities. Even accepting it as a great goal, and even if the impacts upon Massachusetts are controlled by many and varied parties all over the world, the Commonwealth currently has no believable roadmap for achieving those goals, which are, after all, a law. This is especially true for transportation and for heating of homes. The world’s bullseye for containing emissions — a long shot — is 2030. Some say even that’s too late, given we’ve made so little progress, and governments and communities are faced with buying fossil fuel infrastructure and retiring it early, well ahead of the end of its depreciation lifetime.

All the evidence year after year is that the rate of impact from climate change is accelerating. What Massachusetts faces is the discomfort and significant cost of purchasing homes — at a substantial loss to their owners, and loss in tax base for their towns — on the coasts and inland which are too risky for their inhabitants, their towns, and the Commonwealth to permit their owners to continue to live there. This is called managed retreat (see also). And I see nothing, other than S.10, which begins to address this. And S.10 is modest.

I also don’t see on the energy side a developed appreciation for What’s Happening Out There. Climate change is important. It is the issue. Environmental justice or not, social justice or not, if this problem is not solved, none of the progress that has been made in 150 years of social advancement will matter: “All the good you’ve done, all the good you can imagine doing will be wiped out, just wiped out ….” (Van Jones). But, and these aspects are good, that’s not the only dynamic for which Massachusetts needs to plan.

Have you looked at solar and wind costs to generate a kWh of electricity recently?

They are tearing through the floor, especially onshore wind, soon to be followed by solar. Why? Because Mr Market is seeing that their plummeting costs are not fantasies — Forbes writes about this all the time these days — they are the result of a differentiating technology, and that, yeah, there’s a pony in the barn. Solar and wind, supported by and supporting expanding energy storage, are going to Eat the Lunch of everyone in the energy industry. And this is happening with the fiercest antagonist to these technologies occupying the United States White House, with many supporting opponents numbered among the Republicans of Congress. Imagine what they will do with tailwinds.

But, there’s a problem. Massachusetts residents do not like to live near wind turbines or even large solar farms. Some complain that solar farms cause leveling of new growth forest — even if new growth forest does little or nothing to sequester CO2 — and impact habitat. And they just don’t like the looks. Massachusetts residents who say these things are really complaining about the low energy density per unit area which solar and wind have. That’s true. Fossil fuels have a high energy density. Nuclear power has a high energy density. Hydropower has a reasonably high energy density, but you can’t just find it anywhere. If you want to supply energy needs with wind and solar, you need a lot of land. Massachusetts isn’t a big state. Accordingly, if you want to supply energy needs with wind and solar, you need to build them close to where people live. That’s better, in fact, because then you don’t need to run ecosystem-destroying transmission lines through forests.

If this is unacceptable, and you don’t want CO2 emissions, there is no choice but nuclear power or hydropower. As I noted, there’s only so much hydropower, and there needs to be cooperation from the people who live in the states the transmission lines need to cut through in order to get access to it.

Nuclear power, as presently practiced, has a large cost problem. There are measures being pursued to fix that, but it’s not clear how soon these will be available. We need nuclear power that’s modular, with small units that can be combined into arbitrary sizes, that can be toggled on and off as needed, that’s air-cooled, and where each of the units is portable. We need nuclear power in commodity chunks. The industry chose not to do that in the 1960s and has suffered with that choice ever since. Modular units can just be trucked away intact if they are broken or need their wastes scrubbed. If a unit fails, generation doesn’t all go down, because there are many more companions generating. Needing cooling water is an ecological and climate problem — many reactors must go offline if their nearby cooling rivers dry up in droughts — so air cooling is a natural response.

But nuclear power isn’t popular.

Facts are, unless Massachusetts residents opt for onshore wind turbines and big solar, both backed by substantial storage, all located near residentially zoned areas, they are going to end up with natural gas as their energy supply. It’s dense. It can be hidden.

But, if they do, the future of Massachusetts not only lacks a clean energy future, it also faces the future of a rustbelt. That’s because natural gas will eventually be the most expensive energy source. Coal and oil will be long gone. Conventional nuclear power is too expensive even now, because it suffers from a negative learning curve. Everyone will be using wind and solar, backed in places by storage, but as everyone adopts these, the storage will be needed less and less.

What will be Massachusetts’ fate?

With expensive electrical energy, companies will not want to do business in Massachusetts, not only because their energy supply isn’t clean (an increasingly important criterion over time, due to shareholders and customers) but because it will be the most expensive energy anywhere. It will get worse. The companies supplying Massachusetts don’t live in isolation. Selling natural gas anywhere will become more and more difficult, and some, and eventually all, of those companies will go bankrupt. To maintain energy supply, Massachusetts will need to buy those assets and run them, perhaps by giving them to someone else to run, but this will be expensive, and this will go on the tax base. That will be an additional disincentive for companies to build and work in Massachusetts, and for people to live in Massachusetts.

In addition, there will be the inevitable costs and charges from climate change. Over these Massachusetts does not have complete control, but to the degree it doesn’t champion means for zeroing emissions and using 100% zero Carbon energy, it will stifle its significant voice encouraging others that this is a feasible model. That voice can do more to nudge the rest of the world in the zero Carbon direction, much more than anything Massachusetts will do by zeroing its own emissions. These costs will ultimately fall on the Commonwealth’s books and, so, upon the taxpayers, whether they like it or not, and whether or not the ability of the Commonwealth to pay is supposedly constrained by law. Solvency is a powerful reason for overturning laws.

So, from what I see, either Massachusetts residents learn to live next to onshore wind and big solar farms, or they choose new nuclear power — and we don’t know how long that’ll take — or they choose natural gas, with the economic downsides I have just described.

I don’t think many in the progressive and environmental movements in Massachusetts have thought about these tradeoffs. They somehow think demand can be reduced so these tradeoffs are not necessary. They are not thinking quantitatively, or, for that matter, factually. It appears to me many of them have an agenda to pursue, and evidence just gets in the way. This is not serving the Commonwealth.

Climate reality is an elixir which exposes the truth. Whether it’s Thwaites Glacier or the slowdown of the Gulf Stream, or excessive precipitation, Massachusetts will need to deal with these.

Fortunately, should Massachusetts residents change their minds, onshore wind turbines are very easy and inexpensive to construct, as are big solar farms. And flooded properties are cheap to buy up.

What kind of future do you want, Massachusetts? Do you want to plan, and help it be a good one? Or do you want to bury your head in the ever eroding sand?

“Climate change is coming for your assets”

Posted in American Association for the Advancement of Science, Anthropocene, Cape Wind, climate business, climate change, climate disruption, coastal communities, Commonwealth of Massachusetts, decentralized energy, electric vehicles, electrical energy storage, electricity markets, emissions, fossil fuels, global warming, Governor Charlie Baker, Hyper Anthropocene, ice sheet dynamics, investment in wind and solar energy, leaving fossil fuels in the ground, sea level rise, seawalls, solar domination, solar energy, solar power, the value of financial assets, wind energy, wind power, wishful environmentalism, zero carbon | Leave a comment

Repeating Bullshit

Yeah, how much was it?

And was it different? I mean, not based on how Curry or Tisdale feel, but by the numbers.

Open Mind

Question: How does a dumb claim go from just a dumb claim, to accepted canon by the climate change denialati?

Answer: Repetition.

Yes, keep repeating it. If it’s contradicted by evidence, ignore that or insult that. Repeat it again. If you’re asked for evidence, ignore that or insult that, just keep repeating it. That’s how things get burned into brains.

View original post 539 more words

Posted in American Statistical Association, anomaly detection, changepoint detection, climate change, Grant Foster, Mathematics and Climate Research Network, maths, science, statistics, Tamino, time series, unreason | Leave a comment

Stream flow and P-splines: Using built-in estimates for smoothing

Mother Brook in Dedham, Massachusetts was the first man-made canal in the United States. Dug in 1639, it connects the Charles River at Dedham to the Neponset River in the Hyde Park section of Boston. It was originally an important source of water for Dedham’s mills. Today it serves as an important tool for flood control on the Charles River.


Like several major river features, Mother Brook is monitored by gauges of flow maintained by the U.S. Geological Survey, with careful eyes kept on their data flows by both agencies of the Commonwealth of Massachusetts, like its Division of Ecological Restoration, and by interested private organizations, like the Neponset River Watershed Association and the Charles River Watershed Association. (I am a member of the Neponset River Watershed Association.) The data from these gauges are publicly available.

Such a dataset is a good basis for talking about a non-parametric time series smoothing technique using P-splines (penalized B-splines), an example of local regression, and taking advantage of the pspline package to do it. Since this, like most local regression techniques, demands a choice of a smoothing parameter, this post strongly advocates for pspline as a canonical technique because:

  • it features a built-in facility for choosing the smoothing parameter, one based upon generalized cross validation,
  • like loess and unlike lowess in R, it permits multiple response vectors and fits all of them simultaneously, and
  • with the appropriate choice in its norder parameter, it permits the estimation of derivatives of the fitted curve as well as the curve itself.

Finally, note that while residuals are not provided directly, they are easy to calculate, as will be shown here.
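As a concrete illustration of that point, a minimal sketch, using a small synthetic series rather than the Mother Brook data:

library(pspline)

# Synthetic, NA-free data with the covariate already in ascending order.
x<- seq(0, 10, length.out=200)
y<- sin(x) + rnorm(length(x), sd=0.2)

# method=3 selects the smoothing parameter by generalized cross validation.
fit<- smooth.Pspline(x=x, y=y, norder=2, method=3)

# Residuals are not returned directly, but fall out of the fitted values:
R<- y - fit$ysmth
summary(as.vector(R))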

In fairness, note that loess allows an R formula interface, but both smooth.Pspline and lowess do not. Also, smooth.Pspline is:

  • intolerant of NA values, and
  • demands the covariates each be in ascending order (a minimal preprocessing sketch follows just below).
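A minimal preprocessing sketch addressing both points, assuming a data frame named flows with columns datetime and gauge (hypothetical names, standing in for whatever the USGS download provides):

library(pspline)

# Hypothetical data frame `flows`, possibly containing NAs and not
# necessarily sorted by time.
keep<- which(!is.na(flows$gauge))           # smooth.Pspline is intolerant of NA values
keep<- keep[order(flows$datetime[keep])]    # and wants the covariate in ascending order

fit<- smooth.Pspline(x=as.numeric(flows$datetime[keep]),
                     y=flows$gauge[keep],
                     norder=2, method=3)    # method=3: generalized cross validation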
Note from 2019-01-30

Note that the lack of support in the pspline package for the multivariate case has, so to speak, thrown down the gauntlet to find a replacement. Since I’m the one who, in the moment, is complaining the loudest, the responsibility falls to me. So, accordingly, I commit to devising a suitable replacement. I don’t feel constrained by the P-spline approach or package, although I think it foolish not to use it if possible. Such a facility will be the subject of a future blog post. Also, I’m a little joyful because this will permit me to reacquaint myself with some of the current FORTRAN language definition, using the vehicle of Simply Fortran, and its calling from R. This is sentimental, since my first programming language was FORTRAN IV on an IBM 1620.

References

For completeness, consider the AdaptFit package and related SemiPar package which also offer penalized spline smoothing but are limited in their support for multiple responses.

(Update, 2019-01-29)

I re-encountered this paper by Professor Michael Mann from 2004 which addresses many of these issues:

Incidentally, Professor Mann is in part responding to a paper by Soon, Legates, and Baliunas (2004) criticizing estimators of long term temperature trends. The Dr Soon of that trio is the famous one from the Heartland Institute who has been mentioned at this blog before.

The dataset

What does stream flow on Mother Brook look like? Here’s eight years of it:

(Click on image for a larger figure, and use browser Back Button to return to blog.)

Smoothing with P-splines, Generalized Cross Validation

Using a cubic spline model, the package pspline finds that a smoothing parameter (“spar“) of 0.007 is best, giving a Standard Error of the Estimate (“SEE”) of 0.021:

(Click on image for a larger figure, and use browser Back Button to return to blog.)

Forcing the spline fit to use larger spar values, one of 0.5 and one of 0.7, produces worse fits. This can also be seen in their larger G.C.V. criteria, of 228 and of 237, compared with the automatic 185:

(Click on image for a larger figure, and use browser Back Button to return to blog.)

(Click on image for a larger figure, and use browser Back Button to return to blog.)

Code

The code for generating these results is shown below.


#
# Mother Brook, P-spline smoothing, with automatic parameter selection.
# Jan Galkowski, bayesianlogic.1@gmail.com, 27th January 2019.
# Last changed 28th January 2019.
#

library(random)   # For external source of random numbers
library(FRACTION) # For is.wholenumber
library(tseries)  # For tsbootstrap
library(pspline)

source("c:/builds/R/plottableSVG.R")

randomizeSeed<- function(external=FALSE)
{
  #set.seed(31415)
  # Futz with the random seed
  if (!external)
  {
    E<- proc.time()["elapsed"]
    names(E)<- NULL
    rf<- E - trunc(E)
    set.seed(round(10000*rf))
  } else
  {
    set.seed(randomNumbers(n=1, min=1, max=10000, col=1, base=10, check=TRUE))
  }
  return( sample.int(2000000, size=sample.int(2000, size=1), replace=TRUE)[1] )
}

wonkyRandom<- randomizeSeed(external=TRUE)

stopifnot( exists("MotherBrookDedham") )

seFromPspline<- function(psplineFittingObject, originalResponses, nb=1000, b=NA)
{
  stopifnot( "ysmth" %in% names(psplineFittingObject) )
  #
  ysmth<- psplineFittingObject$ysmth
  #
  if (is.null(dim(originalResponses)))
  {
    N<- length(which(!is.na(ysmth)))
    stopifnot( length(originalResponses) == N )
  } else
  {
    stopifnot( all( dim(originalResponses) == dim(ysmth) ) )
    N<- nrow(ysmth)
  }
  #
  if (is.na(b))
  {
    b<- round(N/3)
  } else
  {
    stopifnot( is.wholenumber(b) && (4 < b) && ((N/100) < b) )
  }
  #
  R<- originalResponses - ysmth
  #
  # Don't assume errors are not correlated. Use the Politis and Romano stationary
  # bootstrap to obtain estimates of standard deviation(s) and Mean Absolute Deviation(s), 
  # where these are plural if there is more than one response.
  #
  # The standard error of the estimate is then just adjusted for the number of non-NA
  # observations.
  #
  if (is.null(dim(originalResponses)))
  {
    Ny<- 1
    booted.sd<- tsbootstrap(x=R, nb=nb, statistic=function(x) sd(x, na.rm=TRUE), m=1, b=b, type="stationary")
    SD<- mean(booted.sd$statistic)
    SEE<- SD/sqrt(N)
    booted.mad<- tsbootstrap(x=R, nb=nb, statistic=function(x) mad(x, constant=1, na.rm=TRUE), m=1, b=b, type="stationary")
    MAD<- mean(booted.mad$statistic)
  } else
  {
    Ny<- ncol(ysmth)
    SD<- rep(NA, Ny)
    SEE<- rep(NA, Ny)
    MAD<- rep(NA, Ny)
    for (j in (1:Ny))
    {
      nonNA<- which(!is.na(R[,j]))
      booted.sd<- tsbootstrap(x=R[nonNA,j], nb=nb, statistic=function(x) sd(x, na.rm=TRUE), m=1, b=b, type="stationary")
      SD[j]<- mean(booted.sd$statistic)
      SEE[j]<- SD[j]/sqrt(length(nonNA))
      booted.mad<- tsbootstrap(x=R[nonNA,j], nb=nb, statistic=function(x) mad(x, constant=1, na.rm=TRUE), m=1, b=b, type="stationary")
      MAD[j]<- mean(booted.mad$statistic)
    }
  }
  return(list(multivariate.response=!is.null(dim(originalResponses)), number.of.responses=Ny,
              SD=SD, MAD=MAD, SEE=SEE))
}

MotherBrookDedham.nonNA<- which(!is.na(MotherBrookDedham$gauge))
# Note method == 3 is Generalized Cross Validation (Craven and Wahba, 1979), and
# the value of spar is an initial estimate. The choice of norder == 2 is arbitrary.
MotherBrookDedham.fitting<- smooth.Pspline( x=MotherBrookDedham.nonNA, y=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], 
                                            norder=2, spar=0.3, method=3)
# Using 90 days as mean block length, about a quarter of a year
MotherBrookDedham.estimate.bounds<- seFromPspline(psplineFittingObject=MotherBrookDedham.fitting, 
                                                  originalResponses=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], nb=1000, b=91)

fx<- openSVG(root="MotherBrookDedham-RawFlowData-Daily-withSmooth", width=24, height=round(24/2), pointsize=8)

plot(MotherBrookDedham$gauge, type="n", xaxt="n", ylab="mean (over day) cubic feet per second", main="",
     xlab="", cex.lab=2, cex.axis=2, ylim=c(-80, 650))
title(main=sprintf("Raw flow data, Mother Brook at Dedham, agency %s, site %s, fit with cubic smoothing spline", 
      MotherBrookDedham$agency_cd[1], MotherBrookDedham$site_no[1]), 
      cex.main=3, font.main=2, family="Times")     
N<- nrow(MotherBrookDedham)
S<- seq(1, N, 30)
axis(side=1, at=S, line=-13, labels=MotherBrookDedham$datetime[S], las=2, cex.axis=2, font.axis=2, cex.lab=1.5, tick=FALSE)
abline(v=S, lty=6, col="grey")
points(1:N, MotherBrookDedham$gauge, pch=21, cex=1.2, col="blue", bg="blue")
lines(MotherBrookDedham.nonNA, MotherBrookDedham.fitting$ysmth, lwd=1, lty=1, col="green")
text(which.max(MotherBrookDedham.fitting$ysmth), max(MotherBrookDedham.fitting$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("Found smoothing SPAR = %.3f, and G.C.V. value =  %.1f", 
                                   MotherBrookDedham.fitting$spar, MotherBrookDedham.fitting$gcv), family="Helvetica")
text(which.max(MotherBrookDedham.fitting$ysmth), 0.95*max(MotherBrookDedham.fitting$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("SD = %.3f, MAD = %.3f, SEE = %.3f", 
                                   MotherBrookDedham.estimate.bounds$SD, MotherBrookDedham.estimate.bounds$MAD, 
                                   MotherBrookDedham.estimate.bounds$SEE), family="Helvetica")
closeSVG(fx)

# Force the same P-spline to use an arbitrary smoother SPAR by electing method == 1, and setting SPAR = 0.5.
MotherBrookDedham.fitting.p5<- smooth.Pspline( x=MotherBrookDedham.nonNA, y=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], 
                                            norder=2, spar=0.5, method=1)
# Using 90 days as mean block length, about a quarter of a year
MotherBrookDedham.estimate.bounds.p5<- seFromPspline(psplineFittingObject=MotherBrookDedham.fitting.p5, 
                                                  originalResponses=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], nb=1000, b=91)

fx<- openSVG(root="MotherBrookDedham-RawFlowData-Daily-withSmooth-with-SPARp5", width=24, height=round(24/2), pointsize=8)

plot(MotherBrookDedham$gauge, type="n", xaxt="n", ylab="mean (over day) cubic feet per second", main="",
     xlab="", cex.lab=2, cex.axis=2, ylim=c(-80, 650))
title(main=sprintf("Raw flow data, Mother Brook at Dedham, agency %s, site %s, fit with cubic smoothing spline", 
      MotherBrookDedham$agency_cd[1], MotherBrookDedham$site_no[1]), 
      cex.main=3, font.main=2, family="Times")     
N<- nrow(MotherBrookDedham)
S<- seq(1, N, 30)
axis(side=1, at=S, line=-13, labels=MotherBrookDedham$datetime[S], las=2, cex.axis=2, font.axis=2, cex.lab=1.5, tick=FALSE)
abline(v=S, lty=6, col="grey")
points(1:N, MotherBrookDedham$gauge, pch=21, cex=1.2, col="blue", bg="blue")
lines(MotherBrookDedham.nonNA, MotherBrookDedham.fitting.p5$ysmth, lwd=1, lty=1, col="green")
text(which.max(MotherBrookDedham.fitting.p5$ysmth), max(MotherBrookDedham.fitting.p5$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("Found smoothing SPAR = %.3f, and G.C.V. value =  %.1f", 
                                   MotherBrookDedham.fitting.p5$spar, MotherBrookDedham.fitting.p5$gcv), family="Helvetica")
text(which.max(MotherBrookDedham.fitting.p5$ysmth), 0.95*max(MotherBrookDedham.fitting.p5$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("SD = %.3f, MAD = %.3f, SEE = %.3f", 
                                   MotherBrookDedham.estimate.bounds.p5$SD, MotherBrookDedham.estimate.bounds.p5$MAD, 
                                   MotherBrookDedham.estimate.bounds.p5$SEE), family="Helvetica")
closeSVG(fx)

# Force the same P-spline to use an arbitrary smoother SPAR by electing method == 1, and setting SPAR = 0.7.
MotherBrookDedham.fitting.p7<- smooth.Pspline( x=MotherBrookDedham.nonNA, y=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], 
                                            norder=2, spar=0.7, method=1)
# Using 90 days as mean block length, about a quarter of a year
MotherBrookDedham.estimate.bounds.p7<- seFromPspline(psplineFittingObject=MotherBrookDedham.fitting.p7, 
                                                  originalResponses=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], nb=1000, b=91)

fx<- openSVG(root="MotherBrookDedham-RawFlowData-Daily-withSmooth-with-SPARp7", width=24, height=round(24/2), pointsize=8)

plot(MotherBrookDedham$gauge, type="n", xaxt="n", ylab="mean (over day) cubic feet per second", main="",
     xlab="", cex.lab=2, cex.axis=2, ylim=c(-80, 650))
title(main=sprintf("Raw flow data, Mother Brook at Dedham, agency %s, site %s, fit with cubic smoothing spline", 
      MotherBrookDedham$agency_cd[1], MotherBrookDedham$site_no[1]), 
      cex.main=3, font.main=2, family="Times")     
N<- nrow(MotherBrookDedham)
S<- seq(1, N, 30)
axis(side=1, at=S, line=-13, labels=MotherBrookDedham$datetime[S], las=2, cex.axis=2, font.axis=2, cex.lab=1.5, tick=FALSE)
abline(v=S, lty=6, col="grey")
points(1:N, MotherBrookDedham$gauge, pch=21, cex=1.2, col="blue", bg="blue")
lines(MotherBrookDedham.nonNA, MotherBrookDedham.fitting.p7$ysmth, lwd=1, lty=1, col="green")
text(which.max(MotherBrookDedham.fitting.p7$ysmth), max(MotherBrookDedham.fitting.p7$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("Found smoothing SPAR = %.3f, and G.C.V. value =  %.1f", 
                                   MotherBrookDedham.fitting.p7$spar, MotherBrookDedham.fitting.p7$gcv), family="Helvetica")
text(which.max(MotherBrookDedham.fitting.p7$ysmth), 0.95*max(MotherBrookDedham.fitting.p7$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("SD = %.3f, MAD = %.3f, SEE = %.3f", 
                                   MotherBrookDedham.estimate.bounds.p7$SD, MotherBrookDedham.estimate.bounds.p7$MAD, 
                                   MotherBrookDedham.estimate.bounds.p7$SEE), family="Helvetica")
closeSVG(fx)

The code is available online here and requires a utility from here.

So, what’s the point?

Having a spline model for data like these actually offers a lot. First, the estimates of SEE and MAD give some idea of how accurate prediction using the model might be. With eight years of data, such models are in hand.

Also, having a spline model is the basis for detecting changes in stream flow rates over time. Mother Brook might not be the best example of long run stream flow rates, since the Army Corps can change their policies in how they manage it, but the same kinds of flow time series are available for many other flows in the region.

To the point about changes in flow rates, having a spline model permits estimating derivatives which, in this case, are exactly those rates of change.
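For instance, continuing from the fit object built in the code above, a sketch of pulling out the first derivative with the package’s predict method (this assumes the norder used in the fit is high enough to support the requested derivative):

# Continuing from MotherBrookDedham.fitting above. predict.smooth.Pspline
# accepts an nderiv argument, so the first derivative of the smoothed flow --
# the rate of change, in cubic feet per second per day -- comes out directly.
flowRateOfChange<- predict(MotherBrookDedham.fitting, xarg=MotherBrookDedham.nonNA, nderiv=1)

plot(MotherBrookDedham.nonNA, flowRateOfChange, type="l", col="darkgreen",
     xlab="day index", ylab="estimated d(flow)/dt")
abline(h=0, lty=2, col="grey")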

Moving on, once several such flows have been modeled using splines, these can serve as the basis for various kinds of regressions, whether on the response side or on the covariates side. For example, is there statistical evidence for a link between stream flows and temperature? The Clausius-Clapeyron relation suggests there should be, at least at the regional and global scale. It would be interesting to examine if it can be seen here.
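For reference, the relation in question, written for the saturation vapor pressure e_s, with L_v the latent heat of vaporization and R_v the specific gas constant for water vapor, is

\frac{d e_s}{d T} = \frac{L_v\, e_s}{R_v\, T^2},

which, near surface temperatures, works out to roughly a 6–7% increase in the atmosphere’s water vapor capacity per kelvin of warming.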

To me, it would be also interesting to see if some of the riverine connections in the region could be inferred from examination of flow rates alone. Downstream flows see a pulse of water from precipitation and melt, but their pulses are lagged with respect to earlier ones upstream. Sure, one could examine such connections simply by looking at a map, or Google Earth, but there are other hydrological applications where these connections are latent. In particular, connections between subterranean water sources and surface flows might be revealed if these kinds of inferences are applied to them.
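A cheap first pass at that kind of inference is the sample cross-correlation of two smoothed series; the lag at which it peaks hints at the travel time between gauges. A sketch, assuming upstreamFit and downstreamFit are smooth.Pspline objects (hypothetical names) for two gauges on a common daily index:

# Cross-correlate two smoothed gauge series; the lag of the peak correlation
# hints at how long a precipitation pulse takes to travel between them.
up<- as.vector(upstreamFit$ysmth)
down<- as.vector(downstreamFit$ysmth)

cc<- ccf(up, down, lag.max=30, plot=FALSE)
bestLag<- as.integer(cc$lag[which.max(cc$acf)])
cat(sprintf("Peak cross-correlation %.2f at a lag of %d days\n", max(cc$acf), bestLag))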

(Update, 2019-01-29)

The scholarly literature, such as the paper by Professor Mann cited above which critiques and explains that by Soon, Legates, and Baliunas (2004), shows that careful consideration of these techniques matters.


Posted in American Statistical Association, citizen data, citizen science, Clausius-Clapeyron equation, Commonwealth of Massachusetts, cross-validation, data science, dependent data, descriptive statistics, dynamic linear models, empirical likelihood, environment, flooding, floods, Grant Foster, hydrology, likelihood-free, meteorological models, model-free forecasting, non-mechanistic modeling, non-parametric, non-parametric model, non-parametric statistics, numerical algorithms, precipitation, quantitative ecology, statistical dependence, statistical series, stream flow, Tamino, the bootstrap, time series, water vapor | 2 Comments

50,000+ golf balls, along a coast

KQED carried a story about free diver and 16 y.o. Alex Weber who discovered not only a new source of plastic pollution, but another testament to the casual, careless sloppiness of people.


And Ms Weber has converted it into a crusade against marine pollution, and a technical article in a scientific publication. Writing with Professor Matt Savoca of Stanford University, Weber and her dad, Michael Weber, also a co-author of that paper, found over 50,000 balls just offshore of a California golf course, with new ones arriving every day. See her golf ball project page.


A number of the balls are in usable condition:


Quoting from the Conclusion of their article:

In central California, the Pebble Beach Golf Links host 62,000 rounds of golf per year and has been in operation since 1919 (Dunbar, 2018). The average golfer loses 1–3 balls per round (Hansson and Persson, 2012), which implies that between 62,000 and 186,000 golf balls are lost to the environment each year at the Pebble Beach Golf Links. This translates to 3.14–9.42 tons of debris annually. While a portion of these balls is lost to non-oceanic regions adjacent to the course, the coast and intertidal environments still have a high likelihood of accumulating mishit balls. Using a conservative estimate of 10,000–50,000 balls lost to sea annually gives a range of 1–5 million golf balls lost to the coastal environment during the century that this course has been in operation. These projected numbers indicate that this issue has been overlooked for decades.

I salute Ms Weber, her dad, and Professor Savoca. And look forward to reading their paper.


Update, 27th January 2019

Accolades to authors Weber, Weber, and Savoca, and collection colleagues Johnston, Sammet, and Matthews for a most impressive piece of work!

The conditions on dives are cold and sometimes treacherous. Representative collections take planning and working around environmental and safety constraints. Revisits gave a glimpse of golf ball pollutant dynamics.

And it didn’t stop there: The huge population of golf balls needed to be characterized by age and wear.

The sampling areas and processes needed documentation.

This is a substantial body of field research, backed up by background scholarship.

Posted in American Association for the Advancement of Science, an uncaring American public, coastal communities, coasts, consumption, ecological disruption, Ecological Society of America, ethics, field research, Florida, Humans have a lot to answer for, marine debris, oceans, plastics, pollution, science, sustainability, sustainable landscaping | Tagged | Leave a comment

“Pelosi won, Trump lost”

U.S. House of Representatives Democratic Leader Nancy Pelosi speaks to reporters after she was re-elected to her post on Capitol Hill in Washington

From Alex Wagner, contributing editor at The Atlantic and CBS News correspondent. Excerpt from “Pelosi won, Trump Lost“:

“Nancy’s Prerogative” might be the name of an Irish bar, but in this case it signaled the waving of the presidential white flag, a fairly shocking thing to see on any war front. Trump’s pugilistic impulses, after all, have been virtually unchecked—especially these days, when he is without administration minders. But Pelosi has rendered Trump unable to employ his traditional weaponry. He couldn’t even muster the juju necessary to formulate that most Trumpian of Trump battle strategies, a demeaning nickname. “Nancy Pelosi, or Nancy, as I call her,” Trump said on Wednesday, “doesn’t want to hear the truth.”

.
.
.

Trump has intersected with powerful women before — Hillary Clinton, most notably — and showed little hesitation to diminish and demean. But Pelosi, who once joked to me she eats nails for breakfast, is a ready warrior. She is happy to meet the demands of war, whereas Clinton was reluctant, semi-disgusted, and annoyed to be dragged to the depths that running against Trump demanded. The speaker of the House is, technically, a coastal elite from San Francisco, but she was trained in the hurly-burly of machine politics of Baltimore by her father, Mayor Thomas D’Alesandro. It is not a coincidence that Pelosi has managed, over and over, to vanquish her rivals in the challenges for Democratic leadership: she flocks to the fight, not just because she usually wins, but apparently because she likes it.

Read it, particularly the quote from former Trump Organization executive Barbara Res, repeated from The New York Times.

There is a similar article at The Washington Post by Jennifer Rubin titled “Trump lost. Period.”

U.S. President Donald Trump listens to remarks at a discussion on School Safety Report at the White House in Washington


Posted in "Big Bang Theory", alchemy, citizenship, Donald Trump, dump Trump, ecopragmatist, politics, reason, San Francisco, Speaker Nancy Pelosi | Leave a comment

“Collective reflection” and working together on climate issues in Massachusetts

This is an excerpt from an article which appeared at RealClimate. That, in turn, is a translation of the same article which appeared in Le Monde on 11th January 2019.

Recent discussions at climate-related blogs and among environmental activists make the portions of the excerpt which I have highlighted in bold especially pertinent.

What if the focus on the moods of climate scientists was a way to disengage emotionally from the choices of risk or solutions to global warming? Since the experts are worrying about it for us (it’s their daily life, isn’t it?), let’s continue our lives in peace. If feelings and expressing emotions – fear, anger, anguish, feelings of helplessness, guilt, depression – in the face of risks are legitimate, even necessary, to take action demands that we go beyond that. Catastrophism often leads to denial, a well-known psychic mechanism for protecting oneself from anxiety. Managing risk is part of our daily lives and supposes that we are not in such denial (active or passive) as it prevents clear and responsible action. Because we know that many hazards carry predictable risks, human societies have learned to anticipate and cope, for example, to limit the damage of storms or epidemics. The challenge of climate change is to build a strategy not in response to an acute and clearly identified risk, but in anticipation of a gradual, chronic increase in climate risks.

The climate scientists are alright (mostly), but that’s not the important question. The dispassionate management of climate risk will require that everyone – citizens, decision makers, teachers, intermediate bodies, companies, civil society, media, scientists – in their place and according to their means, take the time for a collective reflection, first of all through mutual listening. The news shows it every day: this process is hobbling along, too slowly for some, too fast for others. It will need to overcome emotional reactions, vested interests, and false information from the merchants of doubt. Those who are unable to review their strategy and have everything to lose from the exit from fossil-fuel based energies will use nit-picks, manipulation, short-termism, and promote binary and divisive visions, all of which undermine trust and pollute the debate. But despite that…

Every degree of warming matters, every year counts, every choice counts. The challenge is immense because of the nature and magnitude of the unprecedented risk. It requires doing everything to overcome indifference and fatalism.

And, in this regard, but obviously with no support from the authors of the above piece, one of the most constructive things the climate-concerned of Massachusetts can do right now, whatever your political background and stripe, is to throw your support behind Governor Baker’s proposal to tax real estate transfers as a funding source for climate mitigation and adaptation. While The Globe quoted ELM and other environmental groups as having cautious support for the Governor’s proposal, to stand on the sidelines and fail to give him support for the proposal against the likes of the Massachusetts Association of Realtors, quoted in the article, and probably Speaker Robert DeLeo, means they are more interested in their side winning than in making progress towards the common goal of mitigating climate change, adapting, and preventing worse. I have criticized Governor Baker, too. But this and his Executive Order 569 are really welcome, and I walk back what I said there: The Governor has either learned, or I was wrong in the first place.

I’m not the only one supporting him: Foley-Hoag thinks this is a good idea, but wants the Governor to do more.

The risks are here. The risks are now. There is already a 1-chance-in-100 per year of an 8 inch rain or more in 24 hours. No Massachusetts stormwater infrastructure is capable of dealing with half of that. You think that risk small? There’s a 10% chance of that happening one or more times in 10 years. There’s a 4+% chance of that happening in 5 years. The chance of a 7 inch rain or more in 24 hours is 2% each year. Yet Massachusetts codes still allow 1960s-era diurnal rain projections to be the standard. These are no longer the 1960s.
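The arithmetic behind those figures is just the complement rule for independent years, taking the 1% annual probability as given:

# Chance of at least one 8-inch-in-24-hours rain, assuming an independent
# 1% probability in each year.
p<- 0.01
1 - (1 - p)^10   # about 0.096, i.e. roughly a 10% chance over 10 years
1 - (1 - p)^5    # about 0.049, i.e. the 4+% chance over 5 years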

So, which is it, all the people that say they want to fix climate change? Support or not? And if you don’t support this, where is your specific counterproposal? And if you don’t have one, you don’t deserve the label “climate activist” or “environmental activist”. Just settle for politician.

Update, 2019-01-23

A measure and program I find highly constructive is the Ceres Commit to Climate program for corporations.



Posted in Anthropocene, being carbon dioxide, citizenship, climate change, climate disruption, Commonwealth of Massachusetts, EBC-NE, Ecology Action, ecomodernism, ecopragmatism, environment, global warming, Governor Charlie Baker, greenhouse gases, Hyper Anthropocene, ILSR, investment in wind and solar energy, lobbying, local generation, Massachusetts, Massachusetts Clean Energy Center, New England, rights of the inhabitants of the Commonwealth | Leave a comment

What if Juliana v United States fails?

This is a replica of a comment I made at another site. As of 23:55 EST on 21st January, it hasn’t been released from moderation. Perhaps the moderator is busy. I do not know. I am proceeding as if it will not be released, because I will be too busy during the next week or so to monitor.

I am posting this as an expansion of an opinion I offered in public a few months ago, unable to include these ideas because the opinion was strongly constrained by time.

[From a fellow Commentor:]

So, who IS going to fix it? Oh, wait—-I know—-the free market that gave us the problem in the first place!

I’m hoping that the judiciary, via the Public Trust doctrine, might force the government to fix this. Professor Mary Wood’s book, Nature’s Trust, makes it very clear that Executive branch agencies entrusted with the preservation of the natural world become a licensing mechanism for permitting its wholesale destruction, in large measure, as you imply, by the “free market forces” leaning upon government. But that behavior appears ingrained in the Executive, and it doesn’t know how or what else to do without a guiding constraint. They’d do it, in other words, even without the lobbying lean, simply because there are non-business, non-corporate constituencies out there who don’t like to be constrained. Plenty of examples in the book. The American University Law Review article is a good synopsis.

But, there is an aspect to notions of harm in case law which suggests that if a condition is shared by a large population, and, in this case, all the population, there is no standing to sue. The harm must be differentiated and special. This aspect is what darkens my view of this avenue. If we get that far, the other darkening comes from the likelihood that any remedy granted would bear little resemblance to the one Juliana seeks.

As I have said publicly,

For should the plaintiffs of Juliana fail, the last government branch, the judiciary, abdicates responsibility for solving this urgent problem. And so the Constitution will have failed one of its existential requirements: To provide for the common defense. For Nature has laws, too, and we have been breaking them for a long time, ever more intensely. But Nature does not have courts of grievance or redress. Nature just acts. In a catastrophic sea level rise, perhaps triggered by a collapse of a distant ice sheet, Moakley Courthouse itself, the land you stand on would be lost, and all that there [City of Boston]. While disappointing, were Juliana to be overturned, this should not be a reason for despair. It would not mean the Constitution should be replaced. It would just mean it is useless for solving certain kinds of critically important problems. Its failure would imply the Constitution is becoming a dusty, old thing, irrelevant, like the Articles of Confederation are to us, a ceremonial relic. Let’s hope not.

There will be solutions for solving climate in any case, Constitution or not. They may well be horrifically expensive. And, while there’s no solution without first zeroing emissions, solutions will exist. These will lie beyond the Constitution. I hope Chief Justice Roberts and his colleagues understand the import of that.

Solutions “beyond the Constitution” are solutions where global economic interests decide that climate change must be stopped, for their business is being harmed and their wealth is being lost. The “free market” is no more monolithic than any other group or section of human behavior or collective, and for every company which profits from the sale of fossil fuels and the use of the atmosphere as a sewer, there are three or more which simply use them as a means to an end. If a product harms during its use, and the buyer is not forewarned, the buyer, whether individual or corporation, has every right to pursue damages from the purveyor. Beyond that, the buyers have every motivation to band together with the similarly harmed and devise a means of fixing the situation.

The trouble, of course, is that to the degree these remedies are extra-governmental and extra-Constitutional, governments have little opportunity to steer them. They might have steered, by participating early on, but the governments, listening to @Gingerbaker’s “Us”, chose to pursue the comfortable, uncontroversial paths. To the extent governments cannot fix the problem without the consortium of collective buyers, they’re stuck. This is unfortunate. But this is what happens when fundamental responsibilities are repudiated.

Professor Dan Farber has recently offered his opinion of the status of Juliana. He’s an attorney. I’m not.



Posted in an ignorant American public, an uncaring American public, Anthropocene, being carbon dioxide, Boston Ethical Society, carbon dioxide capture, clear air capture of carbon dioxide, climate, climate business, climate change, climate disruption, climate economics, corporate litigation on damage from fossil fuel emissions, corporate supply chains, corporations, ecological disruption, ecomodernism, economics, ecopragmatism, environment, environmental law, extended producer responsibility, extended supply chains, First Parish Needham, fossil fuel divestment, fossil fuels, global warming, greenhouse gases, Hyper Anthropocene, investment in wind and solar energy, Juliana v United States, leaving fossil fuels in the ground, Mary C Wood, optimization, Our Children's Trust, pollution, population biology, population dynamics, Principles of Planetary Climate, quantitative biology, quantitative ecology, radiative forcing, rationality, reasonableness, sea level rise, sustainability, the tragedy of our present civilization, tragedy of the horizon, United States Constitution, United States Government, UU, UU Needham, zero carbon | Leave a comment

“About” section of this blog has been revised, and rules of commenting made more prominent

See the About section of this blog for a revised description of the blog and for rules governing commenting made more explicit and prominent. In fact, I have copied the rules at the bottom of this post.

The heading of the blog has also been changed to more properly express my position and approach to addressing the climate emergency, acknowledging that I am an ecopragmatist and embrace the Ecomodernist Manifesto.

Rules Regarding Posting Comments on this Blog

  • I will not tolerate climate denial comments.
  • I will not tolerate creationist comments.
  • I will not tolerate insults or anything like slander against cultural groups, or groups based upon sexual preference or sexual identity.
  • This is primarily a technical blog. While friendly discussion and humor are welcome, positions, proclamations, or arguments are expected to be accompanied by evidence or citations of evidence, whether as links or as figures or equations. \LaTeX is available. Authors of long derivations or similar contributions might want to consider using Overleaf/ShareLaTeX for their pieces.
  • Commenters who articulate extended interesting positions pertinent to the blog’s purpose may be asked to rewrite their comment as a guest post instead. If this happens, the comment will be held for moderation and the commenter contacted.
  • Commenters are expected to use unique handles, that is, they oughtn’t use multiple pseudonyms or email addresses for the same person. This is not only a rule of this blog, but is a stipulation of the TOS for wordpress.com.
  • I am happy to fix syntactic mistakes in comments. I find WordPress really ought to provide a way for commenters to revise their postings, and, in the absence of such, am happy to help.
  • I will never delete a comment without first simply holding it for moderation, and approaching the commenter, asking them to revise it, or explaining why I am holding it. In the absence of a reply, the comment may be held in moderation indefinitely.
  • Comments may be deleted if a requested revision is virulently opposed, or if the commenter has engaged in a series of rule violations. Ultimately, as has happened, a commenter who abuses the rules will be banned from participation.
  • In the end, I reserve the right to determine what’s appropriate here or not. This is my blog. I pay for it. There is no subsidy or advertising that helps pay for it.

Posted in Anthropocene, blog, bridge to somewhere, Buckminster Fuller, CleanTechnica, climate change, ecology, Ecology Action, ecomodernism, ecopragmatism, ecopragmatist, engineering, global warming, Hermann Scheer, Hyper Anthropocene, ILSR, Joseph Schumpeter, leaving fossil fuels in the ground, local generation, local self reliance, Mark Carney, reasonableness, secularism, solar democracy, solar domination, Stewart Brand, technology, the energy of the people, the green century, Tony Seba, wind energy, wind power, zero carbon | Leave a comment

“From Single Use to Zero Waste: What’s New with Recycling”

Wednesday, January 30, 2019, 7:00-8:30 pm, at the South Shore Natural Science Center


Map:


The South Shore Natural Science Center and the South Shore Recycling Cooperative (SSRC) present:

This event will be live-streamed at the SSRC Facebook page.

Posted in ecology, environment, environmental law, recycling, South Shore Recycling Cooperative | Leave a comment

On the rheology of cats

Important paper.

Overview.

(Image: “are cats liquid?”, Nobel prize.)

(Dedicated to dumboldguy.)
Posted in science | 1 Comment

A never-ending litany of vituperation

There is a commenter whose handle is dumboldguy who used to comment here. The rules for commenting at this blog are clear and posted. He made some comments containing extraneous material; at first I left the comments up but edited out the extraneous material.

He was annoyed by the editing, and I pointed out this is my blog and the rules are the rules.

He continued to comment, and I held a couple of those for moderation, asking him to provide references and links.

In the end, dumboldguy became angry and non-productive, began to accuse me of all kinds of unfairness, and basically called me names. His standard name for me at other blogs is “ecoquack”. I really don’t care. I was once called a “tree-hugging ecoweenie” by a climate denier, and I’m kind of proud of that moniker.

I don’t get a lot of comments. As I’ve noted, and noted to him, I don’t write this blog to make it popular. It has a reasonable following. I enjoy well-written comments, but in the spirit of the blog, claims should be footnoted with links to evidence and the like. I mostly use the blog to express things, and have a convenient place to put material, so I can cite it easily.

I also use the blog to document technical findings. dumboldguy for some reason finds these the most irritating of all, accusing me of some kind of elitism because I post them.

In the end, I needed to ban him from the blog. No problem. It happens. Users can be banned for the same reason WordPress has an automatic spam filter on comments.

But dumboldguy has continued his attacks in comments at other blog sites, most related to climate change mitigation and climate justice. I am not attempting to refute him here. I do engage at the other sites when appropriate.

However, I am beginning this blog post today to record the more abrasive of his comments at other sites, and provide a record of these attacks, by date. I am not providing his comments in full, but do provide links to them. Those will serve to provide context, unless dumboldguy manages to get himself banned elsewhere.

I don’t expect the updating of this blog post will end anytime soon.

2019-01-20, Sunday

link

Looks to me like ecoquack is suffering from the engineer’s typical inability to understand the English language, as well as the engineer’s tendency to focus on technology as the answer to all human problems.
[…]
If ecoquack could climb out of his engineer’s silo and really see what is being said, he would realize that the key words are OPPOSE CORPORATE SCHEMES THAT PLACE PROFITS OVER COMMUNITY BENEFITS, INCLUDING MARKET BASED MECHANISMS.

Looks to me like ecoquack is a trying to hijack the urgency of the climate emergency to advance his own set of objectives.

link

You again miss the point, just as the author of the Atlantic piece did. It’s not about mitigating climate change and competing “technologies”, it’s about fighting the politicians and so-called “capitalists” that want to prolong the system that gave us climate change in the first place. If we can’t break their stranglehold on what does or doesn’t get done, we are going nowhere.

And why do you again insist on throwing more maundering BS and self-admiration into your comment? We don’t give a rodent’s rear end that you know LaTeX and use it to crap up what could be said clearly in plain English—-it proves my point about your engineer’s cluelessness about communication.

And please don’t mention YOUR BLOG here again—-as I’ve said here before, it is not a site worth visiting except to view your self-admiration and egotism. Anyone who doesn’t agree with what they see there will be ignored or quickly banned if they persist.

link

JFC! Do you never tire of spouting bullshit and then sitting back and admiring how smart you are?

Now you’re going to say that the “proponents” are part of the great left wing conspiracy to take over the world? Have you even read Klein’s book?

Moderator's note: Actually, I have. I didn't think much of it. Suffice it to say I am a disciple of Hermann Scheer, Buckminster Fuller, and, above all, of Stewart Brand. See Brand's important book. Yeah, I'm an ecopragmatist as well as a solar revolutionary. dumboldguy apparently dislikes ecomodernism a lot.

I have—a copy sits on my bookshelf, and it is one of the best books ever written about climate change (or rather, as I said, how run-amok capitalism is the REAL problem). What parts of it do you dispute? Cite page numbers and let’s debate her points.

“….many countries are making progress reducing their emissions, and they care not revolutionary at all. Quite capitalist in fact”. BULLSHIT! The major emitters are NOT making progress.

Making common cause with conservatives? Massachusetts? What “goods” could be made from captured carbon? You waste our time with even MORE inane Bullshit!

“The engineering expertise is in corporations”? Actually, it’s drawn to wherever there is money to pay for it, as the government did by spending huge quantities on the Manhattan Project and Going to the Moon.

You are sounding more like a Republican corporate shill every time you open your mouth. Is it your “objective” to get your hooks into some of that $$$$?

2019-01-21, Monday

link

Ecoquacky does it again!

“Cumulative emissions are all that matter, because of the longevity, in atmosphere, of Carbon Dioxide. Annual emissions don’t matter at all. It’s ALL owned by the United States and Europe”.

Lord love a duck, but that’s one of the dumber things Quacky has said here. Yes, the US and Europe ARE to much to blame for the size of the cumulative emissions—-not surprising since that’s where the Industrial Revolution began and has been polluting longest—-but to say “annual emissions don’t matter at all” totally ignores the FACT that the rest of the world (whose population far outnumbers the West) is now producing an ever-increasing quantity of CO2 ANNUALLY , wants to have a living standard like that in the West, is going to NEED millions of air conditioners to survive the coming heat waves, and is still burning too much COAL (coal being the subject Quacky refuses to discuss).

Perhaps it’s time to remind Quacky of the old saw that everyone is entitled to their OPINION, no matter how half-assed, but NOT to their own facts. It is a simple FACT that ALL emissions—-past, present, and future—-are of concern.

link

Quacky just can’t quit. Now he’s swinging over to some BS about “moral and ethical responsibility” and “compensation”? WTF is he talking about?

How did we get to that from “cumulative emissions are all that matter”? (and why doesn’t he want to talk about coal—-the stake through the heart of humanity?)

I will repeat—-yes, cumulative emissions MAY have already doomed us, but if we don’t deal strongly with the “annual emissions” yet to come from EVERY country in the world, there is virtually no hope.

link

You agree? Swell! Who cares?

link

Redsky is correct, Quacky. You’re not.

CO2 levels remained stable for 1000’s of years until the Industrial Revolution. It wasn’t until the 1960’s that they started to ramp up, with the level in 1960 being ~315 ppm, only 40 ppm higher than it was 120 years before in 1840.

Those of us who were more aware than you of “the possibility of emissions having an effect” were worried about more visible and imminent threats back then—-dirty air and dirty water, toxic industrial waste, lead in gasoline and paint, DDT, acid rain, the ozone hole, resource depletion, overpopulation, SST’s, nuclear power, and more.

It wasn’t until the 1980’s and Hansen that we began to pay attention to GHG, and the near 100 ppm rise in ~60 years from 1960 until today, which is ~5 times the rate of increase before 1960.

http://www.sealevel.info/co2_and_ch4c.html

Not sure what your point is with “the government had been warned and cautioned repeatedly”. That’s not news, and it’s water over the dam anyway. Or is it just that you like to hear yourself quack?

Posted in blog, science | Leave a comment

“… [N]ew renewable energy capacity could quadruple that of fossil fuels over next three years”

This is utility-scale capacity only. See the footnote from the original post repeated at the bottom. Also, given uncertainties related to federal data availability at federal Web sites during the partial federal shutdown, I have copied the cited report and placed it so it is publicly available in a safe location.

Quoting:

Washington DC – According to an analysis by the SUN DAY Campaign of the latest data released by the Federal Energy Regulatory Commission (FERC), natural gas dominated new electrical generating capacity in 2018. However, renewable energy sources (i.e., biomass, geothermal, hydropower, solar, wind) may be poised to swamp fossil fuels as new generating capacity is added over the next three years.

FERC’s “Energy Infrastructure Update” report (with data through November 30, 2018) notes that new natural gas generation placed in service during the first 11 months of 2018 totaled 16,687 MW or 68.46% of the total (24,376 MW). Renewable sources accounted for only 30.12% led by wind (3,772 MW) and solar (3,449MW).(*)

However, the same report indicates that proposed generation and retirements by December 2021 include net capacity additions by renewable sources of 169,914 MW. That is 4.3 times greater than the net new additions listed for coal, oil, and natural gas combined (39,414 MW).

Net proposed generation additions from wind alone total 90,268 MW while those from solar are 64,066 MW — each greater than that listed for natural gas (56,881 MW). FERC lists only a single new 17-MW coal unit for the three-year period but 16,122 MW in retirements. Oil will also decline by 1,362 MW while nuclear power is depicted as remaining largely unchanged (i.e., a net increase of 69 MW).

FERC’s data also reveal that renewable sources now account for 20.8% of total available installed U.S. generating capacity.(**) Utility-scale solar is nearly 3% (i.e., 2.94%) while hydropower and wind account for 8.42% and 7.77% respectively.

(*) FERC only reports data for utility-scale facilities (i.e., those rated 1-MW or greater) and therefore its data does not reflect the capacity of distributed renewables, notably rooftop solar PV which accounts for approximately 30% of the nation’s installed solar capacity.

(**) Capacity is not the same as actual generation. Capacity factors for nuclear power and fossil fuels tend to be higher than those for most renewables. For the first ten months of 2018, the U.S. Energy Information Administration reports that renewables accounted for 17.6% of the nation’s total electrical generation – that is, a bit less than their share of installed generating capacity (20.8%).

Source:

FERC’s 6-page “Energy Infrastructure Update for November 2018” was released in early January 2019. In a seeming departure from its norm, FERC did not announce the release of this report on its web page and a specific release date does not appear on the report itself. However, it is assumed the report was issued within the past week. It can be found at: https://www.ferc.gov/legal/staff-reports/2018/nov-energy-infrastructure.pdf. For the information cited in this update, see the tables entitled “New Generation In-Service (New Build and Expansion),” “Total Available Installed Generating Capacity,” and “Proposed Generation Additions and Retirements by October 2021.”

Posted in American Solar Energy Society, Anthropocene, Bloomberg New Energy Finance, BNEF, bridge to somewhere, Buckminster Fuller, clean disruption, CleanTechnica, decentralized electric power generation, decentralized energy, electricity, FERC, green tech, ILSR, investment in wind and solar energy, John Farrell, Joseph Schumpeter, leaving fossil fuels in the ground, local generation, local self reliance, natural gas, rate of return regulation, solar democracy, solar domination, solar energy, solar power, Sonnen community, the energy of the people, the right to know, the value of financial assets, Tony Seba, wind energy, wind power, zero carbon | Leave a comment

A look at an electricity consumption series using SNCDs for clustering

(Slightly amended with code and data link, 12th January 2019.)

Prediction of electrical load demand or, in other words, electrical energy consumption is important for the proper operation of electrical grids, at all scales. RTOs and ISOs forecast demand based upon historical trends and facts, and use these to assure adequate supply is available.

This is particularly important when supply is intermittent, such as solar PV generation or wind generation, but, to some degree, all generation is intermittent and can be unreliable.

Such prediction is particularly difficult at small and medium scales. At large scale, relative errors are easier to control, because a large number of units drawing upon or producing electrical energy are aggregated. At the very smallest of scales, it may be possible to anticipate the usage of single institutions or households based upon historical trends and living patterns. This has only partly been achieved in devices like the Sense monitor, and prediction is still far away.

Presumably, techniques which apply to the very small could be scaled to deal with small and moderate size subgrids, although the moderate sized subgrids will probably be adaptations of the techniques used at large scale.

There is some evidence that patterns of electrical consumption directly follow the behavior of the building’s or home’s occupants that day, modulated by outside temperatures and the occurrence of notable or special events. Accordingly, being able to identify the pattern of behavior early in a day can offer powerful prior information for the consumption pattern that will hold later in the day.

There is independent evidence that occupants do, in a sense, select their days from a palette of available behaviors. This has been observed in Internet Web traffic, as well as in secondary signals in emissions from transportation centers. Discovering that palette of behaviors is a challenge.

This post reports on an effort to do such discovery using a time series of electricity consumption for 366 days from a local high school. Consumption is sampled every 15 minutes.

Here is a portion of this series, with some annotations:

The segmentation is done automatically with a regime-switching detector. The portion below shows these data atop a short-term Fourier transform (STFT) spectrum of the same:

The point of this exercise is to cluster days together in a principled way, so as to attempt to derive a kind of palette. One “color” of such a palette would be a cluster. Accordingly, if a day is identified, from the preliminary trace of its electricity consumption, as being a member of a cluster, the bet is that the remainder of the day’s consumption will follow the pattern of other series seen in that cluster. If more than one cluster fits, then some kind of model average across clusters can be taken as predictive, obviously with greater uncertainty.

(Click on figure to see larger image and then use browser Back Button to return to blog.)

Each of the 366 days of the 2007-2008 academic year was separated out, and pairwise dissimilarities between all days were calculated using the Symmetrized Normalized Compression Divergence (SNCD) described previously. The dissimilarity matrix was used with the default hierarchical clustering function in R, hclust, and its “ward.D2” method. That clustering produced the following dendrogram:

The facilities of the dynamicTreeCut package of R were used to find a place to cut the dendrogram and thus identify clusters. The cutreeDynamic function was called on the result of the hierarchical clustering, using the hybrid method and a minimum cluster size setting of one, to give the cluster chooser free rein.
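For concreteness, here is a minimal sketch of that clustering step, assuming D is the 366-by-366 matrix of pairwise SNCDs with days as row and column names; the names D, hc, and clusters are mine, not from the original script.

library(dynamicTreeCut)

# D: symmetric matrix of pairwise SNCDs between days.
hc<- hclust(as.dist(D), method="ward.D2")       # default hclust, Ward-D2 linkage
plot(hc, labels=FALSE)                          # the dendrogram shown below

# Hybrid dynamic tree cut with a minimum cluster size of one, as described above.
clusters<- cutreeDynamic(dendro=hc, distM=D, method="hybrid", minClusterSize=1)
table(clusters)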

There were 5 clusters found. Here they are in various ways.

First, the dates and their weekdays:


$`1`
 2007-09-06  2007-09-07  2007-09-10  2007-09-14  2007-09-17  2007-09-18  2007-09-21  2007-09-25  2007-09-27  2007-10-01  2007-10-02  2007-10-03  2007-10-04  2007-10-09 
 "Thursday"    "Friday"    "Monday"    "Friday"    "Monday"   "Tuesday"    "Friday"   "Tuesday"  "Thursday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" 
 2007-10-10  2007-10-22  2007-10-23  2007-10-29  2007-10-31  2007-11-02  2007-11-05  2007-11-06  2007-11-13  2007-11-21  2007-11-28  2007-12-03  2007-12-04  2007-12-05 
"Wednesday"    "Monday"   "Tuesday"    "Monday" "Wednesday"    "Friday"    "Monday"   "Tuesday"   "Tuesday" "Wednesday" "Wednesday"    "Monday"   "Tuesday" "Wednesday" 
 2007-12-06  2007-12-11  2007-12-12  2007-12-14  2007-12-17  2007-12-18  2007-12-19  2008-01-03  2008-01-04  2008-01-11  2008-01-15  2008-01-16  2008-01-17  2008-01-18 
 "Thursday"   "Tuesday" "Wednesday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" 
 2008-01-22  2008-01-23  2008-01-24  2008-01-29  2008-01-30  2008-01-31  2008-02-05  2008-02-06  2008-02-07  2008-02-11  2008-02-12  2008-02-13  2008-02-25  2008-02-27 
  "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"   "Tuesday" "Wednesday"    "Monday" "Wednesday" 
 2008-02-28  2008-03-10  2008-03-12  2008-03-13  2008-03-14  2008-03-19  2008-03-24  2008-03-25  2008-04-01  2008-04-02  2008-04-03  2008-04-04  2008-04-11  2008-04-23 
 "Thursday"    "Monday" "Wednesday"  "Thursday"    "Friday" "Wednesday"    "Monday"   "Tuesday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"    "Friday" "Wednesday" 
 2008-04-28  2008-04-30  2008-05-05  2008-05-07  2008-05-09  2008-05-12  2008-05-19  2008-05-22  2008-05-27  2008-05-28  2008-06-01  2008-06-02  2008-06-04  2008-06-05 
   "Monday" "Wednesday"    "Monday" "Wednesday"    "Friday"    "Monday"    "Monday"  "Thursday"   "Tuesday" "Wednesday"    "Sunday"    "Monday" "Wednesday"  "Thursday" 
 2008-06-07  2008-06-10  2008-06-13  2008-06-17  2008-06-18  2008-06-19  2008-06-23  2008-06-24  2008-06-27  2008-07-01  2008-07-02  2008-07-05  2008-08-11  2008-08-18 
 "Saturday"   "Tuesday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"   "Tuesday"    "Friday"   "Tuesday" "Wednesday"  "Saturday"    "Monday"    "Monday" 
 2008-08-27 
"Wednesday" 

$`2`
 2007-09-03  2007-09-04  2007-09-08  2007-09-12  2007-09-13  2007-09-15  2007-09-20  2007-09-24  2007-09-29  2007-10-06  2007-10-07  2007-10-08  2007-10-12  2007-10-15 
   "Monday"   "Tuesday"  "Saturday" "Wednesday"  "Thursday"  "Saturday"  "Thursday"    "Monday"  "Saturday"  "Saturday"    "Sunday"    "Monday"    "Friday"    "Monday" 
 2007-10-20  2007-10-27  2007-10-28  2007-10-30  2007-11-03  2007-11-22  2007-11-23  2007-11-26  2007-12-01  2007-12-13  2007-12-24  2007-12-26  2007-12-28  2007-12-31 
 "Saturday"  "Saturday"    "Sunday"   "Tuesday"  "Saturday"  "Thursday"    "Friday"    "Monday"  "Saturday"  "Thursday"    "Monday" "Wednesday"    "Friday"    "Monday" 
 2008-01-05  2008-01-14  2008-01-21  2008-01-25  2008-02-02  2008-02-04  2008-02-09  2008-02-10  2008-02-15  2008-02-18  2008-02-19  2008-02-20  2008-02-21  2008-02-22 
 "Saturday"    "Monday"    "Monday"    "Friday"  "Saturday"    "Monday"  "Saturday"    "Sunday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" 
 2008-03-04  2008-03-06  2008-03-15  2008-03-18  2008-03-23  2008-03-28  2008-03-29  2008-04-05  2008-04-10  2008-04-16  2008-04-17  2008-04-18  2008-04-21  2008-04-22 
  "Tuesday"  "Thursday"  "Saturday"   "Tuesday"    "Sunday"    "Friday"  "Saturday"  "Saturday"  "Thursday" "Wednesday"  "Thursday"    "Friday"    "Monday"   "Tuesday" 
 2008-04-25  2008-05-01  2008-05-02  2008-05-08  2008-05-21  2008-05-24  2008-05-29  2008-06-08  2008-06-12  2008-06-21  2008-06-25  2008-06-26  2008-07-04  2008-07-06 
   "Friday"  "Thursday"    "Friday"  "Thursday" "Wednesday"  "Saturday"  "Thursday"    "Sunday"  "Thursday"  "Saturday" "Wednesday"  "Thursday"    "Friday"    "Sunday" 
 2008-07-07  2008-07-13  2008-07-18  2008-07-21  2008-07-22  2008-07-23  2008-07-24  2008-07-29  2008-07-30  2008-08-01  2008-08-02  2008-08-05  2008-08-06  2008-08-08 
   "Monday"    "Sunday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"    "Friday"  "Saturday"   "Tuesday" "Wednesday"    "Friday" 
 2008-08-09  2008-08-10  2008-08-12  2008-08-13  2008-08-15  2008-08-16  2008-08-20  2008-08-28 
 "Saturday"    "Sunday"   "Tuesday" "Wednesday"    "Friday"  "Saturday" "Wednesday"  "Thursday" 

$`3`
 2007-09-05  2007-09-11  2007-09-19  2007-09-26  2007-09-28  2007-10-05  2007-10-11  2007-10-16  2007-10-17  2007-10-18  2007-10-19  2007-10-24  2007-10-25  2007-10-26 
"Wednesday"   "Tuesday" "Wednesday" "Wednesday"    "Friday"    "Friday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" "Wednesday"  "Thursday"    "Friday" 
 2007-11-01  2007-11-07  2007-11-08  2007-11-09  2007-11-14  2007-11-15  2007-11-16  2007-11-19  2007-11-20  2007-11-27  2007-11-29  2007-11-30  2007-12-07  2007-12-10 
 "Thursday" "Wednesday"  "Thursday"    "Friday" "Wednesday"  "Thursday"    "Friday"    "Monday"   "Tuesday"   "Tuesday"  "Thursday"    "Friday"    "Friday"    "Monday" 
 2007-12-20  2007-12-21  2007-12-27  2008-01-02  2008-01-07  2008-01-08  2008-01-09  2008-01-10  2008-01-28  2008-02-01  2008-02-08  2008-02-14  2008-02-26  2008-02-29 
 "Thursday"    "Friday"  "Thursday" "Wednesday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"    "Friday"    "Friday"  "Thursday"   "Tuesday"    "Friday" 
 2008-03-03  2008-03-05  2008-03-07  2008-03-08  2008-03-11  2008-03-17  2008-03-26  2008-03-27  2008-03-31  2008-04-07  2008-04-08  2008-04-09  2008-04-14  2008-04-15 
   "Monday" "Wednesday"    "Friday"  "Saturday"   "Tuesday"    "Monday" "Wednesday"  "Thursday"    "Monday"    "Monday"   "Tuesday" "Wednesday"    "Monday"   "Tuesday" 
 2008-04-24  2008-04-29  2008-05-06  2008-05-13  2008-05-14  2008-05-15  2008-05-16  2008-05-20  2008-05-23  2008-05-30  2008-06-03  2008-06-06  2008-06-09  2008-06-11 
 "Thursday"   "Tuesday"   "Tuesday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"   "Tuesday"    "Friday"    "Friday"   "Tuesday"    "Friday"    "Monday" "Wednesday" 
 2008-06-14  2008-06-16  2008-06-22  2008-07-14  2008-07-25  2008-08-19  2008-08-26 
 "Saturday"    "Monday"    "Sunday"    "Monday"    "Friday"   "Tuesday"   "Tuesday" 

$`4`
2007-09-01 2007-09-02 2007-09-09 2007-09-16 2007-09-22 2007-09-23 2007-09-30 2007-10-13 2007-10-14 2007-10-21 2007-11-04 2007-11-10 2007-11-11 2007-11-12 2007-11-17 2007-11-18 
"Saturday"   "Sunday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Monday" "Saturday"   "Sunday" 
2007-11-24 2007-11-25 2007-12-02 2007-12-08 2007-12-09 2007-12-15 2007-12-16 2007-12-22 2007-12-23 2007-12-25 2007-12-29 2007-12-30 2008-01-01 2008-01-06 2008-01-12 2008-01-13 
"Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"  "Tuesday" "Saturday"   "Sunday"  "Tuesday"   "Sunday" "Saturday"   "Sunday" 
2008-01-19 2008-01-20 2008-01-26 2008-01-27 2008-02-03 2008-02-16 2008-02-17 2008-02-23 2008-02-24 2008-03-01 2008-03-02 2008-03-09 2008-03-16 2008-03-21 2008-03-22 2008-03-30 
"Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday"   "Friday" "Saturday"   "Sunday" 
2008-04-06 2008-04-12 2008-04-13 2008-04-19 2008-04-20 2008-04-26 2008-04-27 2008-05-03 2008-05-04 2008-05-10 2008-05-11 2008-05-17 2008-05-18 2008-05-25 2008-05-31 2008-06-15 
  "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" 
2008-06-29 2008-07-12 2008-07-19 2008-07-20 2008-07-26 2008-07-27 2008-08-03 2008-08-17 2008-08-24 2008-08-31 
  "Sunday" "Saturday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday"   "Sunday"   "Sunday" 

$`5`
 2008-03-20  2008-05-26  2008-06-20  2008-06-28  2008-06-30  2008-07-03  2008-07-08  2008-07-09  2008-07-10  2008-07-11  2008-07-15  2008-07-16  2008-07-17  2008-07-28 
 "Thursday"    "Monday"    "Friday"  "Saturday"    "Monday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Monday" 
 2008-07-31  2008-08-04  2008-08-07  2008-08-14  2008-08-21  2008-08-22  2008-08-23  2008-08-25  2008-08-29  2008-08-30 
 "Thursday"    "Monday"  "Thursday"  "Thursday"  "Thursday"    "Friday"  "Saturday"    "Monday"    "Friday"  "Saturday" 

Note that most of the weekend days are in cluster 4 along with a Christmas Tuesday (25 December 2007) and Veterans Day (observed) on a Monday, 12 November 2007, and a Good Friday, 21 March 2008. Assigning meanings to the other clusters depends upon having events to mark them with. It’s known, for example, that the last day of school in 2008 was 20th June 2008. Unfortunately, the academic calendars for 2007-2008 have apparently been discarded. (I was able to find a copy of the 2008 Westwood High School yearbook, but it is not informative about dates, consisting primarily of photographs.) Accordingly, it’s necessary to look for internal consistency.

There is a visual way of representing these findings. The figure below, a reproduction of the one at the head of the blog post, traces energy consumption for the high school during each day. The abscissa shows hours of the day, broken up into 96 15-minute intervals. For each of 366 days, the energy consumption recorded is plotted, and the lines connected. Each line is plotted in a different color depending upon the day of the week. The colors are faded by adjusting their alpha value so they can be seen through.

Note how days with flat energy consumption tend to be in a single color. These are apparently weekend days.

Atop each of the lines describing energy consumption, a black numeral has been printed giving the cluster number to which the day was assigned. These are printed at the highest point of their associated curves, but jittered so they don’t stack atop one another and become hard to distinguish.

(Click on figure to see larger image and then use browser Back Button to return to blog.)
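The figure itself can be reproduced in base R along the lines of the following sketch. The objects kwh (a 96-by-366 matrix of quarter-hour readings, one column per day), wday (each column’s weekday name), and cluster (each column’s cluster number) are assumed here; they are my names, not those of the original script.

# Sketch only: fade the weekday colors via alpha, then print jittered cluster labels
# at each day's consumption peak.
colorByDay<- adjustcolor(rainbow(7)[as.integer(factor(wday))], alpha.f=0.3)
matplot(x=(1:96)/4, y=kwh, type="l", lty=1, col=colorByDay,
        xlab="hour of day", ylab="energy consumption")
peaks<- apply(kwh, 2, which.max)
text(x=jitter(peaks/4, amount=0.3), y=kwh[cbind(peaks, seq_len(ncol(kwh)))],
     labels=cluster, cex=0.6, col="black")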

The clusters go along with consumption characteristics. A proactive energy management approach would entail examining the activities done on the days in each of the clusters. Of special interest would be clusters, such as clusters 1 and 3, which have very high energy usage.

Code and data

The code and data reviewed here are available in my Google replacement for a Git repository.

Future work

I am next planning to apply this clustering technique to long neglected time series of streamflow in Sharon, MA and on the South Shore.

Posted in American Statistical Association, consumption, data streams, decentralized electric power generation, dendrogram, divergence measures, efficiency, electricity, electricity markets, energy efficiency, energy utilities, ensembles, evidence, forecasting, grid defection, hierarchical clustering, hydrology, ILSR, information theoretic statistics, local self reliance, Massachusetts, microgrids, NCD, normalized compression divergence, numerical software, open data, prediction, rate of return regulation, Sankey diagram, SNCD, statistical dependence, statistical series, statistics, sustainability, symmetric normalized compression divergence, time series | 1 Comment

On plastic bag bans, and the failure to realize economic growth cannot be green

(Updated 2019-01-12.)

Despite the surge of interest in plastic bag bans, the environmental sustainability numbers haven’t been run. For example, it makes no sense to trade plastic bags for paper ones, even if the paper is recycled, because paper is a nasty product to make, and more emissions are involved in shipping paper bags than plastic ones. Paper bags are heavier, get wet, and cost towns and residents money to recycle or dispose of.

The City of Cambridge, Massachusetts, put fees on all retail bags, but did that after studying the matter for seven years. Reports on their study are available at the City of Cambridge Web site.

Even reusable bags have an impact from being made, and, if used, must be reused one or two hundred times to offset their own upstream environmental impacts in comparison with plastic bags, downstream impacts and all. The biggest problem people have with reusable bags is remembering to bring them along.

We don’t really know what happens to plastic bags in oceans, apart from anecdotal evidence of harm to macroscale creatures. Cigarette filters and microplastics seem to persist.

See the podcast from BBC’s “Costing the Earth” for some of the complexities.

Wishful environmentalism can be damaging: it consumes policy good will and activists’ energy, and it misses addressing substantial problems, like expansive development, which cause far greater harm to the natural world. Worse, the “feel good” of not using plastic bags, or of helping to ban them, tends to justify personal behaviors which are more damaging, such as taking another aircraft flight for fun without properly offsetting its emissions (*). Air travel is a huge contributor, and has, thus far, never been successfully penalized for its contributions to human emissions. The last round on that was fought during the Obama administration, which fiercely negotiated with Europe not to have to pay extra fees for landing in EU airports.

The hard fact is economic growth cannot be green. Quoting:

Study after study shows the same thing. Scientists are beginning to realize that there are physical limits to how efficiently we can use resources. Sure, we might be able to produce cars and iPhones and skyscrapers more efficiently, but we can’t produce them out of thin air. We might shift the economy to services such as education and yoga, but even universities and workout studios require material inputs. Once we reach the limits of efficiency, pursuing any degree of economic growth drives resource use back up.

These problems throw the entire concept of green growth into doubt and necessitate some radical rethinking. Remember that each of the three studies used highly optimistic assumptions. We are nowhere near imposing a global carbon tax today, much less one of nearly $600 per metric ton, and resource efficiency is currently getting worse, not better. Yet the studies suggest that even if we do everything right, decoupling economic growth with resource use will remain elusive and our environmental problems will continue to worsen.

This sounds discouraging, but I am not discouraged. The natural world has repeatedly dealt with species which were resource hogs. That it ends poorly for such species is a salutary lesson for those which can observe it, assuming they learn.

Claire bought me a wonderful book for the holidays. It’s Theory-based Ecology by Pásztor, Botta-Dukát, Magyar, Czárán, and Meszéna, and I got it for my Kindle Oasis. It has a number of themes but two major ones are (1) exponential growth of unstructured populations, and (2) the inevitability of population regulation. By the latter they mean organism deaths due to insufficient resources, or, in other words, growth beyond the carrying capacity.

In our case, that kind of collapse or growth is mediated by an economic system, one which suffers its own periodic collapses. Accordingly, the choice is whether to keep hands off and allow such a collapse, via a Minsky moment, to occur on its own, or, instead, to intervene and have a controlled descent. We are not as self-sustaining as we collectively think, and developed countries, although wealthier and replete with resources, also have a greater cross section for impact and harm.

Our choice.

Update, 2019-01-12

From The Hill, “Will a market crash get the action we need on climate change?”:

So, what’s the good news? The end of denial by financial markets and government leaders is nearly at hand. For most investors, the risks of climate change loom beyond their investment horizon. It’s been easy for investors to operate in a speculative carbon bubble, acting as though there are no impending costs to earnings-per-share or to liabilities in their portfolios from the buildup of carbon in the atmosphere. But these costs may increasingly look real, and when investors start taking these costs into account, markets will revalue: not just oil and gas stock, but all stocks.

Companies have facilities that will be flooded or be without needed water for production; supply chains will need to be rebuilt; costs of transportation will increase. What about the costs to financial institutions as communities need to be abandoned because of flood or drought? What are the fiscal consequences to governments of rebuilding airports, roads and other critical infrastructure? What will happen to consumer spending?

There will be winners and losers in this revaluation, but as past speculative bubbles have shown us, when they burst, markets move very quickly.

Government leaders have likewise largely operated in a bubble. It is the rare leader who can spend political (or taxpayer) capital on addressing an over-the-horizon problem. When the bubble bursts, government leaders will need to address the real concerns of rebuilding infrastructure, food and water security, and public health threats that will be seen by voters as imminent.


(*) This is actually pretty straightforward to do. Here’s our formula.

There is something called the New England Wind Fund. Essentially, contributions are used to buy WRECs, and one WREC prevents 842 pounds of CO2 emissions on the electric grid. Thecarbonfootprint.com offers a CO2 travel calculator. It tells how much CO2-equivalents are produced from a flight. (They offer calculators for other modes of travel, too.) They also offer you a vehicle for offsetting right on the spot, but I do not recommend using it. They do also make available a check box for additional forcing effects, which I always check. This is because emissions at typical aircraft altitudes are worse than at sea level or on the ground.

The result is in metric tonnes, where 1 metric tonne is 1000 kilograms. There are 2.2 lbs per kilogram. So 1 WREC prevents about 383 kilograms, or roughly 0.38 metric tonnes, of CO2 emissions.

For a trip, calculate how much emissions you will make, convert that into units of WRECs, and then go to the New England Wind Fund site and contribute US$40 for each WREC.

Done.
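As a worked example (the 1.5-tonne flight figure is hypothetical, not taken from any particular calculator run):

# Hypothetical round trip reported by the calculator as 1.5 tonnes CO2e,
# with the additional-forcing box checked.
tonnes<- 1.5
tonnesPerWREC<- 842/2.2/1000              # 842 lbs is about 0.38 metric tonnes
wrecs<- ceiling(tonnes/tonnesPerWREC)     # round up to whole WRECs: 4 here
contribution<- 40*wrecs                   # US$40 per WREC, so US$160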

I don’t recommend using the carbonfootprint.com offset because, while they could be fine, carbon offsetting programs need constant auditing and checking, and there are some unscrupulous operators out there who use these for greenwashing purposes only. I know New England Wind, though, and these contributions really do get converted into WRECs.

Posted in adaptation, an ignorant American public, an uncaring American public, Anthropocene, development as anti-ecology, E. O. Wilson, environment, evidence, evolution, exponential growth, fragmentation of ecosystems, global warming, greenwashing, Humans have a lot to answer for, Hyper Anthropocene, local self reliance, plastics, population biology, quantitative biology, quantitative ecology, supply chains, sustainability, sustainable landscaping, The Demon Haunted World, the right to be and act stupid, the right to know, the tragedy of our present civilization, the value of financial assets, tragedy of the horizon | Leave a comment

Hogwarts Hymn

Posted in Harry Potter, J K Rowling | Leave a comment

My most political post yet … yeah, but it’s me, and Bill Maher is, most of the time, what I’m down with.

Sorry, but there are distinctions to be made.

Posted in Bill Maher, objective reality | Leave a comment

International climate negotiations, the performance: `Angry and upset’

Climate Adam, who you should follow:

Posted in adaptation, American Association for the Advancement of Science, Anthropocene, carbon dioxide, Carbon Worshipers, climate change, Glen Peters, global warming, Hyper Anthropocene, Kevin Anderson | Leave a comment

Love your home. The place we call home needs love. But love means nothing without action.

Posted in Ørsted, bridge to somewhere, Buckminster Fuller, climate disruption, decentralized electric power generation, decentralized energy, ecological disruption, electricity, green tech, Green Tech Media, investment in wind and solar energy, local generation, solar democracy, solar domination, solar energy, solar power, Spaceship Earth, sustainability, the energy of the people, the green century, tragedy of the horizon, utility company death spiral, wind energy, wind power, zero carbon | Leave a comment

Series, symmetrized Normalized Compressed Divergences and their logit transforms

(Major update on 11th January 2019. Minor update on 16th January 2019.)

On comparing things

The idea of calculating a distance between series for various purposes has received scholarly attention for quite some time. The most common application is to time series but, particularly with the advent of means to rapidly ascertain nucleotide and ligand series, distances between these are increasingly of inferential interest.

Vitányi’s idea(**)

When considering the divergence between arbitrary sequences, P. M. B. Vitányi has been active since at least 1998, beginning with his work with colleagues Bennett, Gács, Li, and Zurek, per “Information distance”, which appeared in IEEE Transactions on Information Theory that year. Since 1998, he has published papers with these and other colleagues. The paper which concerns the present post is R. Cilibrasi, P. M. B. Vitányi, “Clustering by compression“, which appeared in IEEE Transactions on Information Theory in 2005. That paper and A. R. Cohen, P. M. B. Vitányi, “Normalized Compression Distance of multisets with applications“, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(8), 1602-1614, explain why this way of comparing sequences is attractive. For example, from Cohen and Vitányi,

Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free, similarity metric based on compression … The way in which objects are alike is commonly called similarity. This similarity is expressed on a scale of 0 to 1 where 0 means identical and 1 means completely different … To define the information in a single finite object one uses the Kolmogorov complexity [15] of that object (finiteness is taken as understood in the sequel). Information distance [2] is the information required to transform one in the other, or vice versa, among a pair of objects … Here we are more concerned with normalizing it to obtain the so-called similarity metric and subsequently approximating the Kolmogorov complexity through real-world compressors [19]. This leads to the normalized compression distance (NCD) which is theoretically analyzed and applied to general hierarchical clustering in [4]. The NCD is parameter-free, feature-free, and alignment-free, and has found many applications in pattern recognition, phylogeny, clustering, and classification ….

This is exciting because it offers a way of, if you will, doing really non-parametric statistics: not only do inferential procedures based upon these not care about the statistical distributions which the units of study exhibit, they are also opaque to many features which might sidetrack inference with outlying characteristics. These sometimes arise from simple mistakes in measurement or record. It’s to be expected, I think, that use of such techniques will result in a loss of statistical power in comparison to inferences based upon good parametric models for a given dataset. On the other hand, it’s almost impossible to make specification errors, or to form Likelihood functions improperly. Aspects of models which cause these just are not seen.

Definition and some properties

The basic idea of determining how far apart two sequences \textbf{x} and \textbf{y} are begins by positing a practical compressor, an operator R(\textbf{s}) which takes a sequence \textbf{s} into a compressed version \textbf{s}_{C}. Then define Z(\textbf{s}) = \rho(R(\textbf{s})), where \rho(.) is a length measure applied to \textbf{s}_{C}, perhaps the length of the resulting compression in bits or nats. Then

K_{\text{vc}}(\textbf{x}, \textbf{y}) = \frac{Z(\textbf{x}||\textbf{y}) - \min{(Z(\textbf{x}), Z(\textbf{y}))}}{\max{(Z(\textbf{x}), Z(\textbf{y}))}}.

where \textbf{x}||\textbf{y} denotes the concatenation of the sequence \textbf{x} with the sequence \textbf{y}, is interpreted as the normalized compressed divergence between \textbf{x} and \textbf{y}. If

K_{\text{sym}}(\textbf{x}, \textbf{y}) = \frac{K_{\text{vc}}(\textbf{x}, \textbf{y}) + K_{\text{vc}}(\textbf{y}, \textbf{x})}{2}.

is calculated instead, a pseudo-distance is obtained. It is at least symmetric. In general,

K_{\text{vc}}(\textbf{x}, \textbf{y}) \ne K_{\text{vc}}(\textbf{y}, \textbf{x}).

In other words, K_{\text{vc}}(\textbf{x}, \textbf{y}) is not a metric distance. K_{\text{sym}}(\textbf{x}, \textbf{y}) is not one either, but it is at least symmetric. This is why, in the title, I refer to it as a symmetrized divergence and, to differentiate it from the standard Cilibrasi and Vitányi measure, I call it the SNCD and refer to it as a divergence(*).
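As a toy sketch of these definitions, using R’s memCompress as the practical compressor (the strings are illustrative; the full code at the end of this post also subtracts the compressor’s fixed header overhead, which is ignored here):

Z<- function(s) length(memCompress(s, type="xz"))
Kvc<- function(x, y) (Z(paste0(x, y)) - min(Z(x), Z(y)))/max(Z(x), Z(y))
Ksym<- function(x, y) (Kvc(x, y) + Kvc(y, x))/2

x<- paste(rep("abab", 50), collapse="")
y<- paste(rep("abba", 50), collapse="")
Ksym(x, y)    # small when the two strings share structure, near one when unrelated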

In fact, the terminology can be confusing. Both are bounded, taking values in [0, 1] rather than ranging across the non-negative Reals. Nevertheless, it is possible to cluster objects using either. It’s difficult to do inference with these directly, but defining

\mathcal{L}_{\text{vc}}(\textbf{x}, \textbf{y}) \triangleq \text{logit}(K_{\text{sym}}(\textbf{x}, \textbf{y})).

gets to an index which does extend across the Reals and can be readily used for statistical inference. The logit is a well-known transform for mapping probabilities onto the Real line.

However, the Triangle Inequality is not respected, so the geometry is non-Euclidean.

The point of the post

In addition to introducing Normalized Compressed Divergence to the readership, something which I’ll be using in future posts, I constructed several series which show similarities to one another. The pseudo-distances between related pairs of these were calculated, as were their logits.

Below I show the series, and then I present a table showing the results. Hopefully, this gives some idea of which series are considered similar and which are not.

The cases

ypredict0, ypredict1

These are two similar instances of a curve and dataset taken from Sivia and Skilling. The divergences between these curves, and between intermediate morphings of them, are calculated in the results below.

y.trig0, y.trig1, y.trig2, y.trig3

These are instances of a basic sine series and three modulations of it; a sketch generating them appears after the list.

  • y.trig0 shows 4 waves from a sine function with frequency one.
  • y.trig1 shows 4 waves from a related function, one with an amplitude modulation of 1 + \frac{x}{2\pi}.
  • y.trig2 shows 4 waves from a related function, one shifted in phase by an eighth of a wavelength, that is \sin{(x + \frac{\pi}{4})}.
  • y.trig3 shows 4 waves from a related function, one chirped in frequency, as \sin{(x (1 + 4\epsilon))}, where \epsilon steps in 0 \le \epsilon \le 8 \pi in thousandths.
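The sketch below shows how these four series might be generated; the sampling grid and the stepping of \epsilon for the chirp are my guesses at the construction, not the original code.

# Four waves of each test signal on an assumed grid of 1000 points.
x<- seq(from=0, to=8*pi, length.out=1000)
y.trig0<- sin(x)                                   # base sine, frequency one
y.trig1<- (1 + x/(2*pi))*sin(x)                    # amplitude modulated
y.trig2<- sin(x + pi/4)                            # phase shifted by an eighth wavelength
eps<- seq(from=0, to=1, length.out=length(x))      # assumed stepping for the chirp
y.trig3<- sin(x*(1 + 4*eps))                       # chirped in frequency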

Some practicalities

When calculating a compressed version of a signal, generally speaking, practical compressors demand a string version of the signal. I have chosen to use the xz compressor with the “-9e” option, as provided by the R memCompress function. This means the compressed length \rho(R(\textbf{s})) can never be zero, but, nevertheless, the divergence can be.

Also, numeric signals need to be converted to characters. There are many ways that could be done. I originally used the technique in the TSclust package of R, since that was the only package which overtly claimed to offer an NCD dissimilarity measure. But it turns out that has problems, at least for numerical series. There is also a TSdist package which simply imports the corresponding dissimilarity measures from TSclust.

A problem in TSclust

The TSclust package of R has a dissimilarity calculation. Consulting the source, it’s clear this is not symmetrized, and is patterned literally after Cilibrasi and Vitányi:


###################################################################################
####### Clustering by Compression (2005), Cilibrasi, R., Vitanyi, P.M.B.,  ########
######## Normalized Compression Distance ##########################################
###################################################################################
diss.NCD <- function(x,y, type="min") {
    .ts.sanity.check(x,y)
    comp <- .compression.lengths(x,y, type)  
    (comp$cxy - min(comp$cx,comp$cy)) / max(comp$cx, comp$cy)
}

#common part of compression methods,
#calculate the sizes of the compressed series and of their concatenation
.compression.lengths <- function(x, y, type) {      
    methods <- type
    type = match.arg(type, c("gzip", "bzip2", "xz", "min"))
    if (type == "min") { #choose the best compression method of the three 
        methods <- c("gzip", "bzip2", "xz")
    }
    xy <- as.character(c(x,y))
    x <- as.character(x)
    y <- as.character(y)
    cxym <- sapply( methods, function(m) { length( memCompress(xy, type=m) )})
    cxy <- min(cxym)
    cx <- min(sapply( methods, function(m) { length( memCompress(x, type=m) )}))
    cy <- min(sapply( methods, function(m) { length( memCompress(y, type=m) )}))
    list(cx=cx, cy=cy, cxy=cxy)
}

Apart from the fact that .ts.sanity.check doesn’t work, note that the numeric series are simply converted to character strings in .compression.lengths before being subjected to compression and, subsequently, to calculation of the NCD. This cannot be correct.

Consider what would happen if there were two time series, \mathbf{A} and \mathbf{B}, each of length 100. Suppose \mathbf{A} consists of a sequence of 50 copies of the string “3.1” followed by a sequence of 50 copies of the string “3.2”. Suppose \mathbf{B} consists of a sequence of 50 copies of the string “3.2” followed by a sequence of 50 copies of the string “3.1”. Using the dissVSTR function from below, which operates purely on strings, not on statistical series, the SNCD is 0.7.

Now consider two other series, like these but slightly modified, \mathbf{C} and \mathbf{D}, also each of length 100. But instead, suppose \mathbf{C} consists of a sequence of 50 copies of the string “3.01” followed by a sequence of 50 copies of the string “3.02” and \mathbf{D} consists of a sequence of 50 copies of the string “3.02” followed by a sequence of 50 copies of the string “3.01”. If these were statistical series, the values would be closer to one another than before. But since they are strings, and have an additional numeral zero, dissVSTR actually shows their SNCD is larger, about 0.73, implying they are farther apart.
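These four cases are easy to reconstruct and check against dissVSTR, which is listed at the end of this post; the collapsed-string construction below is mine.

# Rebuild the A, B, C, D examples above as single strings and compare them.
A<- paste(c(rep("3.1", 50),  rep("3.2", 50)),  collapse="")
B<- paste(c(rep("3.2", 50),  rep("3.1", 50)),  collapse="")
C<- paste(c(rep("3.01", 50), rep("3.02", 50)), collapse="")
D<- paste(c(rep("3.02", 50), rep("3.01", 50)), collapse="")
dissVSTR(c(A=A, B=B, C=C, D=D))   # pairwise SNCDs; the A-B and C-D entries are those discussed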

I first tried to fix this in an earlier version of this post by precalculating a combined set of bins for the pooled values in both series, based upon the standard binning logic in the sm package, and then quantiles using the hdquantile function of the Hmisc package. I then assigned a character to each of the resulting bins and used the R cut function to determine to which bin each value of the individual series belonged and coded that. This was a bit better, but I still think it was wrong: bins corresponding to bigger values had no more mass than smaller bins and, so, being a bin apart at the large-value end was ranked, in information distance, exactly the same as being a bin apart at the low end.

Remedy

The remedy I’ve chosen is to code differently depending upon whether the SNCD of the pair of objects is calculated on strings (or files) or on numerical series. For strings, there’s no problem going right ahead and calculating the compressed versions of the strings. But for numerical statistical series, as in the last suggestion above, quantiles of the pooled values from the two series (without duplicates removed) are calculated using the hdquantile function from Hmisc. The number of quantile points is one more than the rounded version of 1 + \log_{2}{(n)}, where n is the length of the longer of the two series, in case they are of different lengths. So


hdquantile(x=c(x,y), 
           probs=seq(0, 1, length.out=1 + round(log(max(n.x, n.y))/log(2) + 1)),
           names=FALSE)

calculates the pooled quantiles.

Next, assign a unique string character to each of the resulting bins. But instead of just using that character by itself, replicate it into a string with the same number of repetitions as the bin number. Thus, if there are m bins, the bin containing the smallest values has its character just of length unity, but the last bin, bin m, has its corresponding character replicated m times.

Finally, run over the two series and code them by emitting the repeated bin labels corresponding to the bins in which their values fall. The result is what is submitted for compression comparison.

There is an additional thing done to accommodate NA values in the series, but the reader can check the code below for that.

The results

There are 6 cases which serve as end members in various capacities, as shown above. The divergences between ypredict0 and ypredict1 are shown, as are divergences between y.trig0 and y.trig1, y.trig0 and y.trig2, y.trig0 and y.trig3, y.trig1 and y.trig2, y.trig1 and y.trig3, and finally y.trig2 and y.trig3.

Also shown are intermediate morphings between end members of these pairs. If \mathbf{y}_{1} is one end member and \mathbf{y}_{2} is a second end member, then

\mathbf{y}_{\epsilon} = (1 - \epsilon) \mathbf{y}_{1} + \epsilon \mathbf{y}_{2}.

is the \epsilon intermediate morphing between the two.
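In R, this morphing is just a convex combination of the two end members:

# Convex combination of two end-member series, with 0 <= eps <= 1.
morph<- function(y1, y2, eps) (1 - eps)*y1 + eps*y2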

Using the divergence between ypredict0 and ypredict1 as a baseline, the modulation of the sine case, y.trig0, into its phase-shifted counterpart, y.trig2, is seen as the least change. The amplitude-modulated version of y.trig0, called y.trig1, has a substantial divergence, but not as much as the chirped version, y.trig3. The intermediate versions of these behave predictably. It is a little surprising that once ypredict0 is transformed 0.7 of the way to ypredict1, the divergence doesn’t worsen. Also, in the case of the sinusoids, divergences as the curves approach the other end point do not change monotonically. That isn’t surprising, really, because there’s a lot going on with those sinusoids.

The code producing the results

Intermediate datasets and R code for the above results are available from a directory in my Google drive replacement for Git.

New versions of the codes have been uploaded to the Google drive. The old versions are still there.

The key code for producing the SNCDs from numerical statistical series is:


library(Matrix) # Data structure for large divergence matrices
library(random) # Source of the random function
library(Hmisc)  # Source of the hdquantile function
library(gtools) # Source of the logit function

numericToStringForCompression<- function(x, y)
{
  n.x<- length(x)
  n.y<- length(y)
# (This is the default number of bins for binning from the sm package, but there are
#  two vectors here, and they need a common number of bins.)
  nb<-  max(round(log(n.x)/log(2)+1), round(log(n.y)/log(2)+1))
  Q<- hdquantile(c(x,y), probs=seq(0,1,length.out=1+nb), names=FALSE)
  alphaBet<- c(letters, LETTERS, sapply(X=0:9, FUN=function(n) sprintf("-%.0f", n)))
  m<- length(Q) - 1
  stopifnot( (1 < m) && (m <= length(alphaBet)) )
  codes<- c("!", mapply(A=rev(alphaBet[1:m]), K=(1:m), 
                 FUN=function(A,K) Reduce(f=function(a,b) paste0(a,b,collapse=NULL), x=rep(A, (1+K)), init="", right=FALSE, accumulate=FALSE)))
  cx<- 1+unclass(cut(x, Q, labels=FALSE))
  cx[which(is.na(cx))]<- 1
  cy<- 1+unclass(cut(y, Q, labels=FALSE))
  cy[which(is.na(cy))]<- 1
  chx<- codes[cx]
  chy<- codes[cy]
  return(list(x=chx, y=chy))
}

compression.lengths<- function(xGiven, yGiven, type="xz")
{
  if (is.numeric(xGiven))
  {
    coding<- numericToStringForCompression(x=xGiven, y=yGiven)
    x<- coding$x
    y<- coding$y
  } else
  {
    stopifnot( is.character(xGiven) )
    stopifnot( is.character(yGiven) )
    x<- xGiven
    y<- yGiven
  }
  #
  xx<- c(x,x)
  yy<- c(y,y)
  xy<- c(x,y)
  yx<- c(y,x)
  stopifnot( is.character(xx) )
  stopifnot( is.character(yy) )
  stopifnot( is.character(xy) )
  stopifnot( is.character(yx) )
  zero<- length(memCompress("", type=type))
  cx<- length(memCompress(x, type=type)) - zero
  cy<- length(memCompress(y, type=type)) - zero
  cxx<- length(memCompress(xx, type=type)) - zero
  cyy<- length(memCompress(yy, type=type)) - zero
  cxy<- length(memCompress(xy, type=type)) - zero
  cyx<- length(memCompress(yx, type=type)) - zero
  return(list(cx=cx, cy=cy, cxx=cxx, cyy=cyy, cxy=cxy, cyx=cyx, csymmetric=(cxy+cyx)/2))
}


divc.NCD <- function(xGiven, yGiven, trans=function(x) x) 
{
  typCompr<- "xz"
  if (is.numeric(xGiven))
  {
    coding<- numericToStringForCompression(x=xGiven, y=yGiven)
    x<- coding$x
    y<- coding$y
  } else
  {
    stopifnot( is.character(xGiven) )
    stopifnot( is.character(yGiven) )
    x<- xGiven
    y<- yGiven
  }
  #
  xy<- c(x,y)
  yx<- c(y,x)
  zero<- length(memCompress("", type=typCompr))
  cx<- length(memCompress(x, type=typCompr)) - zero
  cy<- length(memCompress(y, type=typCompr)) - zero
  cxy<- length(memCompress(xy, type=typCompr)) - zero
  cyx<- length(memCompress(yx, type=typCompr)) - zero
  #
  # Symmetrized NCD of the above.
  mnxy<- min(cx, cy)
  mxxy<- max(cx, cy)
  ncd<- max(0, min(1, ( (cxy - mnxy) + (cyx - mnxy) ) / (2*mxxy) ) )
  #
  return(trans(ncd))
}
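
In symbols, writing C(s) for the compressed length of string s net of the compressor’s empty-string overhead (the zero term above), the quantity divc.NCD returns, before the optional trans transformation, is the symmetrized NCD

$$ \mathrm{NCD}_{\mathrm{sym}}(x,y) \;=\; \max\!\left(0,\; \min\!\left(1,\; \frac{\bigl(C(xy) - \min(C(x),C(y))\bigr) + \bigl(C(yx) - \min(C(x),C(y))\bigr)}{2\,\max(C(x),C(y))}\right)\right). $$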

divs<- function(SERIES, period=25)
{
  stopifnot( is.data.frame(SERIES) ) 
  N<- ncol(SERIES)
  divergences<- Matrix(0, N, N, dimnames=list(NULL, NULL))
  # Since logits are so common in inference, calculate those, too.
  logit.divergences<- Matrix(-Inf, N, N, dimnames=list(NULL, NULL))
  N1<- N-1
  for (i in (1:N1))
  {
    for (j in ((1+i):N))
    {
      d<- divc.NCD(xGiven=SERIES[,i], yGiven=SERIES[,j], trans=function(x) x)
      divergences[i,j]<- d
      divergences[j,i]<- d
      ld<- logit(d)
      logit.divergences[i,j]<- ld
      logit.divergences[j,i]<- ld
    }
    if ((0 < period) && (0 == (i%%period)))
    {
      cat(sprintf("... did %.0f\n", i))
    }
  }
  stopifnot( !is.null(colnames(SERIES)) )
  colnames(divergences)<- colnames(SERIES)
  rownames(divergences)<- colnames(SERIES)
  colnames(logit.divergences)<- colnames(SERIES)
  rownames(logit.divergences)<- colnames(SERIES)
  #
  # Return Matrix objects, leaving conversion to a matrix, a distance matrix, or a data
  # frame to the consumer of the output. Can't anticipate that here.
  return(list(divergences=divergences, logit.divergences=logit.divergences))
}

dissVSTR<- function(VSTR, period=25, logitp=FALSE)
{
  stopifnot( is.vector(VSTR) ) 
  N<- length(VSTR)
  zero<- length(memCompress(""))
  ncdf<- function(cx, cy, cxy, cyx) { mnxy<- min(cx,cy) ; mxxy<- max(cx,cy) ; return( max(0, min(1, (cxy + cyx - 2*mnxy)/(2*mxxy) ))) }
  #
  # Precondition by computing the compressed length of each string once.
  CV<- sapply(X=VSTR, FUN=function(s) length(memCompress(s)) - zero)
  if ((200 < N) & (0 < period))
  {
    cat(sprintf("Preconditioning of %.0f items completed.\n", N))
  }
  #
  if (logitp)
  {
    dInitial<- -Inf
    trans<- logit
  } else
  {
    dInitial<- 0
    trans<- function(x) x
  }
  #
  divergences<- Matrix(dInitial, N, N, dimnames=list(NULL, NULL))
  #
  N1<- N-1
  for (i in (1:N1))
  {
    sx<- VSTR[i]
    cx<- CV[i]
    for (j in ((1+i):N))
    {
      sy<- VSTR[j]
      cy<- CV[j]
      sxy<- sprintf("%s%s", sx, sy)
      syx<- sprintf("%s%s", sy, sx)
      cxy<- length(memCompress(sxy)) - zero
      cyx<- length(memCompress(syx)) - zero
      d<- trans(ncdf(cx, cy, cxy, cyx))
      if (is.nan(d))
      {
        cat("NANs within VSTR. Inspection:\n")
        browser()
      }
      divergences[i,j]<- d
      divergences[j,i]<- d
    }
    if ((0 < period) && (200 < N) && (0 == (i%%period)))
    {
      cat(sprintf("... did %.0f\n", i))
    }
  }
  colnames(divergences)<- names(VSTR)
  rownames(divergences)<- names(VSTR)
  # Return a Matrix object, leaving conversion to a matrix, a distance matrix, or a data
  # frame to the consumer of the output. Can't anticipate that here.
  return(divergences)
}
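
For orientation, here is a minimal, hypothetical sketch of how the two drivers above might be invoked. SERIES stands for a data frame of numeric series with column names set, and STRINGS for a named character vector; neither is defined in this post, and the clustering step is just one possible use of the output.

# Assumes the libraries and functions above have already been loaded.
out<- divs(SERIES)                        # pairwise SNCDs and their logits
D<- as.dist(as.matrix(out$divergences))   # coerce the Matrix object to a distance object
plot(hclust(D, method="complete"))        # e.g., hierarchical clustering on the divergences
dStr<- dissVSTR(STRINGS)                  # same idea, applied directly to character strings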

You are welcome to use this, but please acknowledge its source:

Jan Galkowski from 667-per-cm.net.

Thanks.

(*) L. Pardo, Statistical Inference Based on Divergence Measures, Chapman & Hall/CRC, 2006.

(**) Others have done important work in this area, including I. J. Taneja (2013) in “Generalized symmetric divergence measures and the probability of error“, Communications in Statistics – Theory and Methods, 42(9), 1654-1672, and J.-F. Coeurjolly, R. Drouilhet, and J.-F. Robineau (2007) in “Normalized information-based divergences“, Problems of Information Transmission, 43(3), 167-189.

Posted in Akaike Information Criterion, bridge to somewhere, computation, content-free inference, data science, descriptive statistics, divergence measures, engineering, George Sughihara, information theoretic statistics, likelihood-free, machine learning, mathematics, model comparison, model-free forecasting, multivariate statistics, non-mechanistic modeling, non-parametric statistics, numerical algorithms, statistics, theoretical physics, thermodynamics, time series | 4 Comments

Winter composting: How to make friends with microbes and defy weather (podcast, too)

(Slightly updated 2019-04-08, although the podcast has not been updated to be consistent.)
(This blog post is accompanied by an explanatory podcast. See below.)

Many people compost. It can be easy or hard, depending upon your tolerance for turning and work, and for the Wild Thing who wants a free meal.

I have known dogs to get into bins, and raccoons supposedly get into them as well. There’s plenty of photo evidence online of raccoon footprints near compost piles and bins, and I have seen muddy raccoon footprints on the outside of ours, but I haven’t found a single online photo of a raccoon in a compost bin. Raccoons might be hardy, but my wife, Claire, once had a granddog that got into some partly cooked compost and soon developed seizures, because the partly digested compost had neurotoxins from microbial activity. That required expensive hospitalization.

So it’s a good idea to keep your compost bins protected from the stray mammal.

That’s a shot of our twin compost bins. The active one is above. The other is “cooking down” over the winter, and should be ready for garden and yard use by late Spring. We have a New England-style Blue Bin which collects rainwater from our roof and gutter system. We use this almost exclusively to add water to the compost in late Spring, Summer, and Autumn, and wash out pails and implements.

I took out some compost today and, with the handle of the small pitchfork I use, an essential tool for composting, I was able to knock ice away from the interior of the Blue Bin spigot and get free running water:

It is 1st January, after all, in greater Boston, Massachusetts.

This brings me to what I want to primarily write and talk about: How to do winter composting. Happy to share.

First, let’s look at our composting setup:

When Claire and I remodelled our kitchen, we had a stainless steel compost bin installed flush with the quartz rock counter. Uncapped, and accompanied by a pail of compost from First Parish Needham, Unitarian Universalist, which we collect from coffee hour there and compost, sharing the chore with our friend, Susan McGarvey, it looks like:

Now, there are several sources on composting which claim meats and cheese are unsuitable. I can imagine that, in the case of open compost piles, or in extreme quantities, it might be a good idea to keep these out of your compost. But Claire and I are vegetarians, for the most part, we like our cheese, and we have three cats. That means we have scraps from cat food.

That’s Darla, by the way.

For the most part, we never trash our bio-organic waste. The exception used to be the litter from the cat litter boxes. As of March 2019 we compost that material too, which is basically shredded walnut shells and what the cats contribute. This is done in a self-standing pile of leaves, yard, and garden scraps in our back yard.

During the Summer and warm months, particularly in drought, keeping the compost piles moist is a key goal. When they start developing flies and lots of insects instead of predominantly worms, that means they are too dry. The typical procedure in non-Winter months is:

  1. Keep the moisture in the compost bucket inside to a minimum.
  2. Stir the existing compost well.
  3. Add in the new material, stirring lightly.
  4. Add water. For us, that’s a full compost pail of water.
  5. Stir the compost briskly, mixing in the water and the new material throughout the pile.

In addition, except on really hot days, you should still see worms and, often, steam rising from the pile when you first stir it.

Winter is quite different.

No additional water gets added to the compost. However, the pail is rather wet, as I’ll dump excess coffee and soy milk into it, so the compost can, at times, be floating in ambient organic stuff. (I do the dishes, by the way, so I can control this.)

So here are the steps I follow. I explain why later and in the podcast, which is inserted below.

Get organized.

You saw the two compost bins above. The active one always has this extra-heavy plastic sheet on it, weighed down by rocks. This is a critter deterrent, and it has worked for years. The bins themselves are heavy plastic which resists clawing and chewing. They are actually two balls which could, in principle, be rolled. But I discovered that that means they can’t be loaded more than halfway, because they become too heavy. So they sit next to one another, in place, and I stir them instead.

So Winter composting is really quite different from Summer composting. In fact, we time things to swap bins when the leaves come down off the trees. The key point is to keep the compost pile from freezing solid. The only reasonable heat source is the exothermic activity of the microbes cracking and consuming the foodstuffs and organics themselves. In fact, everything about Winter composting is designed to maximize that.

The compost pile begins as an empty container which is filled with whole leaves. These are compacted a couple of times and topped up. Then an indentation is made in the center of the leaves with the pitchfork, carving out a hole in the middle which will hold the compost. It doesn’t extend to the ground. At least initially, the compost should float on top of the leaves, supported by them.

Initially, new compost is added into the hole, along with roughly an equal amount of compost from an old, existing pile, about 50-50, to inoculate the new pile with microbes. Leaves are spread atop, and the bin is closed up. And you’re on your way.

From then on, it is critical that additional new material be added carefully and gently.

Leaves are gently removed from the top of the compost and pushed to the side. If it is really cold out, you should endeavor to get the task done quickly.

Then, with the pitchfork, plunge into the bolus of compost, and twist it, trying to disturb the compost side to side as little as possible. The idea is you don’t want to break up the clumps of microbes working on that bolus. You want to aerate it, sure. Make sure you reach all parts of the pile, but concentrate on the center. Don’t attempt to stir it aggressively as you would in the Summer. That’ll make the effort a failure, because it’ll freeze.

In fact, when done correctly, the temperature of the compost in the leaves will exceed 60°C. In all likelihood, this will kill off the worms. For Winter composting, that’s fine. We’re interested in the microbes. There’s no danger of fire, because that demands temperatures well above boiling.

Next, add the compost. Here there are two sources. One comes from coffee hour at First Parish, and consists primarily of relatively dry coffee grounds and paper. Add that in first.

Obviously, the plastic bag containing the coffee grounds is left out. As a matter of fact, don’t even attempt to put supposedly compostable plastic bags in a home composting setup, especially in Winter time. These generally don’t break down except in really high temperature industrial composters. And, even then, I wonder.

Next I added in our sloppily wet home compost. The idea is to saturate the relatively dry material with the wet. Do not add additional water. When it gets cold, it’ll freeze into a block of uselessness which will stink like crazy during the Spring thaw.

A critically important step: Take the pitchfork, but do not stir, not even as gently as above. Instead, simply push the tines down through the new material two to three dozen times. The idea is to drag old composting material and microbes up through the new material, so as to get it well underway before it freezes. The microbes will give off heat and keep that from happening.

So here’s what it looks like. I’ll sometimes fish a bit of old compost out with the pitchfork and spread it atop the new for good measure.

The leaves are scraped back over the compost bolus, and I check to make sure it remains insulated in all directions.

The next step is to close it up, and clean up.

However, today, even though there are no worms in the pile, we have a friend visiting: a slug which, apparently having arrived with the oak leaves, has decided that the compost pile is a nice place to hang out during the cold Winter.

I didn’t disturb it and placed it back into the leaves.

I put the two lids back atop, replace the rocks, and then clean up.

That’s rainwater from our barrel. On days when the barrel is frozen up, I need to plan ahead and take out a pail of preferably warm-to-hot water to help with the cleanup. By the way, here’s a tip on the barrel, intended to prevent something we encountered last winter. We failed to drain the barrel enough ahead of the deep cold, possibly because of excess rains. And when it froze, it expanded, and rotated by itself on its vertical axis, making it difficult to direct rainwater and such into it.

Our solution was to deliberately drain it deeply after the new compost bin setup, even if this meant we were wasting rainwater.

What to do with the washed-out compost bin’s water, whatever its source?

Claire has a row of chipped, composting leaves off to one side of the back yard. I pour the excess water, with its slurry of food, on top of these. In addition to filtering, this helps the microbes in these side composters, yet isn’t enough to attract any animal who really cares.

I’ve seen wintering and migrating Robins, though, poking around these piles. Perhaps they have some worms and other critters, like the slug shown above.

That’s it.


A technical summary: The biochemistry and microbiology of home composting is still largely unexplored. I have been able to find one comprehensive paper:

E. Ermolaev, C. Sundberg, M. Pell, H. Jönsson, “Greenhouse gas emissions from home composting in practice“, Bioresource Technology, 2014, 151, 174-182.

While this paper is titled to suggest its interest is principally greenhouse gas emissions, the authors actually did a serious dive into what was going on. There are several other papers which emphasize the greenhouse gas emissions of composting.

There is also this 2003 paper by Kathryn Parent and the American Chemical Society:

K. E. Parent, “Chemistry and Compost“, 2003, American Chemical Society.

And a recent paper was indicated:

M.A. Vázquez, M. Soto, “The efficiency of home composting programmes and compost quality“, Waste Management, June 2017, 64, 39-50.

Posted in agroecology, argoecology, Botany, Carbon Cycle, composting, ecological services, Ecological Society of America, ecology, engineering, environment, fermentation, First Parish Needham, karma, local self reliance, Nature, science, solid waste management, sustainability, sustainable landscaping, Unitarian Universalism, UU, UU Humanists, UU Needham, water as a resource | Leave a comment

Gov Jerry Brown on Meet the Press, a parting comment on 2018 at Bill Gates’ Notes, and the best climate blog post of 2018

Segment One

Outgoing Governor Jerry Brown of California on NBC’s Meet the Press this morning:

I’ll miss him there, but I don’t think Gov Jerry is going anywhere soon.

Segment Two

Bill Gates Notes offered an end of year summary remark to which I posted a comment today, 30th December 2018 at 12:33 EST (no direct link available, sorry), reproduced below:

Thanks, Bill, for your year end insights, documenting where we are, and your continued leadership.

As someone who grew up with computers (FORTRAN in 6th grade on an IBM 1620), and was often dreaming of a technological future, I must say that the only part of that dream which came true was computing and the Internet. It’s a great part, don’t get me wrong, but I wish we had more in the direction of sustainable economies and living. That said, and at age 66, I remain part of the computing industry, and I continue to be excited by the phenomenon which Marc Andreessen described in 2011, that “Software is eating the world”. Everywhere anyone turns, traditional devices which used to use mechanical connections and actuators are being displaced by general purpose computers, often embedded, and by things like electric motors controlled by pulse-modulated signals. These are cheaper, lighter, less power hungry, and offer finer, smarter controls. This extends to analog applications of all kinds, from control boards for music systems and video, to automobile controls. I await the day when they make their long anticipated debut as part of civil engineering projects.

On nuclear, I recently studied the field, and believe that the long-lamented negative learning curve it exhibits is due solely to the failure of that industry to create modestly sized modular units which can be produced like commodities. Instead, nuclear power has been a cost-plus business, and they build bigger and more elaborate all the time, which means these inevitably overshoot schedules and cost targets. We need something like 1 MW reactors which can be lashed together to obtain both arbitrary sizing and greater reliability. (If I lose one server in a farm of ten thousand, like, who cares?) It would be good if they were portable. It would be especially good if they had design safeguards so the materials could not be diverted for nasty purposes, especially dirty bombs. I believe that’s possible, but I also believe that this will require a triumph of imagination, and I can’t see existing players — any more than IBM in its day — coming to that on their own. I wish your efforts hope and purpose in this direction. No doubt nuclear power was incentivized in the direction they pursued, but the path may have also depended upon contingencies which no one really chose.

By the way, “Software is eating the world” is the corporate motto of the venture capital firm Andreessen Horowitz.

Segment Three

And my vote for the best single climate-related blog post of the year is Eli Rabett‘s Heat has no hair. It begins:

Among physicists and chemists, well at least the theoretical side of the latter it is well known that electrons have no hair by which is meant that a bunny can’t tell one electron from another. This has serious consequences in quantum mechanics because in a multi-electron system you have to allow for each electron to be anywhere any electron is and it gets quite complicated. True, when an atom is ionized you can trace the electron as it is expelled from the atom, but you can’t say WHICH electron it was. Same for electron capture. You could identify an atom before it is captured, but once it was captured you can not identify it from any of the others in the atomic system.

The same thing is true of heat. Heat in an object, perhaps better thermal energy, is random motion of atoms and molecules, translation, vibration, whatever. You can say where heat entering an object came from (say radiation from the sun), but if there is more than one source (trivial case). once it is randomized and in the object you can’t say where it came from.

Which brings Eli to the evergreen claim of those who deny the greenhouse effect, that radiation is not important compared to convection.

Read more at the original link. As I wrote in a related comment:

All the best for your continued explanations and wish you happiness, health, and continued good spirits. Your writing is a joy.


Happy New Year, everyone. Let’s hope the Angry Beast continues to be kind, and that we learn some respect. To understand how far we have yet to go, and how long we have known, it is worth taking a look at a publication from 2003, an issue of Wild Earth called Facing the Serpent. Although they did not mean The Angry Beast, that’s where we are. As Dr Kate Marvel remarked this year, it will take courage, not hope.

Posted in American Association for the Advancement of Science, American Chemical Society, American Meteorological Association, an ignorant American public, Anthropocene, anti-science, astronomy, atmosphere, attribution, being carbon dioxide, Berkeley Earth Surface Temperature project, Bill Gates, Blackbody radiation, bridge to somewhere, California, carbon dioxide, cement production, climate, climate change, climate zombies, development as anti-ecology, ecological services, economics, Eli Rabett, energy flux, environment, evidence, friends and colleagues, global warming, Grant Foster, greenhouse gases, Hyper Anthropocene, investment in wind and solar energy, Jerry Brown, Lawrence Berkeley National Laboratory, leaving fossil fuels in the ground, meteorology, nuclear power, oceanography, oceans, Principles of Planetary Climate, quantum mechanics, science, sea level rise, solar democracy, solar energy, solar power, sustainability, the energy of the people, the green century, the tragedy of our present civilization, tragedy of the horizon, University of California, University of California Berkeley, water as a resource, wind energy, wind power, wishful environmentalism, zero carbon | Leave a comment

Mark Carney is aligned with the geo-biological-physical everything

Bank of England Governor Mark Carney might not be popular for all his pronouncements, but he is the most comprehensively educated on the matter of climate risk of anyone in the international discussion groups of the OECD.

Some people will be a climate [change] denier … or take a view that the speed with which domestic policy will change will lag international agreements. People can be on the other side of the spectrum as well. That’s called a market, but the market needs information.

He is, of course, completely right about how markets work.

This has actually moved onwards.

This is definitely worth a look, even if you need to pay for it.

Posted in Anthropocene, capitalism, climate change, economic trade, economics, global warming | Leave a comment