Professor Kevin Anderson: “Climate’s holy trinity”

24th January 2019, Oxford, England, UK

Appalling failure:

Who is responsible:

Yeah, it’s us.

Posted in climate, climate change, climate disruption, climate grief, global blinding, global warming, Kevin Anderson | Leave a comment

On bag bans and sampling plans

Plastic bag bans are all the rage. It’s not the purpose of this post to take a position on the matter. Before you do, however, I’d recommend checking out this:

and especially this:

and the Woods Hole Oceanographic Institution has many articles about plastics in the oceans.

Good modern governance means making evidence-based decisions. So, if a bag ban of any kind, or a bag tax of any kind, is going to be imposed, it makes sense to assess how much and what kinds of bag use are prevalent before the ban or tax, and how this changes after the ban or tax. This kind of thing used to need to be done by professional surveyors and statisticians. But with the availability of online datasets, access to the experience of others, widely available and open-source computing, and new survey technology and methods, expensive professional options aren’t the only way this can be done. Professional surveyors tend to argue otherwise. But, facts are, you can learn a lot by using Google Earth and Google Maps these days.

Surveys are designed around answering specific questions. If the objective is to estimate how many bags of one kind or another are being consumed per week in a town or county, that’s one question. If the objective is to estimate how many people regularly choose paper over plastic, or bring-their-own-bags, that’s an entirely different question. The governance and the group need to choose what’s important to them.

Surveys are also designed around the skillsets of the people involved in conducting them. With a volunteer organization, it is important that the procedure be something they can readily be trained in, and I say “trained” because no survey can do without training, however simple.

Surveys also ought to be easy on the surveyors, especially if they are volunteers. The requirements for when they need to be on station oughtn’t be so onerous that they might not arrive on time, or not show up at all, and, worse, misrepresent to the group what happened. So, for instance, even if there are shoppers using bags in a store at 6:00 a.m., that hour probably is not going to get covered well if a sampling plan were to require it.

Surveys also ought to be easy to explain to those who want to know how they were done. Along with this, it is critically important that, as part of an analysis of the primary quantities of interest, like plastic bags used per week, the survey’s contribution to overall uncertainty be quantified.
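
To make that uncertainty point concrete, here is a minimal R sketch of how the survey’s contribution might be quantified for a weekly total extrapolated from a handful of surveyed hours; every number in it is invented for illustration:

# Minimal sketch: quantify the sampling contribution to uncertainty in an
# estimate of plastic bags used per week at one store. Numbers are invented.
# For simplicity the count is treated as Poisson and traffic weighting is ignored.
sampled.hours <- 12      # store-hours actually surveyed during the week
weekly.hours  <- 91      # store-hours being extrapolated to (9 a.m.-10 p.m., 7 days)
bags.counted  <- 640     # plastic bags observed during the sampled hours

scale.up <- weekly.hours/sampled.hours
ci <- poisson.test(bags.counted)$conf.int          # interval on the sampled count
round(c(estimate=bags.counted, lower=ci[1], upper=ci[2])*scale.up)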

All that noted, there is a lot interested and committed citizens can do to gather data like this, and interpret results. This can be important, whether or not a town or county governance considers its findings as inputs. It can serve as a check on their results. It can also serve as a check on their budget, meaning: why did they pay for some expensive professional organization to do something when something good enough for the purpose could have been had much cheaper? That said, any old surveying or sampling technique which merely appears to be good enough isn’t good enough. That is, there is some training and learning involved.

Returning to the bag ban matter, use of bags is key to the project. As with any policy, if a regulation is imposed and there is no evidence it helps, or if it has untoward consequences, it ought to be revoked. To do that means measuring a baseline, and then measuring after the regulation is in place. It probably is a good idea to measure a couple of times after the regulation is in place. In statistics and engineering, this general approach is called A/B testing, which is explained better here. As mentioned above, how one gets counts — they are nearly always counts — and then analyzes them depends very much on the question being asked.
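
And as a hedged sketch of what the before-and-after comparison itself might look like once counts of shoppers and their choices are in hand, base R’s prop.test suffices; again, the counts below are invented, not Westwood data:

# Illustrative A/B comparison: share of observed shoppers choosing plastic,
# before versus after a ban or tax. All counts here are made up.
before.plastic <- 212; before.shoppers <- 405
after.plastic  <-  58; after.shoppers  <- 388

ab <- prop.test(x=c(before.plastic, after.plastic),
                n=c(before.shoppers, after.shoppers))
ab$estimate   # the two observed proportions
ab$conf.int   # interval for their difference; excluding zero suggests a real change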

But, in the case of bags, there’s the really important question of where and when to sample. In this case, I’m setting aside bags given out in stores other than grocery stores. And, in this case, I’m using the example of my home town, Westwood, Massachusetts.

Counting noses or bags assumes there’s a sampling frame in hand. In Westwood’s case, the concern is the population of residents or visitors frequenting local grocery stores. And in Westwood’s case, there’s a desire to count people and their preferences for bags, whether plastic, paper, their own bags, or some mix of these, as well as the size of the order. So this means counting people.

There are three grocery stores in Westwood: Roche Brothers in Islington, Wegman’s at University Station, and Lambert’s. There are convenience and other stores which sell small amounts of groceries, but these were assumed to show behaviors which would be exhibited by the populations of the three majors. But, still, surveying these stores demands either deep cooperation on the part of their owners, an outrageous commitment on the part of volunteers, or a sampling scheme that is constructed with knowledge of who goes where when. Where to find such a thing?

Google.

Most substantial grocery store entries on Google Maps now present a bar chart of when they are most frequently visited. It looks like this:

Now, I’ve discovered that the dashed line is a fixture marking a certain number of visits per hour. It is constant for a given store across days of the week. And it is at least roughly consistent across stores in an area. This is great. This was helped, in part, by a visit to one of the stores by a volunteer to take data for a half hour. She was collecting data for me, and was also trying out a data collection form and seeing how difficult or easy it was to get the kind of data that was pertinent. Doing this is an excellent idea.

But, wait, you say: These aren’t numbers. It’s a bar chart.

Digitizing. I learned this when I took courses in Geology. An amazing amount of data is recovered by digitizing figures in scientific journals. Why not Google?

There are several digitizing applications out there. I’ve tried a couple and, so far, I like WebPlotDigitizer best. So, I did. Digitize, that is. How?

Here I’ve marked, by hand, two points on the bar chart, attempting to ascertain the height of the dashed line in pixels from the baseline. Note the original images aren’t produced to the same resolution or size, so it’s important to calibrate each one. In the upper right you can see WebPlotDigitizer‘s close-up of the place where the cursor is. That’s a little hard to see, since there’s so much real estate there, but here’s a close-up of the bottom:

And here’s a close-up of the upper right, a close-up of a close-up:

The completed digitization looks like this:

and results in a .csv file which looks partly like:

There look to be extra points in the digitization, which I’ll explain. It is important to note that the code I reference later, which is available to the public, demands digitization be done in this style. That code has no other documentation. I don’t give a recipe. That said, it’s not difficult to figure out.

The first point I take is the baseline, not in any of the bars of the bar plot. The next point I take is on the dashed upper score. I then do two bars, taking the baseline of the bar, and the upper horizontal of the bar. The rest of the bars have only their upper horizontal marked. The point is to get a good estimate of the baseline, obtained as an average of three baseline observations: the initial one outside the bars, and then one from each of the two bars. There is one observation on the upper score, and then there are observations of the upper horizontals for the rest of the bars.

The heights of the bars can be estimated from the differences between the readings for their upper horizontals and the estimate of the baseline, taken as the mean of the three baseline observations. These differences can then be divided by the distance between the baseline estimate and the upper score, expressing each bar height as a proportion of the range from baseline to upper score.

Note that because these are pixel coordinates, the ordinate values of the observations higher up on the bar plot are lower in coordinates than, say, the baselines. This is because distances on the ordinate are measured (ultimately) as pixels from the top of the image. Accordingly, some distance calculations need to have their signs reversed.
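
To make that arithmetic concrete, here is a minimal R sketch of the reduction, assuming digitized points in the order just described; the pixel ordinates below are invented, not taken from the actual WebPlotDigitizer .csv:

# Illustrative reduction of digitized points to bar heights, as a fraction of
# the distance from the baseline to the dashed upper score. The y values are
# made-up pixel ordinates, not from the real digitization.
# Recall pixel y increases downward, so heights are (baseline - top), not the reverse.
digitized.y <- c(410, 52,        # 1: free baseline point, 2: dashed upper score
                 409, 300,       # 3, 4: first bar's baseline and upper horizontal
                 411, 240,       # 5, 6: second bar's baseline and upper horizontal
                 180, 130, 90)   # 7 on: upper horizontals of the remaining bars

baseline   <- mean(digitized.y[c(1, 3, 5)])    # average of the three baseline readings
upper      <- digitized.y[2]                   # the dashed score line
range.px   <- baseline - upper                 # positive, since pixel y runs downward

bar.tops   <- digitized.y[c(4, 6, 7:9)]        # upper horizontals of every bar
proportion <- (baseline - bar.tops)/range.px   # each bar as a fraction of the score line
round(proportion, 3)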

The accompanying R code reads in these .csv files and then extracts the heights for each of the hours in a day. There is a system for the Westwood case, which you can understand by reading the code, where each of the separate stores’ files is assimilated into a single corresponding matrix of scores versus hour of day and day of week.

In the end what’s in hand is a matrix of values proportional to numbers of visits to the stores. Calibration and actual counts have indicated that a value of unity corresponds to about 140 visitors.

Now that traffic to stores is available, or at least something proportional to traffic, it is a matter of constructing a sampling plan. A plan which is proportional to traffic makes the most sense. This is equivalent to sampling time intervals where the probability of electing an interval is proportional to the estimated traffic in the interval. For this study, the surveyors expressed a desire not to be surveying more than 60-90 minutes at a time. I settled on 60 minutes. So the question became one of finding a set of samples of individual hours for a store, weighted by the probability of traffic.

The melt.array function of the R reshape2 package was handy here, and I was able to use the sampling-without-replacement of the R built-in sample to achieve the appropriate election. The volunteers had a strict constraint on the total number of times they wanted to visit stores. They also did not want to survey before 9:00 a.m. or after 10:00 p.m. The code in the R file generateSamplingPlan.R produces several options, based upon the setting of the N.stage1 variable.
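
A hedged sketch of that election step is below; the traffic array, its dimension names, and the visit budget are invented for illustration, and the actual plan generation lives in generateSamplingPlan.R in the repository:

library(reshape2)  # for melt, including melt.array

# Illustrative traffic array: values proportional to visits, by hour x day x store.
# Numbers are made up; the real values come from the digitized Google bar charts.
set.seed(271828)
hours  <- 9:22                       # surveying restricted to 9:00 a.m. - 10:00 p.m.
days   <- c("Mon","Tue","Wed","Thu","Fri","Sat","Sun")
stores <- c("RocheBros","Wegmans","Lamberts")
traffic <- array(runif(length(hours)*length(days)*length(stores), 0.2, 1.5),
                 dim=c(length(hours), length(days), length(stores)),
                 dimnames=list(hour=hours, day=days, store=stores))

slots <- melt(traffic)               # one row per (hour, day, store) with its traffic score

N.stage1 <- 12                       # total store visits the volunteers will tolerate
picked   <- slots[sample(nrow(slots), size=N.stage1, prob=slots$value), ]
picked$est.visitors <- round(140*picked$value)   # calibration from the post: 1.0 ~ 140 visitors
picked[order(picked$store, picked$day, picked$hour), ]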

The result is a sampling plan which looks a little like this:

The code and data supporting this post are available in a repository. Note that it is live, and exists to support an ongoing project, so there is no promise of stability. Note, however, that it is subject to Google’s version control system.

So, what happens after the regulation or ordinance is adopted? What’s the sampling plan to find out how things are going?

At first it seems that simply repeating the days and times would be best. On the other hand, remember that the sampling plan designed was intended to expose data collection to as representative a set of people as could be had given the constraints on that sampling. So, in principle, it should do no harm at all to generate new sampling plans with the same constraints, ones which, invariably, will give other times and days. They are all vehicles for getting at what the population prefers.

Posted in bag bans, citizen data, citizen science, Commonwealth of Massachusetts, Ecology Action, evidence, Google, Google Earth, Google Maps, goverance, lifestyle changes, microplastics, municipal solid waste, oceans, open data, planning, plastics, politics, pollution, public health, quantitative ecology, R, R statistical programming language, reasonableness, recycling, rhetorical statistics, sampling, sampling networks, statistics, surveys, sustainability | Leave a comment

A lagomorph has an idea which might save the world

Eli, who offers a clever and consistent consumption-based accounting scheme:

  1. Consumption-based Carbon accounting: Does it have a future?
  2. Consumption-based accounting of CO2 emissions

Aside | Posted on by | Leave a comment

“Renewables are set to penetrate the global energy system more quickly than any fuel in history” (BP, 2019 Energy Outlook)

Selections from BP Energy Outlook: 2019 edition:

In the ET scenario, the costs of wind and solar power continue to decline significantly, broadly in line with their past learning curves.

To give a sense of the importance of technology gains in supporting renewables, if the speed of technological progress was twice as fast as assumed in the ET scenario, other things equal, this would increase the share of renewables in global power by around 7 percentage points by 2040 relative to the ET scenario, and reduce the level of CO2 emissions by around 2 Gt.

The impact of these faster technology gains is partly limited by the speed at which existing power stations are retired, especially in the OECD.

If, in addition to faster technological gains, policies or taxes double the rate at which existing thermal power stations are retired relative to the ET scenario, the reduction in emissions is doubled.

This suggests that technological progress without other policy intervention is unlikely to be sufficient to decarbonize the power sector over the Outlook. The ‘Lower carbon power’ scenario described below considers a package of policy measures aimed at substantially decarbonizing the global power sector.

The extent to which the global power sector decarbonizes over the next 20 years has an important bearing on the speed of transition to a lower-carbon energy system.

In the ET scenario, the carbon intensity of the power sector declines by around 30% by 2040. The alternative ‘Lower-carbon power’ (LCP) scenario considers a more pronounced decarbonization of the power sector.

This is achieved via a combination of policies. Most importantly, carbon prices are increased to $200 per tonne of CO2 in the OECD by 2040 and $100 in the non-OECD – compared with $35-50 in OECD and China (and lower elsewhere) in the ET scenario.

Carbon prices in the LCP scenario are raised only gradually to avoid premature scrapping of productive assets.

There is one gloomy projection. Despite the progress on the world scene,

The share of renewables in the US fuel mix grows from 6% today to 18% by 2040.

If that were to come true, in the context of these other changes, it is possible the United States would be regarded as a pariah state and have economic sanctions imposed upon it. But … these projections have several built-in assumptions. Recall, BP is a bit like the U.S. Energy Information Administration and the International Energy Agency (IEA) in that they are established bureaucracies of forecasters. Both EIA and IEA have systematically underestimated the acceleration in solar and wind adoption over the last decade.

Also, it is telling that BP attributes the slowness with which wind and solar displace fossil fuel generation to the capital costs of retiring existing generation and replacing it. There are two points here.

First, the incremental capital costs for substituting solar+wind+storage for the same unit of fossil fuel energy are much smaller, as long as the accounting is done correctly. In particular, the costs to society are not just the generating plant, but the capital infrastructure needed to mine and bring the fuel to the point of combustion. There are also tremendous Sankey losses associated with Carnot cycle energy production. (See also.) That’s wasted money.

Second, the BP analysis clearly assumes the market and business structure for providing such energy remains intact. That assumption is a big one, akin to assuming that there will always be a Sears and always be a Kodak. If, in fact, there are energy sources available at much lower costs per kWh or BTU, the market isn’t going to care about the sunk costs of existing players. It will go around them, and they will either seek government subsidies to remain intact, or economically die.

So, the “pariah state” outcome for the United States is too gloomy. I, instead, see a United States whose economic productivity might be increasingly assaulted by challenges from climate change: impacts to personal wealth and, so, unwillingness to consume at rates comparable to before; direct damage to productive capacity, including extensive damage to supply chains within the country and to the basic infrastructure that permits people to get to their jobs; and rising costs of insurance and of doing business. But I also see a hunger for cheaper everything, especially energy, and a thriving market willing to supply that with wind and solar and storage, widely distributed, overcoming zoning and other objections because many people have abandoned suburbs due to affordability and proximity to work, and because the gap in cost is so huge, with energy from zero Carbon sources at a tenth of the comparable cost from fossil fuel sources.

It’s one thing to be a zealot for fossil fuels. It’s something else to pass up paying just 10% of the cost of something in order to pursue that zealotry.

Posted in Anthropocene, being carbon dioxide, Bloomberg New Energy Finance, BNEF, BP, bridge to somewhere, Carbon Tax, clean disruption, CleanTechnica, climate change, climate disruption, corporate citizenship, corporate litigation on damage from fossil fuel emissions, decentralized electric power generation, decentralized energy, ecomodernism, economic trade, ecopragmatist, fossil fuel divestment, fossil fuel infrastructure, global warming, Hyper Anthropocene, investing, investment in wind and solar energy, investments, local generation, local self reliance, solar democracy, solar domination, solar energy, solar power, the energy of the people, the green century, the right to be and act stupid, the right to know, the value of financial assets, Tony Seba, tragedy of the horizon, utility company death spiral, wind energy, wind power, zero carbon | Leave a comment

Tit-For-Tat in Repeated Prisoner’s Dilemma: President Donald Trump creates the Green New Deal

Jonathan Zasloff at Legal Planet offers “Donald Trump creates the Green New Deal”. The closing excerpt:

But what goes around comes around. A President Harris, or Warren, or Booker, etc. etc. can just as easily declare a National Emergency on Climate Change — one that would have a far better factual predicate than Trump’s patently false border emergency — and he or she will have a lot more money to move around. After all, a lot of the climate crisis is about infrastructure, and if the relevant statute allows the President to move money from one project to another, then it is very easy to do that. Or the $100 billion that DOD has for national security emergencies: given that both the Pentagon and the heads of the national intelligence agencies have already said that climate represents a serious national security challenge, it’s not a hard legal lift (assuming intellectually honest and consistent judges, which of course we cannot). This fund must be for a military purpose, and a smarter, more energy efficient energy grid could do the trick.

It’s no way to run a democracy. But Trump and the GOP have made it clear that they do not believe in democracy, and as Robert Axelrod demonstrated years ago in his classic book The Evolution of Cooperation, the best strategy in repeat-player games to facilitate cooperation is playing Tit-For-Tat.

See also Generous Tit-For-Tat.

Update, 2019-02-18

Dan Farber writes on “National Security, Climate Change, and Emergency Declarations” at Legal Planet that:

If the Supreme Court upholds Trump, it will have to uphold an emergency declaration for climate change.

One reason why it would be hard for the Supreme Court to overturn a climate change declaration is that some attributes of climate change and immigration are similar. Both issues involve the country’s relations with the outside world, an area where presidential powers are strong. But it isn’t as if we suddenly found out about border crossings or climate change. Given these similarities, it would be very difficult for the conservative majority to explain why it was deferring to the President in one case but not the other.

The only major difference actually cuts strongly in favor of an emergency declaration for climate change: The U.S. government has already classified climate change as a serious threat to national security, and it is a threat that is getting stronger daily. Recent science indicates that climate action is even more urgent than we thought.

Trump’s stated justification in his proclamation is that “the problem of large-scale unlawful migration through the southern border is long-standing, and despite the executive branch’s exercise of existing statutory authorities, the situation has worsened in certain respects in recent years.” Climate change, too, is a “longstanding problem,” and it certainly has gotten worse despite the effort of the executive branch (Obama) to address the problem. Federal agencies, as well as Congress, have made it clear that climate is a serious threat to our nation.

Posted in climate change, game theory, global warming, Green New Deal | Leave a comment

“What’s new with recycling”

South Shore Recycling Cooperative Director Claire Galkowski,

[Photo: Claire Galkowski, South Shore Recycling Cooperative]

spoke in Norwell, at the South Shore Natural Science Center, a couple of weeks ago:

Posted in Amory Lovins, Anthropocene, biofuels, Carbon Cycle, Claire Galkowski, coastal communities, Commonwealth of Massachusetts, EBC-NE, ecomodernism, ecopragmatist, education, extended producer responsibility, extended supply chains, green tech, greenhouse gases, local self reliance, Massachusetts, microplastics, paper, plastics, public health, quantitative ecology, recycling, science, solid waste, South Shore Recycling Cooperative, sustainability | Tagged | 1 Comment

“Is the Green New Deal’s ambition smart policy?”

Ann Carlson is the Shirley Shapiro Professor of Environmental Law and the co-Faculty Director of the Emmett Institute on Climate Change and the Environment at UCLA School of Law. Writing at Legal Planet, she takes on assessing the Green New Deal, admitting she is “conflicted about a proposal that seems untethered to what is actually achievable.” She begins:

At the heart of the Green New Deal — which demands slashing U.S. carbon emissions by 2030 by shifting to 100 percent clean energy — is a major conundrum. Even the most enthusiastic proponents of ambitious climate policy don’t believe the goals are achievable, technologically let alone politically. Stanford Professor Mark Z. Jacobson, for example, among the most ardent advocates for decarbonizing the electricity grid completely, believes that we can achieve 100 percent renewable energy by 2050, three decades after the Green New Deal’s target date. Ernie Moniz, the former Secretary of Energy under President Obama, laments that he “cannot see how we could possibly go to zero carbon in a 10-year time frame.” A number of columnists have noted that the Green New Deal will never become law because of its expense, its political impracticability and its technological infeasibility. And yet, the Green New Deal has attracted huge public support, the endorsement of all of the 2020 Democratic candidates for President, and a large number of Senators and members of Congress. It promises to mobilize a generation of young activists to work to solve the existential crisis of their lives.

Read on. She’s more optimistic than it sounds, although, I think Professor Carlson is realistic.

I remarked in a comment:

I wish the GND proponents well, too, although I worry about a couple of things.

First, the comparison with other environmental programs, while inspiring, is a little inappropriate. There has never been a problem of this scale, and not one whose amplification is so thoroughly integrated with the daily comforts of affluent humans. Fossil fuels do have high energy densities, and that can be convenient.

Also, related to this, benefits do not accrue if we simply cease emitting. We have a timetable, and Nature will not scrub the harmful materials on any reasonable human timetable, so conditions at the moment we succeed at achieving zero emissions will persist for centuries. The alternative, artificial removal of atmospheric CO2, is both horrifically expensive (multiples of 2014 Gross World Product at present prices) and explicitly rejected by GND proponents. (They’ve ruled out advanced nuclear technologies, too.)

Second, without policy which is “tethered to what is actually achievable”, GND suggests the bar is lower than it actually is and could, in itself, both present a moral hazard and make people think climate change is not being mitigated purely for reasons of politics and greed. (This is in bounds because GND proponents reject negative emissions technology on the grounds that it, too, could be a kind of moral hazard.) Sure, politics and greed are involved, but it is also true that people don’t like the things a GND-style solution, or a Professor Mark Z. Jacobson solution, entails. In my opinion, their choice is silly, but people are people.

Third, aspirational, engineering-free solutions to big, big problems are likely to founder, because they won’t assess and contain their own complications, particularly if they are rushed. Uncoordinated rollout of zero Carbon energy won’t only trash pieces of the grid, with repercussions for the less well off and people of color, but could also exacerbate climate conditions and regional weather. Large scale plantings, for example, of Jatropha curcas, thought to be a way of doing rapid CO2 drawdown and producing biodiesel oils, could change albedo in the wrong direction for the arid regions it loves, and, indeed, could do itself in if the same regions transform into tropics. Uncoordinated rollouts of wind farms will affect weather system energies. That’s no reason not to do it, but it needs to be studied and thought through.

Fourth, there is (still) a substantial education component needed, one done in a manner that evades the impression that climate change-fixing proponents are pulling their punches. For if byproducts of climate change are severe enough to move people into action, and to get them to accept the sacrifices needed to do so, then they probably will expect to see improvements once these changes are made. The science says that expectation is unreasonable, because of the inertia of the climate system and because the human emissions impact is a perturbation on a geological scale in a geological moment. The political ramifications of this realization are difficult to assess, but could be damaging to the long term health of the collective project.

I did not mention other things, such as the intrinsic greenhouse gas emissions from agriculture, even if planting, harvesting, fertilization, transport, and processing are all decarbonized. Cement production is a big piece of emissions, too. The troubling thing is that GND doesn’t mention these: It focuses almost exclusively upon energy.

Update, 2019-02-11, 23:45 ET

Encouragement.

Posted in Anthropocene, anti-intellectualism, bollocks, bridge to somewhere, cement production, clear air capture of carbon dioxide, climate business, climate change, climate disruption, climate economics, climate education, global warming, Green New Deal, greenhouse gases, negative emissions, zero carbon | Leave a comment

From the YEARS Project: How Climate Impacts Mental Health (#climatefacts)

Dr Kate Marvel: “We need courage, not hope, to face Climate Change“.

Also the magnificent “We should never have called it Earth“, also from Dr Marvel.

In “Hope, despair and transformation: Climate change and the promotion of mental health and wellbeing”, Fritze, Blashki, Burke, and Wiseman [International Journal of Mental Health Systems, 2008, 2(13)] note in a section titled “Emotional distress arising from awareness of climate change as global environmental threat”:

The question that McKibben raises is how psychologically, emotionally and politically should we as human beings respond to this fundamental change in the relationship between the human species and the world we inhabit?
…
For many people, the resulting emotions are commonly distress and anxiety. People may feel scared, sad, depressed, numb, helpless and hopeless, frustrated or angry. Sometimes, if the information is too unsettling, and the solutions seem too difficult, people can cope by minimising or denying that there is a problem, or avoiding thinking about the problems. They may become desensitised, resigned, cynical, skeptical or fed up with the topic. The caution expressed by climate change skeptics could be a form of denial, where it involves minimising the weight of scientific evidence/consensus on the subject. Alternatively, it could indicate that they perceive the risks of change to be greater than the risks of not changing, for themselves or their interests …
…
Notwithstanding the enormity of the climate change challenge, we know what many of the solutions are, and there are many actions that citizens can take individually and collectively to make a difference at household, local, national and global level. When people have something to do to solve a problem, they are better able to move from despair and hopelessness to a sense of empowerment.

Blashki, et al include a table from the Australian Psychological Society about how individuals can respond to the stress of being aware of climate change and its impacts:

Finally, there is the tongue-in-cheek yet serious work by Nye and Schwarzenegger:

Posted in American Association for the Advancement of Science, Arnold Schwarzennegger, attribution, Bill Nye, climate change, climate grief, global warming | 1 Comment

Alright! I’m tired of all this serious shtuff … It’s time for some CLIMATE ADAM!

Posted in Anthropocene, carbon dioxide, climate change, glaciers, global warming, Hyper Anthropocene, ice sheet dynamics, oceans, sea level rise | Leave a comment

Status of Solar PV in Massachusetts

From PV Magazine‘s John Weaver:

At Solar Power Northeast, the DOER of Massachusetts noted that with the mandated 400 MW of qualified projects program review upcoming, and heavy volume deployed in National Grid territory, there is strong consideration to expand and evolve the SMART program.

Posted in Amory Lovins, Bloomberg New Energy Finance, clean disruption, CleanTechnica, Commonwealth of Massachusetts, decentralized electric power generation, decentralized energy, distributed generation, investment in wind and solar energy, ISO-NE, Massachusetts, Massachusetts Clean Energy Center, solar democracy, solar domination, solar energy, solar power, sustainability, the energy of the people | Leave a comment

“Applications of Deep Learning to ocean data inference and subgrid parameterization”

This is another nail in the coffin of the claim I heard at last year’s Lorenz-Charney Symposium at MIT that machine learning methods would not make a serious contribution to advancements in the geophysical sciences.

T. Bolton, L. Zanna, “Applications of Deep Learning to ocean data inference and subgrid parameterization“, Journal of Advances in Modeling Earth Systems, 2019, 11.

Posted in American Meteorological Association, American Statistical Association, artificial intelligence, Azimuth Project, deep learning, deep recurrent neural networks, dynamical systems, geophysics, machine learning, Mathematics and Climate Research Network, National Center for Atmospheric Research, oceanography, oceans, science, stochastic algorithms | Leave a comment

The shelf-break front, fisheries, climate change, and finding things out

From Woods Hole Oceanographic Institution.

Support them.

Claire and I do.

Posted in biology, climate change, climate disruption, ecological disruption, ecological services, ecology, global warming, oceanography, oceans, quantitative biology, quantitative ecology, WHOI, Woods Hole Oceanographic Institution | Leave a comment

Wake up, Massachusetts! Especially, Green Massachusetts!

I’ve been looking over the set of bills proposed for the current Massachusetts legislative session. There are many of them, all dealing with aspects of greening energy supply and transport. And Governor Baker’s S.10 is very welcome. (By the way, I don’t see any counter-proposals from those who don’t like the Governor politically, so, I’d say, they have no right to complain.) Adaptation to climate in Massachusetts is a serious thing:

and there will be many uncomfortable choices we’ll be facing soon, both pocketbook choices and choices of social equity. Indeed, many of the bills have environmental justice and social justice aspects. I’m all for that, as long as these are put in perspective.

It’s 2019. While Massachusetts has a Global Warming Solutions Act, it’s far from perfect, putting up an imperfect target of 2050 and, even then, deliberately excluding whole classes of emissions, such as waste-to-energy facilities. Even accepting it as a great goal, and even though the impacts upon Massachusetts are controlled by many and varied parties all over the world, the Commonwealth currently has no believable roadmap for achieving those goals, which are, after all, law. This is especially true relating to transportation and to heating of homes. The world’s bullseye for containing emissions — a long shot — is 2030. Some say even that’s too late, given we’ve made so little progress, and governments and communities are faced with buying fossil fuel infrastructure and retiring it early, well ahead of the end of its depreciation lifetime.

All the evidence, year after year, is that the rate of impact from climate change is accelerating. What Massachusetts faces is the discomfort and significant cost of purchasing homes — at a substantial loss to their owners, and a loss in tax base for their towns — on the coasts and inland, homes which are too risky for their inhabitants, their towns, and the Commonwealth to permit their owners to continue to live there. This is called managed retreat (see also). And I see nothing, other than S.10, which begins to address this. And S.10 is modest.

I also don’t see on the energy side a developed appreciation for What’s Happening Out There. Climate change is important. It is the issue. Environmental justice or not, social justice or not, if this problem is not solved, none of the progress that has been made in 150 years of social advancement will matter: “All the good you’ve done, all the good you can imagine doing will be wiped out, just wiped out ….” (Van Jones). But, and these aspects are good, that’s not the only dynamic for which Massachusetts needs to plan.

Have you looked at solar and wind costs to generate a kWh of electricity recently?

They are tearing through the floor, especially onshore wind, soon to be followed by solar. Why? Because Mr Market is seeing that their plummeting costs are not fantasies — Forbes writes about this all the time these days — they are a result of a differentiating technology, and that, yeah, there’s a pony in the barn. Solar and wind, supported by and supporting expanding energy storage, are going to Eat the Lunch of everyone in the energy industry. And this is happening with the fiercest antagonist to these technologies occupying the United States White House, with many supporting opponents numbered among the Republicans of Congress. Imagine what they will do with tailwinds.

But, there’s a problem. Massachusetts residents do not like to live near wind turbines or even large solar farms. Some complain that solar farms cause leveling of new growth forest — even if new growth forest does little or nothing to sequester CO2 — and impact habitat. And they just don’t like the looks. Massachusetts residents who say these things are really complaining about the low energy density per unit area which solar and wind have. That’s true. Fossil fuels have a high energy density. Nuclear power has a high energy density. Hydropower has a reasonably high energy density, but you can’t just find it anywhere. If you want to supply energy needs with wind and solar, you need a lot of land. Massachusetts isn’t a big state. Accordingly, if you want to supply energy needs with wind and solar, you need to build them close to where people live. That’s better, in fact, because then you don’t need to run ecosystem-destroying transmission lines through forests.

If this is unacceptable, and you don’t want CO2 emissions, there is no choice but nuclear power or hydropower. As I noted, there’s only so much hydropower, and there needs to be cooperation from the people who live in the states the transmission lines need to cut through in order to get access to it.

Nuclear power, as presently practiced, has a large cost problem. There are measures being pursued to fix that, but it’s not clear how soon these will be available. We need nuclear power that’s modular, with small units that can be combined into arbitrary sizes, that can be toggled on and off as needed, that’s air-cooled, and where each of the units is portable. We need nuclear power in commodity chunks. The industry chose not to do that in the 1960s and has suffered with that choice ever since. Modular units can just be trucked away intact if they are broken or need their wastes scrubbed. If a unit fails, the generation doesn’t all go down, because there are many more companions generating. Needing cooling water is an ecological and climate problem — many reactors need to go offline if their nearby cooling rivers dry up in droughts — so air cooling is a natural response.

But nuclear power isn’t popular.

Facts are, unless Massachusetts residents opt for onshore wind turbines and big solar, both backed by substantial storage, all located near residentially zoned areas, they are going to end up with natural gas as their energy supply. It’s dense. It can be hidden.

But, if they do, the future of Massachusetts not only lacks a clean energy future, it also has the future of a rustbelt. That’s because natural gas will eventually be the most expensive energy source. Coal and oil will be long gone. Conventional nuclear power is too expensive even now, because it suffers from a negative learning curve. Everyone will be using wind and solar, backed in places by storage, but as everyone adopts these, the storage will be needed less and less.

What will be Massachusetts’ fate?

With expensive electrical energy, companies will not want to do business in Massachusetts, both because its energy supply isn’t clean (an increasingly important criterion over time, due to shareholders and customers) and because it will be the most expensive energy anywhere. It will get worse. The companies supplying Massachusetts don’t live in isolation. Selling natural gas anywhere will become more and more difficult, and some, and eventually all, of those companies will go bankrupt. To maintain energy supplies, Massachusetts will need to buy those assets and run them, perhaps by giving them to someone else to run, but this will be expensive, and it will go on the tax base. That will be an additional disincentive for companies to build and work in Massachusetts, and for people to live in Massachusetts.

In addition, there will be the inevitable costs and charges from climate change. Over these Massachusetts does not have complete control, but to the degree it doesn’t champion means for zeroing emissions and using 100% zero Carbon energy, it will stifle its significant voice encouraging others that this is a feasible model. That voice can do more to nudge the rest of the world in the zero Carbon direction, much more than anything Massachusetts will do by zeroing its own emissions. These costs will ultimately fall on the Commonwealth’s books and, so, upon the taxpayers, whether they like it or not, and whether or not the ability of the Commonwealth to pay is supposedly constrained by law. Solvency is a powerful reason for overturning laws.

So, from what I see, either Massachusetts residents learn to live next to onshore wind and big solar farms, or they choose new nuclear power — and we don’t know how long that’ll take — or they choose natural gas, with the economic downsides I have just described.

I don’t think many in the progressive and environmental movements in Massachusetts have thought about these tradeoffs. They somehow think demand can be reduced so these tradeoffs are not necessary. They are not thinking quantitatively, or, for that matter, factually. It appears to me many of them have an agenda to pursue, and evidence just gets in the way. This is not serving the Commonwealth.

Climate reality is an elixir which exposes the truth. Whether it’s Thwaites Glacier or the slowdown of the Gulf Stream, or excessive precipitation, Massachusetts will need to deal with these.

Fortunately, should Massachusetts residents change their minds, onshore wind turbines are very easy and inexpensive to construct, as are big solar farms. And flooded properties are cheap to buy up.

What kind of future do you want, Massachusetts? Do you want to plan, and help it be a good one? Or do you want to bury your head in the ever eroding sand?

“Climate change is coming for your assets”

Posted in American Association for the Advancement of Science, Anthropocene, Cape Wind, climate business, climate change, climate disruption, coastal communities, Commonwealth of Massachusetts, decentralized energy, electric vehicles, electrical energy storage, electricity markets, emissions, fossil fuels, global warming, Governor Charlie Baker, Hyper Anthropocene, ice sheet dynamics, investment in wind and solar energy, leaving fossil fuels in the ground, sea level rise, seawalls, solar domination, solar energy, solar power, the value of financial assets, wind energy, wind power, wishful environmentalism, zero carbon | Leave a comment

Repeating Bullshit

Yeah, how much was it?

And was it different? I mean, not based on how Curry or Tisdale feel, but by the numbers.

Open Mind

Question: How does a dumb claim go from just a dumb claim, to accepted canon by the climate change denialati?

Answer: Repetition.

Yes, keep repeating it. If it’s contradicted by evidence, ignore that or insult that. Repeat it again. If you’re asked for evidence, ignore that or insult that, just keep repeating it. That’s how things get burned into brains.


Posted in American Statistical Association, anomaly detection, changepoint detection, climate change, Grant Foster, Mathematics and Climate Research Network, maths, science, statistics, Tamino, time series, unreason | Leave a comment

Stream flow and P-splines: Using built-in estimates for smoothing

Mother Brook in Dedham, Massachusetts, was the first man-made canal in the United States. Dug in 1639, it connects the Charles River at Dedham to the Neponset River in the Hyde Park section of Boston. It was originally an important source of water for Dedham’s mills. Today it serves as an important tool for flood control on the Charles River.

[Photo: Mother Brook, Dedham, Massachusetts]

Like several major river features, Mother Brook is monitored by gauges of flow maintained by the U.S. Geological Survey, with careful eyes kept on their data flows by both agencies of the Commonwealth of Massachusetts, like its Division of Ecological Restoration, and by interested private organizations, like the Neponset River Watershed Association and the Charles River Watershed Association. (I am a member of the Neponset River Watershed Association.) The data from these gauges are publicly available.

Such a dataset is a good basis for talking about a non-parametric time series smoothing technique using P-splines (penalized B-splines), an example of local regression, and taking advantage of the pspline package to do it. Since this, like most local regression techniques, demands a choice of a smoothing parameter, this post strongly advocates for pspline as a canonical technique because:

  • it features a built-in facility for choosing the smoothing parameter, one based upon generalized cross validation,
  • like loess and unlike lowess in R, it permits multiple response vectors and fits all of them simultaneously, and
  • with the appropriate choice in its norder parameter, it permits the estimation of derivatives of the fitted curve as well as the curve itself.

Finally, note that while residuals are not provided directly, they are easy to calculate, as will be shown here.
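
Before turning to the Mother Brook data below, here is a minimal toy sketch of those points, and of the residual calculation, on synthetic data; the data and initial spar are invented, and only the pspline calls mirror the analysis further down:

library(pspline)

# Toy data only: a smooth signal plus noise, with the covariate already in
# ascending order and with no NA values, as smooth.Pspline requires.
set.seed(1639)
x <- seq(0, 10, length.out=400)
y <- sin(x) + 0.2*x + rnorm(length(x), sd=0.3)

# method=3 selects the smoothing parameter by generalized cross validation,
# so spar here is only an initial guess; norder=2 gives a cubic smoothing spline.
fit <- smooth.Pspline(x=x, y=y, norder=2, spar=0.3, method=3)
fit$spar    # the smoothing parameter GCV settled on
fit$gcv     # the corresponding GCV criterion

# Residuals are not returned directly, but are easy to form:
resid.fit <- y - as.vector(fit$ysmth)
sd(resid.fit)

# Derivatives of the fitted curve can be estimated, too:
slope <- predict(fit, x, nderiv=1)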

In fairness, note that loess allows an R formula interface, but both smooth.Pspline and lowess do not. Also, smooth.Pspline:

  • is intolerant of NA values, and
  • demands that the covariates be in ascending order.

Note from 2019-01-30

Note that the lack of support by the pspline package for the multivariate case has thrown down the gauntlet, so to speak, to find a replacement. Since I’m the one who, in the moment, is complaining the loudest, the responsibility falls to me. So, accordingly, I commit to devising a suitable replacement. I don’t feel constrained by the P-spline approach or package, although I think it foolish not to use it if possible. Such a facility will be the subject of a future blog post. Also, I’m a little joyful because this will permit me to reacquaint myself with some of the current FORTRAN language definition, using the vehicle of Simply Fortran, and its calling from R. This is sentimental, since my first programming language was FORTRAN IV on an IBM 1620.

References

For completeness, consider the AdaptFit package and related SemiPar package which also offer penalized spline smoothing but are limited in their support for multiple responses.

(Update, 2019-01-29)

I re-encountered this paper by Professor Michael Mann from 2004 which addresses many of these issues:

Incidentally, Professor Mann is in part responding to a paper by Soon, Legates, and Baliunas (2004) criticizing estimators of long term temperature trends. The Dr Soon of that trio is the famous one from the Heartland Institute who has been mentioned at this blog before.

The dataset

What does stream flow on Mother Brook look like? Here’s eight years of it:

(Click on image for a larger figure, and use browser Back Button to return to blog.)

Smoothing with P-splines, Generalized Cross Validation

Using a cubic spline model, the package pspline finds that a smoothing parameter (“spar”) of 0.007 is best, giving a Standard Error of the Estimate (“SEE”) of 0.021:

(Click on image for a larger figure, and use browser Back Button to return to blog.)

Forcing the spline fit to use larger spar values, one of 0.5 and one of 0.7, produces worse fits. This can also be seen in their larger G.C.V. criteria, of 228 and of 237, compared with the automatic choice’s 185:

(Click on image for a larger figure, and use browser Back Button to return to blog.)

(Click on image for a larger figure, and use browser Back Button to return to blog.)

Code

The code for generating these results is shown below.


#
# Mother Brook, P-spline smoothing, with automatic parameter selection.
# Jan Galkowski, bayesianlogic.1@gmail.com, 27th January 2019.
# Last changed 28th January 2019.
#

library(random)   # For external source of random numbers
library(FRACTION) # For is.wholenumber
library(tseries)  # For tsbootstrap
library(pspline)

source("c:/builds/R/plottableSVG.R")

randomizeSeed<- function(external=FALSE)
{
  #set.seed(31415)
  # Futz with the random seed
  if (!external)
  {
    E<- proc.time()["elapsed"]
    names(E)<- NULL
    rf<- E - trunc(E)
    set.seed(round(10000*rf))
  } else
  {
    set.seed(randomNumbers(n=1, min=1, max=10000, col=1, base=10, check=TRUE))
  }
  return( sample.int(2000000, size=sample.int(2000, size=1), replace=TRUE)[1] )
}

wonkyRandom<- randomizeSeed(external=TRUE)

stopifnot( exists("MotherBrookDedham") )

seFromPspline<- function(psplineFittingObject, originalResponses, nb=1000, b=NA)
{
  stopifnot( "ysmth" %in% names(psplineFittingObject) )
  #
  ysmth<- psplineFittingObject$ysmth
  #
  if (is.null(dim(originalResponses)))
  {
    N<- length(which(!is.na(ysmth)))
    stopifnot( length(originalResponses) == N )
  } else
  {
    stopifnot( all( dim(originalResponses) == dim(ysmth) ) )
    N<- nrow(ysmth)
  }
  #
  if (is.na(b))
  {
    b<- round(N/3)
  } else
  {
    stopifnot( is.wholenumber(b) && (4 < b) && ((N/100) < b) )
  }
  #
  R<- originalResponses - ysmth
  #
  # Don't assume errors are not correlated. Use the Politis and Romano stationary
  # bootstrap to obtain estimates of standard deviation(s) and Mean Absolute Deviation(s), 
  # where these are plural if there is more than one response.
  #
  # The standard error of the estimate is then just adjusted for the number of non-NA
  # observations.
  #
  if (is.null(dim(originalResponses)))
  {
    Ny<- 1
    booted.sd<- tsbootstrap(x=R, nb=nb, statistic=function(x) sd(x, na.rm=TRUE), m=1, b=b, type="stationary")
    SD<- mean(booted.sd$statistic)
    SEE<- SD/sqrt(N)
    booted.mad<- tsbootstrap(x=R, nb=nb, statistic=function(x) mad(x, constant=1, na.rm=TRUE), m=1, b=b, type="stationary")
    MAD<- mean(booted.mad$statistic)
  } else
  {
    Ny<- ncol(ysmth)
    SD<- rep(NA, Ny)
    SEE<- rep(NA, Ny)
    MAD<- rep(NA, Ny)
    for (j in (1:Ny))
    {
      nonNA<- which(!is.na(R[,j]))
      booted.sd<- tsbootstrap(x=R[nonNA,j], nb=nb, statistic=function(x) sd(x, na.rm=TRUE), m=1, b=b, type="stationary")
      SD[j]<- mean(booted.sd$statistic)
      SEE[j]<- SD[j]/sqrt(length(nonNA))
      booted.mad<- tsbootstrap(x=R[nonNA,j], nb=nb, statistic=function(x) mad(x, constant=1, na.rm=TRUE), m=1, b=b, type="stationary")
      MAD[j]<- mean(booted.mad$statistic)
    }
  }
  return(list(multivariate.response=!is.null(dim(originalResponses)), number.of.responses=Ny,
              SD=SD, MAD=MAD, SEE=SEE))
}

MotherBrookDedham.nonNA<- which(!is.na(MotherBrookDedham$gauge))
# Note method == 3 is Generalized Cross Validation (Craven and Wahba, 1979), and
# the value of spar is an initial estimate. The choice of norder == 2 is arbitrary.
MotherBrookDedham.fitting<- smooth.Pspline( x=MotherBrookDedham.nonNA, y=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], 
                                            norder=2, spar=0.3, method=3)
# Using 90 days as mean block length, about a quarter of a year
MotherBrookDedham.estimate.bounds<- seFromPspline(psplineFittingObject=MotherBrookDedham.fitting, 
                                                  originalResponses=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], nb=1000, b=91)

fx<- openSVG(root="MotherBrookDedham-RawFlowData-Daily-withSmooth", width=24, height=round(24/2), pointsize=8)

plot(MotherBrookDedham$gauge, type="n", xaxt="n", ylab="mean (over day) cubic feet per second", main="",
     xlab="", cex.lab=2, cex.axis=2, ylim=c(-80, 650))
title(main=sprintf("Raw flow data, Mother Brook at Dedham, agency %s, site %s, fit with cubic smoothing spline", 
      MotherBrookDedham$agency_cd[1], MotherBrookDedham$site_no[1]), 
      cex.main=3, font.main=2, family="Times")     
N<- nrow(MotherBrookDedham)
S<- seq(1, N, 30)
axis(side=1, at=S, line=-13, labels=MotherBrookDedham$datetime[S], las=2, cex.axis=2, font.axis=2, cex.lab=1.5, tick=FALSE)
abline(v=S, lty=6, col="grey")
points(1:N, MotherBrookDedham$gauge, pch=21, cex=1.2, col="blue", bg="blue")
lines(MotherBrookDedham.nonNA, MotherBrookDedham.fitting$ysmth, lwd=1, lty=1, col="green")
text(which.max(MotherBrookDedham.fitting$ysmth), max(MotherBrookDedham.fitting$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("Found smoothing SPAR = %.3f, and G.C.V. value =  %.1f", 
                                   MotherBrookDedham.fitting$spar, MotherBrookDedham.fitting$gcv), family="Helvetica")
text(which.max(MotherBrookDedham.fitting$ysmth), 0.95*max(MotherBrookDedham.fitting$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("SD = %.3f, MAD = %.3f, SEE = %.3f", 
                                   MotherBrookDedham.estimate.bounds$SD, MotherBrookDedham.estimate.bounds$MAD, 
                                   MotherBrookDedham.estimate.bounds$SEE), family="Helvetica")
closeSVG(fx)

# Force the same P-spline to use an arbitrary smoother SPAR by electing method == 1, and setting SPAR = 0.5.
MotherBrookDedham.fitting.p5<- smooth.Pspline( x=MotherBrookDedham.nonNA, y=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], 
                                            norder=2, spar=0.5, method=1)
# Using 90 days as mean block length, about a quarter of a year
MotherBrookDedham.estimate.bounds.p5<- seFromPspline(psplineFittingObject=MotherBrookDedham.fitting.p5, 
                                                  originalResponses=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], nb=1000, b=91)

fx<- openSVG(root="MotherBrookDedham-RawFlowData-Daily-withSmooth-with-SPARp5", width=24, height=round(24/2), pointsize=8)

plot(MotherBrookDedham$gauge, type="n", xaxt="n", ylab="mean (over day) cubic feet per second", main="",
     xlab="", cex.lab=2, cex.axis=2, ylim=c(-80, 650))
title(main=sprintf("Raw flow data, Mother Brook at Dedham, agency %s, site %s, fit with cubic smoothing spline", 
      MotherBrookDedham$agency_cd[1], MotherBrookDedham$site_no[1]), 
      cex.main=3, font.main=2, family="Times")     
N<- nrow(MotherBrookDedham)
S<- seq(1, N, 30)
axis(side=1, at=S, line=-13, labels=MotherBrookDedham$datetime[S], las=2, cex.axis=2, font.axis=2, cex.lab=1.5, tick=FALSE)
abline(v=S, lty=6, col="grey")
points(1:N, MotherBrookDedham$gauge, pch=21, cex=1.2, col="blue", bg="blue")
lines(MotherBrookDedham.nonNA, MotherBrookDedham.fitting.p5$ysmth, lwd=1, lty=1, col="green")
text(which.max(MotherBrookDedham.fitting.p5$ysmth), max(MotherBrookDedham.fitting.p5$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("Found smoothing SPAR = %.3f, and G.C.V. value =  %.1f", 
                                   MotherBrookDedham.fitting.p5$spar, MotherBrookDedham.fitting.p5$gcv), family="Helvetica")
text(which.max(MotherBrookDedham.fitting.p5$ysmth), 0.95*max(MotherBrookDedham.fitting.p5$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("SD = %.3f, MAD = %.3f, SEE = %.3f", 
                                   MotherBrookDedham.estimate.bounds.p5$SD, MotherBrookDedham.estimate.bounds.p5$MAD, 
                                   MotherBrookDedham.estimate.bounds.p5$SEE), family="Helvetica")
closeSVG(fx)

# Force the same P-spline to use an arbitrary smoother SPAR by electing method == 1, and setting SPAR = 0.7.
MotherBrookDedham.fitting.p7<- smooth.Pspline( x=MotherBrookDedham.nonNA, y=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], 
                                            norder=2, spar=0.7, method=1)
# Using 90 days as mean block length, about a quarter of a year
MotherBrookDedham.estimate.bounds.p7<- seFromPspline(psplineFittingObject=MotherBrookDedham.fitting.p7, 
                                                  originalResponses=MotherBrookDedham$gauge[MotherBrookDedham.nonNA], nb=1000, b=91)

fx<- openSVG(root="MotherBrookDedham-RawFlowData-Daily-withSmooth-with-SPARp7", width=24, height=round(24/2), pointsize=8)

plot(MotherBrookDedham$gauge, type="n", xaxt="n", ylab="mean (over day) cubic feet per second", main="",
     xlab="", cex.lab=2, cex.axis=2, ylim=c(-80, 650))
title(main=sprintf("Raw flow data, Mother Brook at Dedham, agency %s, site %s, fit with cubic smoothing spline", 
      MotherBrookDedham$agency_cd[1], MotherBrookDedham$site_no[1]), 
      cex.main=3, font.main=2, family="Times")     
N<- nrow(MotherBrookDedham)
S<- seq(1, N, 30)
axis(side=1, at=S, line=-13, labels=MotherBrookDedham$datetime[S], las=2, cex.axis=2, font.axis=2, cex.lab=1.5, tick=FALSE)
abline(v=S, lty=6, col="grey")
points(1:N, MotherBrookDedham$gauge, pch=21, cex=1.2, col="blue", bg="blue")
lines(MotherBrookDedham.nonNA, MotherBrookDedham.fitting.p7$ysmth, lwd=1, lty=1, col="green")
text(which.max(MotherBrookDedham.fitting.p7$ysmth), max(MotherBrookDedham.fitting.p7$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("Found smoothing SPAR = %.3f, and G.C.V. value =  %.1f", 
                                   MotherBrookDedham.fitting.p7$spar, MotherBrookDedham.fitting.p7$gcv), family="Helvetica")
text(which.max(MotherBrookDedham.fitting.p7$ysmth), 0.95*max(MotherBrookDedham.fitting.p7$ysmth), pos=2, offset=2,
     font=2, cex=2, labels=sprintf("SD = %.3f, MAD = %.3f, SEE = %.3f", 
                                   MotherBrookDedham.estimate.bounds.p7$SD, MotherBrookDedham.estimate.bounds.p7$MAD, 
                                   MotherBrookDedham.estimate.bounds.p7$SEE), family="Helvetica")
closeSVG(fx)

The code is available online here and requires a utility from here.

So, what’s the point?

Having a spline model for a dataset actually offers a lot. First, the estimates of SEE and MAD give some idea of how accurate prediction using the model might be. With eight years of data, such models are in hand.

Also, having a spline model is the basis for detecting changes in stream flow rates over time. Mother Brook might not be the best example of long run stream flow rates, since the Army Corps can change their policies in how they manage it, but the same kinds of flow time series are available for many other flows in the region.

To the point about changes in flow rates, having a spline model permits estimating derivatives which, in this case, are exactly the rates of change of interest.
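A minimal sketch of that derivative estimate, assuming the predict() method of the pspline package and its nderiv argument; the units here are cubic feet per second per day.

library(pspline)

# First derivative of the smoothed flow: the day-to-day rate of change of discharge
flowRate<- as.vector(predict(MotherBrookDedham.fitting.p5, xarg=MotherBrookDedham.nonNA, nderiv=1))
plot(MotherBrookDedham.nonNA, flowRate, type="l", col="darkgreen",
     xlab="day index", ylab="d(flow)/dt, cubic feet per second per day")
abline(h=0, lty=3)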

Moving on, once several such flows have been modeled using splines, these can serve as the basis for various kinds of regressions, whether on the response side or on the covariates side. For example, is there statistical evidence for a link between stream flows and temperature? The Clausius-Clapeyron relation suggests there should be, at least at the regional and global scale. It would be interesting to examine if it can be seen here.

To me, it would also be interesting to see if some of the riverine connections in the region could be inferred from examination of flow rates alone. Downstream flows see a pulse of water from precipitation and melt, but their pulses are lagged with respect to those upstream. Sure, one could examine such connections simply by looking at a map, or Google Earth, but there are other hydrological applications where these connections are latent. In particular, connections between subterranean water sources and surface flows might be revealed if these kinds of inferences are applied to them.
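One simple way to look for such a connection is a lagged cross-correlation between two gauge series. The sketch below assumes two hypothetical daily series, upstreamFlow and downstreamFlow, on a common date index; in practice one would want to remove shared seasonality first, which is why the series are log-differenced here.

# upstreamFlow and downstreamFlow are hypothetical daily gauge series on a common date index
upstreamDiff<- diff(log(upstreamFlow + 1))
downstreamDiff<- diff(log(downstreamFlow + 1))
cc<- ccf(x=upstreamDiff, y=downstreamDiff, lag.max=14, plot=FALSE)
bestLag<- cc$lag[which.max(cc$acf)]
cat(sprintf("Strongest cross-correlation %.2f at a lag of %d days\n", max(cc$acf), as.integer(bestLag)))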

(Update, 2019-01-29)

The scholarly literature, such as the paper by Professor Mann cited above critiquing and explaining the one by Soon, Legates, and Baliunas (2004), shows that careful consideration of these techniques matters.

(Image: orthophoto of the Mother Brook area, 27th January 2019.)

Posted in American Statistical Association, citizen data, citizen science, Clausius-Clapeyron equation, Commonwealth of Massachusetts, cross-validation, data science, dependent data, descriptive statistics, dynamic linear models, empirical likelihood, environment, flooding, floods, Grant Foster, hydrology, likelihood-free, meteorological models, model-free forecasting, non-mechanistic modeling, non-parametric, non-parametric model, non-parametric statistics, numerical algorithms, precipitation, quantitative ecology, statistical dependence, statistical series, stream flow, Tamino, the bootstrap, time series, water vapor | Leave a comment

50,000+ golf balls, along a coast

KQED carried a story about free diver and 16-year-old Alex Weber, who discovered not only a new source of plastic pollution, but another testament to the casual, careless sloppiness of people.

(Image: a seal surrounded by golf balls.)

And Ms Weber has converted it into a crusade against marine pollution, and a technical article in a scientific publication. Writing with Professor Matt Savoca of Stanford University, Weber and her dad, Michael Weber, also a co-author of that paper, found over 50,000 balls just offshore of a California golf course, with new ones arriving every day. See her golf ball project page.

(Image: map showing the location of the golf course.)

A number of the balls are in usable condition:

(Image: collected golf ball pollution.)

Quoting from the Conclusion of their article:

In central California, the Pebble Beach Golf Links host 62,000 rounds of golf per year and has been in operation since 1919 (Dunbar, 2018). The average golfer loses 1–3 balls per round (Hansson and Persson, 2012), which implies that between 62,000 and 186,000 golf balls are lost to the environment each year at the Pebble Beach Golf Links. This translates to 3.14–9.42 tons of debris annually. While a portion of these balls is lost to non-oceanic regions adjacent to the course, the coast and intertidal environments still have a high likelihood of accumulating mishit balls. Using a conservative estimate of 10,000–50,000 balls lost to sea annually gives a range of 1–5 million golf balls lost to the coastal environment during the century that this course has been in operation. These projected numbers indicate that this issue has been overlooked for decades.
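The quoted tonnage is consistent with the lost-ball counts. A quick back-of-the-envelope check, assuming a regulation ball mass of about 45.9 grams and reading "tons" as U.S. short tons (907.18 kg):

ballsLow<- 62000 * 1
ballsHigh<- 62000 * 3
massPerBallKg<- 0.0459    # assumed regulation golf ball mass
shortTonKg<- 907.18
c(low=ballsLow * massPerBallKg / shortTonKg,     # about 3.1 tons
  high=ballsHigh * massPerBallKg / shortTonKg)   # about 9.4 tons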

I salute Ms Weber, her dad, and Professor Savoca. And look forward to reading their paper.


Update, 27th January 2019

Accolades to authors Weber, Weber, and Savoca, and collection colleagues Johnston, Sammet, and Matthews for a most impressive piece of work!

The conditions on dives are cold and sometimes treacherous. Representative collections take planning and working around environmental and safety constraints. Revisits offered a glimpse of golf ball pollutant dynamics.

And it didn’t stop there: The huge population of golf balls needed to be characterized by age and wear.

The sampling areas and processes needed documentation.

This is a substantial body of field research, backed up by background scholarship.

Posted in American Association for the Advancement of Science, an uncaring American public, coastal communities, coasts, consumption, ecological disruption, Ecological Society of America, ethics, field research, Florida, Humans have a lot to answer for, marine debris, oceans, plastics, pollution, science, sustainability, sustainable landscaping | Tagged | Leave a comment

“Pelosi won, Trump lost”

U.S. House of Representatives Democratic Leader Nancy Pelosi speaks to reporters after she was re-elected to her post on Capitol Hill in Washington

From Alex Wagner, contributing editor at The Atlantic and CBS News correspondent. Excerpt from “Pelosi won, Trump lost“:

“Nancy’s Prerogative” might be the name of an Irish bar, but in this case it signaled the waving of the presidential white flag, a fairly shocking thing to see on any war front. Trump’s pugilistic impulses, after all, have been virtually unchecked—especially these days, when he is without administration minders. But Pelosi has rendered Trump unable to employ his traditional weaponry. He couldn’t even muster the juju necessary to formulate that most Trumpian of Trump battle strategies, a demeaning nickname. “Nancy Pelosi, or Nancy, as I call her,” Trump said on Wednesday, “doesn’t want to hear the truth.”

.
.
.

Trump has intersected with powerful women before — Hillary Clinton, most notably — and showed little hesitation to diminish and demean. But Pelosi, who once joked to me she eats nails for breakfast, is a ready warrior. She is happy to meet the demands of war, whereas Clinton was reluctant, semi-disgusted, and annoyed to be dragged to the depths that running against Trump demanded. The speaker of the House is, technically, a coastal elite from San Francisco, but she was trained in the hurly-burly of machine politics of Baltimore by her father, Mayor Thomas D’Alesandro. It is not a coincidence that Pelosi has managed, over and over, to vanquish her rivals in the challenges for Democratic leadership: she flocks to the fight, not just because she usually wins, but apparently because she likes it.

Read it, particularly the quote from former Trump Organization executive Barbara Res, repeated from The New York Times.

There is a similar article at The Washington Post by Jennifer Rubin titled “Trump lost. Period.”

U.S. President Donald Trump listens to remarks at a discussion on School Safety Report at the White House in Washington


Posted in "Big Bang Theory", alchemy, citizenship, Donald Trump, dump Trump, ecopragmatist, politics, reason, San Francisco, Speaker Nancy Pelosi | Leave a comment

“Collective reflection” and working together on climate issues in Massachusetts

This is an excerpt from an article which appeared at RealClimate. That, in turn, is a translation of the same article which appeared in Le Monde on 11th January 2019.

Recent discussions at climate-related blogs and among environmental activists make the portions of the excerpt which I have highlighted in bold especially pertinent.

What if the focus on the moods of climate scientists was a way to disengage emotionally from the choices of risk or solutions to global warming? Since the experts are worrying about it for us (it’s their daily life, isn’t it?), let’s continue our lives in peace. If feelings and expressing emotions – fear, anger, anguish, feelings of helplessness, guilt, depression – in the face of risks are legitimate, even necessary, to take action demands that we go beyond that. Catastrophism often leads to denial, a well-known psychic mechanism for protecting oneself from anxiety. Managing risk is part of our daily lives and supposes that we are not in such denial (active or passive) as it prevents clear and responsible action. Because we know that many hazards carry predictable risks, human societies have learned to anticipate and cope, for example, to limit the damage of storms or epidemics. The challenge of climate change is to build a strategy not in response to an acute and clearly identified risk, but in anticipation of a gradual, chronic increase in climate risks.

The climate scientists are alright (mostly), but that’s not the important question. The dispassionate management of climate risk will require that everyone – citizens, decision makers, teachers, intermediate bodies, companies, civil society, media, scientists – in their place and according to their means, take the time for a collective reflection, first of all through mutual listening. The news shows it every day: this process is hobbling along, too slowly for some, too fast for others. It will need to overcome emotional reactions, vested interests, and false information from the merchants of doubt. Those who are unable to review their strategy and have everything to lose from the exit from fossil-fuel based energies will use nit-picks, manipulation, short-termism, and promote binary and divisive visions, all of which undermine trust and pollute the debate. But despite that…

Every degree of warming matters, every year counts, every choice counts. The challenge is immense because of the nature and magnitude of the unprecedented risk. It requires doing everything to overcome indifference and fatalism.

And, in this regard, but obviously with no support from the authors of the above piece, one of the most constructive things the climate-concerned of Massachusetts can do right now, whatever your political background and stripe, is to throw your support behind Governor Baker’s proposal to tax real estate transfers as a funding source for climate mitigation and adaptation. The Globe quoted ELM and other environmental groups as having cautious support for the Governor’s proposal, but to stand on the sidelines and fail to back him against the likes of the Massachusetts Association of Realtors, quoted in the article, and probably Speaker Robert DeLeo, means they are more interested in their side winning than in making progress towards the common goals of mitigating climate change, adapting, and preventing worse. I have criticized Governor Baker, too. But this and his Executive Order 569 are really welcome, and I walk back what I said there: The Governor has either learned, or I was wrong in the first place.

I’m not the only one supporting him: Foley-Hoag thinks this is a good idea, but wants the Governor to do more.

The risks are here. The risks are now. There is already a 1-in-100 chance per year of an 8 inch rain or more in 24 hours. No Massachusetts stormwater infrastructure is capable of dealing with half of that. You think that risk small? There’s a 10% chance of that happening one or more times in 10 years. There’s a 4+% chance of it happening in 5 years. The chance of a 7 inch rain or more in 24 hours is 2% each year. Yet Massachusetts codes still allow 1960s standards for diurnal rain projections. These are no longer the 1960s.
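Those multi-year odds follow from the annual probabilities, assuming independence from year to year: the chance of at least one occurrence in n years is 1 - (1 - p)^n. A one-line check:

atLeastOnce<- function(p, n) 1 - (1 - p)^n
atLeastOnce(0.01, 10)   # about 0.096: the 10% chance of an 8 inch rain over 10 years
atLeastOnce(0.01, 5)    # about 0.049: the 4+% chance over 5 years
atLeastOnce(0.02, 10)   # about 0.18: the 2%-per-year 7 inch rain, over a decade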

So, which is it, all the people that say they want to fix climate change? Support or not? And if you don’t support this, where is your specific counterproposal? And if you don’t have one, you don’t deserve the label “climate activist” or “environmental activist”. Just settle for politician.

Update, 2019-01-23

A measure and program I find highly constructive is the Ceres Commit to Climate program for corporations.



Posted in Anthropocene, being carbon dioxide, citizenship, climate change, climate disruption, Commonwealth of Massachusetts, EBC-NE, Ecology Action, ecomodernism, ecopragmatism, environment, global warming, Governor Charlie Baker, greenhouse gases, Hyper Anthropocene, ILSR, investment in wind and solar energy, lobbying, local generation, Massachusetts, Massachusetts Clean Energy Center, New England, rights of the inhabitants of the Commonwealth | Leave a comment

What if Juliana v United States fails?

This is a replica of a comment I made at another site. As of 23:55 EST on 21st January, it hasn’t been released from moderation. Perhaps the moderator is busy. I do not know. I am proceeding as if it will not be released, because I will be too busy during the next week or so to monitor.

I am posting this as an expansion of an opinion I offered in public a few months ago; I was unable to include these ideas then because that opinion was strongly constrained by time.

[From a fellow commenter:]

So, who IS going to fix it? Oh, wait—-I know—-the free market that gave us the problem in the first place!

I’m hoping that the judiciary, via the Public Trust doctrine, might force the government to fix this. Professor Mary Wood’s book, Nature’s Trust, makes it very clear that Executive branch agencies entrusted with the preservation of the natural world become a licensing mechanism for permitting its wholesale destruction, in large measure, as you imply, by the “free market forces” leaning upon government. But that behavior appears ingrained in the Executive, and it doesn’t know how or what else to do without a guiding constraint. They’d do it, in other words, even without the lobbying lean, simply because there are non-business, non-corporate constituencies out there who don’t like to be constrained. Plenty of examples in the book. The American University Law Review article is a good synopsis.

But there is an aspect of notions of harm in case law which suggests that if a condition is shared by a large population, and, in this case, all the population, there is no standing to sue. The harm must be differentiated and special. This aspect is what darkens my view of this avenue. If we get that far, the other darkening comes from how unlikely it is that the remedy Juliana seeks would be granted in any form resembling the original.

As I have said publicly,

For should the plaintiffs of Juliana fail, the last government branch, the judiciary, abdicates responsibility for solving this urgent problem. And so the Constitution will have failed one of its existential requirements: To provide for the common defense. For Nature has laws, too, and we have been breaking them for a long time, ever more intensely. But Nature does not have courts of grievance or redress. Nature just acts. In a catastrophic sea level rise, perhaps triggered by a collapse of a distant ice sheet, Moakley Courthouse itself, the land you stand on, would be lost, and all that is there [in the City of Boston]. While disappointing, were Juliana to be overturned, this should not be a reason for despair. It would not mean the Constitution should be replaced. It would just mean it is useless for solving certain kinds of critically important problems. Its failure would imply the Constitution is becoming a dusty, old thing, irrelevant, like the Articles of Confederation are to us, a ceremonial relic. Let’s hope not.

There will be solutions to the climate problem in any case, Constitution or not. They may well be horrifically expensive. And, while there’s no solution without first zeroing emissions, solutions will exist. These will lie beyond the Constitution. I hope Chief Justice Roberts and his colleagues understand the import of that.

Solutions “beyond the Constitution” are solutions where global economic interests decide that climate change must be stopped, for their business is being harmed and their wealth is being lost. The “free market” is no more monolithic than any other group or section of human behavior or collective, and for every company which profits from the sale of fossil fuels and the use of the atmosphere as a sewer, there are three or more which simply use them as a means to an end. If a product harms during its use, and the buyer is not forewarned, the buyer, whether individual or corporation, has every right to pursue damages from the purveyor. Beyond that, the buyers have every motivation to band together with the similarly harmed and devise a means of fixing the situation.

The trouble, of course, is that to the degree these remedies are extra-governmental and extra-Constitutional, governments have little opportunity to steer them. They might have steered, by participating early on, but the governments, listening to @Gingerbaker’s “Us”, chose to pursue the comfortable, uncontroversial paths. To the extent governments cannot fix the problem without the consortium of collective buyers, they’re stuck. This is unfortunate. But this is what happens when fundamental responsibilities are repudiated.

Professor Dan Farber has recently offered his opinion of the status of Juliana. He’s an attorney. I’m not.

(Image: Callendar.)

(Image: Arrhenius.)

Posted in an ignorant American public, an uncaring American public, Anthropocene, being carbon dioxide, Boston Ethical Society, carbon dioxide capture, clear air capture of carbon dioxide, climate, climate business, climate change, climate disruption, climate economics, corporate litigation on damage from fossil fuel emissions, corporate supply chains, corporations, ecological disruption, ecomodernism, economics, ecopragmatism, environment, environmental law, extended producer responsibility, extended supply chains, First Parish Needham, fossil fuel divestment, fossil fuels, global warming, greenhouse gases, Hyper Anthropocene, investment in wind and solar energy, Juliana v United States, leaving fossil fuels in the ground, Mary C Wood, optimization, Our Children's Trust, pollution, population biology, population dynamics, Principles of Planetary Climate, quantitative biology, quantitative ecology, radiative forcing, rationality, reasonableness, sea level rise, sustainability, the tragedy of our present civilization, tragedy of the horizon, United States Constitution, United States Government, UU, UU Needham, zero carbon | Leave a comment

“About” section of this blog has been revised, and rules of commenting made more prominent

See the About section of this blog for a revised description of the blog and for the rules governing commenting, now made more explicit and prominent. In fact, I have copied them at the bottom of this post.

The heading of the blog has also been changed to more properly express my position and approach to addressing the climate emergency, acknowledging that I am an ecopragmatist and embrace the Ecomodernist Manifesto.

Rules Regarding Posting Comments on this Blog

  • I will not tolerate climate denial comments.
  • I will not tolerate creationist comments.
  • I will not tolerate insults or anything like slander against cultural groups, or groups based upon sexual preference or sexual identity.
  • This is primarily a technical blog. While friendly discussion and humor are welcome, positions and proclamations or arguments are expected to be accompanied by evidence or citations of evidence, whether as links or as figures or equations. \LaTeX is available. Authors of long derivations or similar contributions might want to consider using Overleaf/ShareLaTeX for their pieces.
  • Commenters who articulate extended interesting positions pertinent to the blog’s purpose may be asked to rewrite their comment as a guest post instead. If this happens, the comment will be held for moderation and the commenter contacted.
  • Commenters are expected to use unique handles, that is, they oughtn’t use multiple pseudonyms or email addresses for the same person. This is not only a rule of this blog, but is a stipulation of the TOS for wordpress.com.
  • I am happy to fix syntactic mistakes in comments. I find WordPress really ought to provide a way for commenters to revise their postings, and, in the absence of such, am happy to help.
  • I will never delete a comment without first simply holding it for moderation, and approaching the commenter, asking them to revise it, or explaining why I am holding it. In the absence of a reply, the comment may be held in moderation indefinitely.
  • Comments may be deleted if a requested change is virulently opposed, or if the commenter has engaged in a series of violations of the rules. Ultimately, as has happened, a commenter who abuses the rules will be banned from participation.
  • In the end, I reserve the right to determine what’s appropriate here or not. This is my blog. I pay for it. There is no subsidy or advertising that helps pay for it.

Posted in Anthropocene, blog, bridge to somewhere, Buckminster Fuller, CleanTechnica, climate change, ecology, Ecology Action, ecomodernism, ecopragmatism, ecopragmatist, engineering, global warming, Hermann Scheer, Hyper Anthropocene, ILSR, Joseph Schumpeter, leaving fossil fuels in the ground, local generation, local self reliance, Mark Carney, reasonableness, secularism, solar democracy, solar domination, Stewart Brand, technology, the energy of the people, the green century, Tony Seba, wind energy, wind power, zero carbon | Leave a comment

“From Single Use to Zero Waste: What’s New with Recycling”

Wednesday, January 30, 2019, 7:00-8:30 pm, at the South Shore Natural Science Center


Map:


The South Shore Natural Science Center and the South Shore Recycling Cooperative (SSRC) present:

This event will be live-streamed at the SSRC Facebook page.

Posted in ecology, environment, environmental law, recycling, South Shore Recycling Cooperative | Leave a comment

On the rheology of cats

Important paper.

Overview.

(Image: “Are cats liquid?”, from the Ig Nobel Prize coverage.)

(Dedicated to dumboldguy.)
Posted in science | 1 Comment

A never-ending litany of vituperation

There is a blog commenter whose handle is dumboldguy who used to comment here. The rules for commenting at this blog are clear and posted. He made some comments, along with extraneous material, and at first I left the comments but edited out the extraneous material.

He was annoyed by the editing, and I pointed out this is my blog and the rules are the rules.

He continued to comment, and I held a couple of those for moderation, asking him to provide references and links.

In the end, dumboldguy became angry and non-productive, and began to accuse all kinds of unfairness, and basically call me names. His standard for me at other blogs is “ecoquack”. I really don’t care. I was once called a “tree-hugging ecoweenie” by a climate denier, and I’m kind of proud of that moniker.

I don’t get a lot of comments. As I’ve noted, and noted to him, I don’t write this blog to make it popular. It has a reasonable following. I enjoy well-written comments, but in the spirit of the blog, claims should be footnoted with links to evidence and the like. I mostly use the blog to express things, and have a convenient place to put material, so I can cite it easily.

I also use the blog to document technical findings. dumboldguy for some reason finds these the most irritating of all, accusing me of some kind of elitism because I post them.

In the end, I needed to ban him from the blog. No problem. It happens. Users can be banned for the same reason WordPress has an automatic spam filter on comments.

But dumboldguy has continued his attacks in comments at other blog sites, most related to climate change mitigation and climate justice. I am not attempting to refute him here. I do engage at the other sites when appropriate.

However, I am beginning this blog post today to record the more abrasive of his comments at other sites, and provide a record of these attacks, by date. I am not providing his comments in full, but do provide links to them. Those will serve to provide context, unless dumboldguy manages to get himself banned elsewhere.

I don’t expect the updating of this blog post will end anytime soon.

2019-01-20, Sunday

link

Looks to me like ecoquack is suffering from the engineer’s typical inability to understand the English language, as well as the engineer’s tendency to focus on technology as the answer to all human problems.
.
.
.
If ecoquack could climb out of his engineer’s silo and really see what is being said, he would realize that the key words are OPPOSE CORPORATE SCHEMES THAT PLACE PROFITS OVER COMMUNITY BENEFITS, INCLUDING MARKET BASED MECHANISMS.

Looks to me like ecoquack is a trying to hijack the urgency of the climate emergency to advance his own set of objectives.

link

You again miss the point, just as the author of the Atlantic piece did. It’s not about mitigating climate change and competing “technologies”, it’s about fighting the politicians and so-called “capitalists” that want to prolong the system that gave us climate change in the first place. If we can’t break their stranglehold on what does or doesn’t get done, we are going nowhere.

And why do you again insist on throwing more maundering BS and self-admiration into your comment? We don’t give a rodent’s rear end that you know LaTeX and use it to crap up what could be said clearly in plain English—-it proves my point about your engineer’s cluelessness about communication.

And please don’t mention YOUR BLOG here again—-as I’ve said here before, it is not a site worth visiting except to view your self-admiration and egotism. Anyone who doesn’t agree with what they see there will be ignored or quickly banned if they persist.

link

JFC! Do you never tire of spouting bullshit and then sitting back and admiring how smart you are?

Now you’re going to say that the “proponents” are part of the great left wing conspiracy to take over the world? Have you even read Klein’s book?

Moderator's note: Actually, I have. I didn't think much of it. Suffice it to say I am a disciple of Hermann Scheer, Buckminster Fuller, and, above all, of Stewart Brand. See Brand's important book. Yeah, I'm an ecopragmatist as well as a solar revolutionary. dumboldguy apparently dislikes ecomodernism a lot.

I have—a copy sits on my bookshelf, and it is one of the best books ever written about climate change (or rather, as I said, how run-amok capitalism is the REAL problem). What parts of it do you dispute? Cite page numbers and let’s debate her points.

“….many countries are making progress reducing their emissions, and they care not revolutionary at all. Quite capitalist in fact”. BULLSHIT! The major emitters are NOT making progress.

Making common cause with conservatives? Massachusetts? What “goods” could be made from captured carbon? You waste our time with even MORE inane Bullshit!

“The engineering expertise is in corporations”? Actually, it’s drawn to wherever there is money to pay for it, as the government did by spending huge quantities on the Manhattan Project and Going to the Moon.

You are sounding more like a Republican corporate shill every time you open your mouth. Is it your “objective” to get your hooks into some of that $$$$?

2019-01-21, Monday

link

Ecoquacky does it again!

“Cumulative emissions are all that matter, because of the longevity, in atmosphere, of Carbon Dioxide. Annual emissions don’t matter at all. It’s ALL owned by the United States and Europe”.

Lord love a duck, but that’s one of the dumber things Quacky has said here. Yes, the US and Europe ARE to much to blame for the size of the cumulative emissions—-not surprising since that’s where the Industrial Revolution began and has been polluting longest—-but to say “annual emissions don’t matter at all” totally ignores the FACT that the rest of the world (whose population far outnumbers the West) is now producing an ever-increasing quantity of CO2 ANNUALLY , wants to have a living standard like that in the West, is going to NEED millions of air conditioners to survive the coming heat waves, and is still burning too much COAL (coal being the subject Quacky refuses to discuss).

Perhaps it’s time to remind Quacky of the old saw that everyone is entitled to their OPINION, no matter how half-assed, but NOT to their own facts. It is a simple FACT that ALL emissions—-past, present, and future—-are of concern.

link

Quacky just can’t quit. Now he’s swinging over to some BS about “moral and ethical responsibility” and “compensation”? WTF is he talking about?

How did we get to that from “cumulative emissions are all that matter”? (and why doesn’t he want to talk about coal—-the stake through the heart of humanity?)

I will repeat—-yes, cumulative emissions MAY have already doomed us, but if we don’t deal strongly with the “annual emissions” yet to come from EVERY country in the world, there is virtually no hope.

link

You agree? Swell! Who cares?

link

Redsky is correct, Quacky. You’re not.

CO2 levels remained stable for 1000’s of years until the Industrial Revolution. It wasn’t until the 1960’s that they started to ramp up, with the level in 1960 being ~315 ppm, only 40 ppm higher than it was 120 years before in 1840.

Those of us who were more aware than you of “the possibility of emissions having an effect” were worried about more visible and imminent threats back then—-dirty air and dirty water, toxic industrial waste, lead in gasoline and paint, DDT, acid rain, the ozone hole, resource depletion, overpopulation, SST’s, nuclear power, and more.

It wasn’t until the 1980’s and Hansen that we began to pay attention to GHG, and the near 100 ppm rise in ~60 years from 1960 until today, which is ~5 times the rate of increase before 1960.

http://www.sealevel.info/co2_and_ch4c.html

Not sure what your point is with “the government had been warned and cautioned repeatedly”. That’s not news, and it’s water over the dam anyway. Or is it just that you like to hear yourself quack?

Posted in blog, science | Leave a comment

“… [N]ew renewable energy capacity could quadruple that of fossil fuels over next three years”

This is utility-scale capacity only. See the footnote from the original post repeated at the bottom. Also, given uncertainties about data availability at federal Web sites during the partial federal shutdown, I have copied the cited report and placed it so it is publicly available in a safe location.

Quoting:

Washington DC – According to an analysis by the SUN DAY Campaign of the latest data released by the Federal Energy Regulatory Commission (FERC), natural gas dominated new electrical generating capacity in 2018. However, renewable energy sources (i.e., biomass, geothermal, hydropower, solar, wind) may be poised to swamp fossil fuels as new generating capacity is added over the next three years.

FERC’s “Energy Infrastructure Update” report (with data through November 30, 2018) notes that new natural gas generation placed in service during the first 11 months of 2018 totaled 16,687 MW or 68.46% of the total (24,376 MW). Renewable sources accounted for only 30.12% led by wind (3,772 MW) and solar (3,449MW).(*)

However, the same report indicates that proposed generation and retirements by December 2021 include net capacity additions by renewable sources of 169,914 MW. That is 4.3 times greater than the net new additions listed for coal, oil, and natural gas combined (39,414 MW).

Net proposed generation additions from wind alone total 90,268 MW while those from solar are 64,066 MW — each greater than that listed for natural gas (56,881 MW). FERC lists only a single new 17-MW coal unit for the three-year period but 16,122 MW in retirements. Oil will also decline by 1,362 MW while nuclear power is depicted as remaining largely unchanged (i.e., a net increase of 69 MW).

FERC’s data also reveal that renewable sources now account for 20.8% of total available installed U.S. generating capacity.(**) Utility-scale solar is nearly 3% (i.e., 2.94%) while hydropower and wind account for 8.42% and 7.77% respectively.

(*) FERC only reports data for utility-scale facilities (i.e., those rated 1-MW or greater) and therefore its data does not reflect the capacity of distributed renewables, notably rooftop solar PV which accounts for approximately 30% of the nation’s installed solar capacity.

(**) Capacity is not the same as actual generation. Capacity factors for nuclear power and fossil fuels tend to be higher than those for most renewables. For the first ten months of 2018, the U.S. Energy Information Administration reports that renewables accounted for 17.6% of the nation’s total electrical generation – that is, a bit less than their share of installed generating capacity (20.8%).

Source:

FERC’s 6-page “Energy Infrastructure Update for November 2018” was released in early January 2019. In a seeming departure from its norm, FERC did not announce the release of this report on its web page and a specific release date does not appear on the report itself. However, it is assumed the report was issued within the past week. It can be found at: https://www.ferc.gov/legal/staff-reports/2018/nov-energy-infrastructure.pdf. For the information cited in this update, see the tables entitled “New Generation In-Service (New Build and Expansion),” “Total Available Installed Generating Capacity,” and “Proposed Generation Additions and Retirements by October 2021.”

Posted in American Solar Energy Society, Anthropocene, Bloomberg New Energy Finance, BNEF, bridge to somewhere, Buckminster Fuller, clean disruption, CleanTechnica, decentralized electric power generation, decentralized energy, electricity, FERC, green tech, ILSR, investment in wind and solar energy, John Farrell, Joseph Schumpeter, leaving fossil fuels in the ground, local generation, local self reliance, natural gas, rate of return regulation, solar democracy, solar domination, solar energy, solar power, Sonnen community, the energy of the people, the right to know, the value of financial assets, Tony Seba, wind energy, wind power, zero carbon | Leave a comment

A look at an electricity consumption series using SNCDs for clustering

(Slightly amended with code and data link, 12th January 2019.)

Prediction of electrical load demand or, in other words, electrical energy consumption is important for the proper operation of electrical grids, at all scales. RTOs and ISOs forecast demand based upon historical trends and facts, and use these to assure adequate supply is available.

This is particularly important when supply is intermittent, such as solar PV generation or wind generation, but, to some degree, all generation is intermittent and can be unreliable.

Such prediction is particularly difficult at the small and medium scale. At large scale, relative errors are easier to control, because a large number of units drawing upon or producing electrical energy are aggregated. At the very smallest of scales, it may be possible to anticipate the usage of single institutions or households based upon historical trends and living patterns. This has only partly been achieved in devices like the Sense monitor, and prediction is still far away.

Presumably, techniques which apply to the very small could be scaled to deal with small and moderate-sized subgrids, although for moderate-sized subgrids these will probably be adaptations of the techniques used at large scale.

There is some evidence that patterns of electrical consumption directly follow the behavior of the building’s or home’s occupants that day, modulated by outside temperatures and the occurrence of notable or special events. Accordingly, being able to identify the pattern of behavior early in a day can offer powerful prior information for the consumption pattern that will hold later in the day.

There is independent evidence occupants do, in a sense, select their days from a palette of available behaviors. This has been observed in Internet Web traffic, as well as in secondary signals in emissions from transportation centers. Discovering that palette of behaviors is a challenge.

This post reports on an effort to do such discovery using a time series of electricity consumption for 366 days from a local high school. Consumption is sampled every 15 minutes.

Here is a portion of this series, with some annotations:

The segmentation is done automatically with a regime-switching detector. The portion below shows these data atop a short-time Fourier transform (STFT) spectrum of the same:

The point of this exercise is to cluster days together in a principled way, so as to attempt to derive a kind of palette. One “color” of such a palette would be a cluster. Accordingly, if a day is identified, from the preliminary trace of its electricity consumption, as being a member of a cluster, the bet is that the remainder of the day’s consumption will follow the patterns of other series seen in the cluster. If more than one cluster fits, then some kind of model average across clusters can be taken as predictive, obviously with greater uncertainty.
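Here is a minimal sketch of how that bet could be operationalized, assuming dayMatrix is a 366-by-96 matrix of quarter-hour readings and clusters is a vector of 366 cluster labels such as those produced by the dendrogram cut described below; it matches a partially observed day against each cluster’s mean profile and returns the best-matching cluster’s mean for the rest of the day.

predictRestOfDay<- function(partialDay, dayMatrix, clusters)
{
  k<- length(partialDay)                                    # quarter-hour intervals observed so far today
  profiles<- apply(dayMatrix, 2, function(col) tapply(col, clusters, mean))
  # profiles: one row of 96 quarter-hour means per cluster
  d<- apply(profiles[, 1:k, drop=FALSE], 1, function(p) sqrt(sum((p - partialDay)^2)))
  best<- which.min(d)
  list(cluster=names(d)[best], predictedRemainder=profiles[best, (k + 1):ncol(profiles)])
}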

(Click on figure to see larger image and then use browser Back Button to return to blog.)

Each of the 366 days of the 2007-2008 academic year was separated out, and pairwise dissimilarities between all days were calculated using the Symmetrized Normalized Compression Divergence (SNCD) described previously. The dissimilarity matrix was used with the default hierarchical clustering function in R, hclust, and its Ward-D2 method. That clustering produced the following dendrogram:

The facilities of the dynamicTreeCut package of R were used to find a place to cut the dendrogram and thus identify clusters. The cutreeDynamic function was called on the result of hierarchical clustering, using the hybrid method and a minimum cluster size setting of one, to give the cluster chooser free rein.
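For concreteness, here is a minimal sketch of that pipeline. The compressor choice (“xz” via base R’s memCompress) and the quantization of the readings are assumptions rather than the exact settings used here, and days is assumed to be a list of 366 numeric vectors, one per day, each holding the 96 quarter-hour readings.

library(dynamicTreeCut)

compressedSize<- function(x) length(memCompress(charToRaw(paste(round(x, 3), collapse=",")), type="xz"))

ncd<- function(x, y)
{
  cx<- compressedSize(x)
  cy<- compressedSize(y)
  cxy<- compressedSize(c(x, y))
  (cxy - min(cx, cy)) / max(cx, cy)
}

# Symmetrize over the two concatenation orders
sncd<- function(x, y) (ncd(x, y) + ncd(y, x)) / 2

nDays<- length(days)
D<- matrix(0, nrow=nDays, ncol=nDays)
for (i in seq_len(nDays - 1))
{
  for (j in (i + 1):nDays)
  {
    D[i, j]<- D[j, i]<- sncd(days[[i]], days[[j]])
  }
}

clustering<- hclust(as.dist(D), method="ward.D2")
clusters<- cutreeDynamic(dendro=clustering, distM=D, method="hybrid", minClusterSize=1)
table(clusters)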

Five clusters were found. Here they are, presented in several ways.

First, the dates and their weekdays:


$`1`
 2007-09-06  2007-09-07  2007-09-10  2007-09-14  2007-09-17  2007-09-18  2007-09-21  2007-09-25  2007-09-27  2007-10-01  2007-10-02  2007-10-03  2007-10-04  2007-10-09 
 "Thursday"    "Friday"    "Monday"    "Friday"    "Monday"   "Tuesday"    "Friday"   "Tuesday"  "Thursday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" 
 2007-10-10  2007-10-22  2007-10-23  2007-10-29  2007-10-31  2007-11-02  2007-11-05  2007-11-06  2007-11-13  2007-11-21  2007-11-28  2007-12-03  2007-12-04  2007-12-05 
"Wednesday"    "Monday"   "Tuesday"    "Monday" "Wednesday"    "Friday"    "Monday"   "Tuesday"   "Tuesday" "Wednesday" "Wednesday"    "Monday"   "Tuesday" "Wednesday" 
 2007-12-06  2007-12-11  2007-12-12  2007-12-14  2007-12-17  2007-12-18  2007-12-19  2008-01-03  2008-01-04  2008-01-11  2008-01-15  2008-01-16  2008-01-17  2008-01-18 
 "Thursday"   "Tuesday" "Wednesday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" 
 2008-01-22  2008-01-23  2008-01-24  2008-01-29  2008-01-30  2008-01-31  2008-02-05  2008-02-06  2008-02-07  2008-02-11  2008-02-12  2008-02-13  2008-02-25  2008-02-27 
  "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"   "Tuesday" "Wednesday"    "Monday" "Wednesday" 
 2008-02-28  2008-03-10  2008-03-12  2008-03-13  2008-03-14  2008-03-19  2008-03-24  2008-03-25  2008-04-01  2008-04-02  2008-04-03  2008-04-04  2008-04-11  2008-04-23 
 "Thursday"    "Monday" "Wednesday"  "Thursday"    "Friday" "Wednesday"    "Monday"   "Tuesday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"    "Friday" "Wednesday" 
 2008-04-28  2008-04-30  2008-05-05  2008-05-07  2008-05-09  2008-05-12  2008-05-19  2008-05-22  2008-05-27  2008-05-28  2008-06-01  2008-06-02  2008-06-04  2008-06-05 
   "Monday" "Wednesday"    "Monday" "Wednesday"    "Friday"    "Monday"    "Monday"  "Thursday"   "Tuesday" "Wednesday"    "Sunday"    "Monday" "Wednesday"  "Thursday" 
 2008-06-07  2008-06-10  2008-06-13  2008-06-17  2008-06-18  2008-06-19  2008-06-23  2008-06-24  2008-06-27  2008-07-01  2008-07-02  2008-07-05  2008-08-11  2008-08-18 
 "Saturday"   "Tuesday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"   "Tuesday"    "Friday"   "Tuesday" "Wednesday"  "Saturday"    "Monday"    "Monday" 
 2008-08-27 
"Wednesday" 

$`2`
 2007-09-03  2007-09-04  2007-09-08  2007-09-12  2007-09-13  2007-09-15  2007-09-20  2007-09-24  2007-09-29  2007-10-06  2007-10-07  2007-10-08  2007-10-12  2007-10-15 
   "Monday"   "Tuesday"  "Saturday" "Wednesday"  "Thursday"  "Saturday"  "Thursday"    "Monday"  "Saturday"  "Saturday"    "Sunday"    "Monday"    "Friday"    "Monday" 
 2007-10-20  2007-10-27  2007-10-28  2007-10-30  2007-11-03  2007-11-22  2007-11-23  2007-11-26  2007-12-01  2007-12-13  2007-12-24  2007-12-26  2007-12-28  2007-12-31 
 "Saturday"  "Saturday"    "Sunday"   "Tuesday"  "Saturday"  "Thursday"    "Friday"    "Monday"  "Saturday"  "Thursday"    "Monday" "Wednesday"    "Friday"    "Monday" 
 2008-01-05  2008-01-14  2008-01-21  2008-01-25  2008-02-02  2008-02-04  2008-02-09  2008-02-10  2008-02-15  2008-02-18  2008-02-19  2008-02-20  2008-02-21  2008-02-22 
 "Saturday"    "Monday"    "Monday"    "Friday"  "Saturday"    "Monday"  "Saturday"    "Sunday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" 
 2008-03-04  2008-03-06  2008-03-15  2008-03-18  2008-03-23  2008-03-28  2008-03-29  2008-04-05  2008-04-10  2008-04-16  2008-04-17  2008-04-18  2008-04-21  2008-04-22 
  "Tuesday"  "Thursday"  "Saturday"   "Tuesday"    "Sunday"    "Friday"  "Saturday"  "Saturday"  "Thursday" "Wednesday"  "Thursday"    "Friday"    "Monday"   "Tuesday" 
 2008-04-25  2008-05-01  2008-05-02  2008-05-08  2008-05-21  2008-05-24  2008-05-29  2008-06-08  2008-06-12  2008-06-21  2008-06-25  2008-06-26  2008-07-04  2008-07-06 
   "Friday"  "Thursday"    "Friday"  "Thursday" "Wednesday"  "Saturday"  "Thursday"    "Sunday"  "Thursday"  "Saturday" "Wednesday"  "Thursday"    "Friday"    "Sunday" 
 2008-07-07  2008-07-13  2008-07-18  2008-07-21  2008-07-22  2008-07-23  2008-07-24  2008-07-29  2008-07-30  2008-08-01  2008-08-02  2008-08-05  2008-08-06  2008-08-08 
   "Monday"    "Sunday"    "Friday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"   "Tuesday" "Wednesday"    "Friday"  "Saturday"   "Tuesday" "Wednesday"    "Friday" 
 2008-08-09  2008-08-10  2008-08-12  2008-08-13  2008-08-15  2008-08-16  2008-08-20  2008-08-28 
 "Saturday"    "Sunday"   "Tuesday" "Wednesday"    "Friday"  "Saturday" "Wednesday"  "Thursday" 

$`3`
 2007-09-05  2007-09-11  2007-09-19  2007-09-26  2007-09-28  2007-10-05  2007-10-11  2007-10-16  2007-10-17  2007-10-18  2007-10-19  2007-10-24  2007-10-25  2007-10-26 
"Wednesday"   "Tuesday" "Wednesday" "Wednesday"    "Friday"    "Friday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Friday" "Wednesday"  "Thursday"    "Friday" 
 2007-11-01  2007-11-07  2007-11-08  2007-11-09  2007-11-14  2007-11-15  2007-11-16  2007-11-19  2007-11-20  2007-11-27  2007-11-29  2007-11-30  2007-12-07  2007-12-10 
 "Thursday" "Wednesday"  "Thursday"    "Friday" "Wednesday"  "Thursday"    "Friday"    "Monday"   "Tuesday"   "Tuesday"  "Thursday"    "Friday"    "Friday"    "Monday" 
 2007-12-20  2007-12-21  2007-12-27  2008-01-02  2008-01-07  2008-01-08  2008-01-09  2008-01-10  2008-01-28  2008-02-01  2008-02-08  2008-02-14  2008-02-26  2008-02-29 
 "Thursday"    "Friday"  "Thursday" "Wednesday"    "Monday"   "Tuesday" "Wednesday"  "Thursday"    "Monday"    "Friday"    "Friday"  "Thursday"   "Tuesday"    "Friday" 
 2008-03-03  2008-03-05  2008-03-07  2008-03-08  2008-03-11  2008-03-17  2008-03-26  2008-03-27  2008-03-31  2008-04-07  2008-04-08  2008-04-09  2008-04-14  2008-04-15 
   "Monday" "Wednesday"    "Friday"  "Saturday"   "Tuesday"    "Monday" "Wednesday"  "Thursday"    "Monday"    "Monday"   "Tuesday" "Wednesday"    "Monday"   "Tuesday" 
 2008-04-24  2008-04-29  2008-05-06  2008-05-13  2008-05-14  2008-05-15  2008-05-16  2008-05-20  2008-05-23  2008-05-30  2008-06-03  2008-06-06  2008-06-09  2008-06-11 
 "Thursday"   "Tuesday"   "Tuesday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"   "Tuesday"    "Friday"    "Friday"   "Tuesday"    "Friday"    "Monday" "Wednesday" 
 2008-06-14  2008-06-16  2008-06-22  2008-07-14  2008-07-25  2008-08-19  2008-08-26 
 "Saturday"    "Monday"    "Sunday"    "Monday"    "Friday"   "Tuesday"   "Tuesday" 

$`4`
2007-09-01 2007-09-02 2007-09-09 2007-09-16 2007-09-22 2007-09-23 2007-09-30 2007-10-13 2007-10-14 2007-10-21 2007-11-04 2007-11-10 2007-11-11 2007-11-12 2007-11-17 2007-11-18 
"Saturday"   "Sunday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday" "Saturday"   "Sunday"   "Monday" "Saturday"   "Sunday" 
2007-11-24 2007-11-25 2007-12-02 2007-12-08 2007-12-09 2007-12-15 2007-12-16 2007-12-22 2007-12-23 2007-12-25 2007-12-29 2007-12-30 2008-01-01 2008-01-06 2008-01-12 2008-01-13 
"Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"  "Tuesday" "Saturday"   "Sunday"  "Tuesday"   "Sunday" "Saturday"   "Sunday" 
2008-01-19 2008-01-20 2008-01-26 2008-01-27 2008-02-03 2008-02-16 2008-02-17 2008-02-23 2008-02-24 2008-03-01 2008-03-02 2008-03-09 2008-03-16 2008-03-21 2008-03-22 2008-03-30 
"Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday"   "Friday" "Saturday"   "Sunday" 
2008-04-06 2008-04-12 2008-04-13 2008-04-19 2008-04-20 2008-04-26 2008-04-27 2008-05-03 2008-05-04 2008-05-10 2008-05-11 2008-05-17 2008-05-18 2008-05-25 2008-05-31 2008-06-15 
  "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday" "Saturday"   "Sunday" 
2008-06-29 2008-07-12 2008-07-19 2008-07-20 2008-07-26 2008-07-27 2008-08-03 2008-08-17 2008-08-24 2008-08-31 
  "Sunday" "Saturday" "Saturday"   "Sunday" "Saturday"   "Sunday"   "Sunday"   "Sunday"   "Sunday"   "Sunday" 

$`5`
 2008-03-20  2008-05-26  2008-06-20  2008-06-28  2008-06-30  2008-07-03  2008-07-08  2008-07-09  2008-07-10  2008-07-11  2008-07-15  2008-07-16  2008-07-17  2008-07-28 
 "Thursday"    "Monday"    "Friday"  "Saturday"    "Monday"  "Thursday"   "Tuesday" "Wednesday"  "Thursday"    "Friday"   "Tuesday" "Wednesday"  "Thursday"    "Monday" 
 2008-07-31  2008-08-04  2008-08-07  2008-08-14  2008-08-21  2008-08-22  2008-08-23  2008-08-25  2008-08-29  2008-08-30 
 "Thursday"    "Monday"  "Thursday"  "Thursday"  "Thursday"    "Friday"  "Saturday"    "Monday"    "Friday"  "Saturday" 

Note that most of the weekend days are in cluster 4 along with a Christmas Tuesday (25 December 2007) and Veterans Day (observed) on a Monday, 12 November 2007, and a Good Friday, 21 March 2008. Assigning meanings to the other clusters depends upon having events to mark them with. It’s known, for example, that the last day of school in 2008 was 20th June 2008. Unfortunately, the academic calendars for 2007-2008 have apparently been discarded. (I was able to find a copy of the 2008 Westwood High School yearbook, but it is not informative about dates, consisting primarily of photographs.) Accordingly, it’s necessary to look for internal consistency.

There is a visual way of representing these findings. The figure below, a reproduction of the one at the head of the blog post, traces energy consumption for the high school during each day. The abscissa shows hours of the day, broken up into 96 15-minute intervals. For each of 366 days, the energy consumption recorded is plotted, and the lines connected. Each line is plotted in a different color depending upon the day of the week. The colors are faded by adjusting their alpha value so they can be seen through.

Note how days with flat energy consumption tend to be in a single color. These are apparently weekend days.

Atop each of the lines describing energy consumption, a black numeral has been printed which gives the cluster number to which the day was assigned. These are printed at the highest point of their associated curves, but are jittered so they don’t stack atop one another and become hard to distinguish.

(Click on figure to see larger image and then use browser Back Button to return to blog.)

The clusters go along with distinct consumption characteristics. A proactive energy management approach would entail examining the activities done on the days in each of the clusters. Of special interest would be clusters such as 1 and 3, which have very high energy usage.

Code and data

The code and data reviewed here are available in my Google replacement for a Git repository.

Future work

I am next planning to apply this clustering technique to long neglected time series of streamflow in Sharon, MA and on the South Shore.

Posted in American Statistical Association, consumption, data streams, decentralized electric power generation, dendrogram, divergence measures, efficiency, electricity, electricity markets, energy efficiency, energy utilities, ensembles, evidence, forecasting, grid defection, hierarchical clustering, hydrology, ILSR, information theoretic statistics, local self reliance, Massachusetts, microgrids, NCD, normalized compression divergence, numerical software, open data, prediction, rate of return regulation, Sankey diagram, SNCD, statistical dependence, statistical series, statistics, sustainability, symmetric normalized compression divergence, time series | Leave a comment

On plastic bag bans, and the failure to realize economic growth cannot be green

(Updated 2019-01-12.)

Despite the surge of interest in plastic bag bans, the environmental sustainability numbers haven’t been run. For example, it makes no sense to trade plastic bags for paper ones, even if the paper is recycled, because paper is a nasty product to make, and more emissions are involved in shipping paper bags than plastic ones. Paper bags are heavier, get wet, and cost towns and residents more to recycle or dispose of.

The City of Cambridge, Massachusetts, put fees on all retail bags, but did that after studying the matter for seven years. Reports on their study are available at the City of Cambridge Web site.

Even reusable bags have upstream impacts of their own and, if used, must be reused one or two hundred times to offset those impacts in comparison with plastic bags, downstream impacts and all. The biggest problem people have with reusable bags is remembering to bring them along.

We don’t really know what happens to plastic bags in oceans, apart from anecdotal evidence of harm to macroscale creatures. Cigarette filters and microplastics seem to persist.

See the podcast from BBC’s “Costing the Earth”

for some of the complexities.

Wishful environmentalism can be damaging: It consumes policy good will and energy on the part of activists, and misses addressing substantial problems, like expansive development, which cause far greater harm to the natural world. And, worse, the “feel good” of not using plastic bags, or of helping to ban them, tends to justify personal behaviors which are more damaging, such as taking another aircraft flight for fun that hasn’t been properly offset in its emissions (*). Air travel is a huge contributor, and has, thus far, never been successfully penalized for its contributions to human emissions. The last round on that was fought during the Obama administration, which fiercely negotiated with Europe not to have to pay extra fees for landing in EU airports.

The hard fact is economic growth cannot be green. Quoting:

Study after study shows the same thing. Scientists are beginning to realize that there are physical limits to how efficiently we can use resources. Sure, we might be able to produce cars and iPhones and skyscrapers more efficiently, but we can’t produce them out of thin air. We might shift the economy to services such as education and yoga, but even universities and workout studios require material inputs. Once we reach the limits of efficiency, pursuing any degree of economic growth drives resource use back up.

These problems throw the entire concept of green growth into doubt and necessitate some radical rethinking. Remember that each of the three studies used highly optimistic assumptions. We are nowhere near imposing a global carbon tax today, much less one of nearly $600 per metric ton, and resource efficiency is currently getting worse, not better. Yet the studies suggest that even if we do everything right, decoupling economic growth with resource use will remain elusive and our environmental problems will continue to worsen.

This sounds discouraging, but I am not discouraged. The natural world has repeatedly dealt with species which were resource hogs. That it ends poorly for the species which do so is a salutary lesson for those which can observe it, assuming they learn.

Claire bought me a wonderful book for the holidays. It’s Theory-based Ecology by Pásztor, Botta-Dukát, Magyar, Czárán, and Meszéna, and I got it for my Kindle Oasis. It has a number of themes but two major ones are (1) exponential growth of unstructured populations, and (2) the inevitability of population regulation. By the latter they mean organism deaths due to insufficient resources, or, in other words, growth beyond the carrying capacity.

In our case, that kind of collapse or growth is mediated by an economic system, one which suffers its own periodic collapses. Accordingly, the choice is whether to keep hands off and allow such a collapse, via a Minsky moment, to occur on its own, or, instead, to intervene and have a controlled descent. We are not as self-sustaining as we collectively think, and developed countries, although wealthier and replete with resources, also have a greater cross section for impact and harm.

Our choice.

Update, 2019-01-12

From The Hill, “Will a market crash get the action we need on climate change?”:

So, what’s the good news? The end of denial by financial markets and government leaders is nearly at hand. For most investors, the risks of climate change loom beyond their investment horizon. It’s been easy for investors to operate in a speculative carbon bubble, acting as though there are no impending costs to earnings-per-share or to liabilities in their portfolios from the buildup of carbon in the atmosphere. But these costs may increasingly look real, and when investors start taking these costs into account, markets will revalue: not just oil and gas stock, but all stocks.

Companies have facilities that will be flooded or be without needed water for production; supply chains will need to be rebuilt; costs of transportation will increase. What about the costs to financial institutions as communities need to be abandoned because of flood or drought? What are the fiscal consequences to governments of rebuilding airports, roads and other critical infrastructure? What will happen to consumer spending?

There will be winners and losers in this revaluation, but as past speculative bubbles have shown us, when they burst, markets move very quickly.

Government leaders have likewise largely operated in a bubble. It is the rare leader who can spend political (or taxpayer) capital on addressing an over-the-horizon problem. When the bubble bursts, government leaders will need to address the real concerns of rebuilding infrastructure, food and water security, and public health threats that will be seen by voters as imminent.


(*) This is actually pretty straightforward to do. Here’s our formula.

There is something called the New England Wind Fund. Essentially, contributions are used to buy WRECs, and one WREC prevents 842 pounds of CO2 emissions on the electric grid. Carbonfootprint.com offers a CO2 travel calculator. It tells how much CO2-equivalent is produced by a flight. (They offer calculators for other modes of travel, too.) They also offer you a vehicle for offsetting right on the spot, but I do not recommend using it. They do also make available a check box for additional forcing effects, which I always check. This is because emissions at typical aircraft altitudes are worse than at sea level or on the ground.

The result is in metric tonnes, where 1 metric tonne is 1000 kilograms. There are about 2.2 lbs per kilogram. So 1 WREC prevents about 0.38 metric tonnes, or 383 kilograms, of CO2 emissions.

For a trip, calculate the emissions you will make, convert that to WRECs, and then go to the New England Wind Fund site and contribute US$40 for each WREC.

Done.
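For those who like to see the arithmetic in one place, here is a tiny helper, assuming the calculator reports metric tonnes of CO2-equivalent and that one WREC offsets 842 pounds; the trip size in the example is made up.

wrecsForTrip<- function(tonnesCO2e, lbPerWREC=842, lbPerKg=2.20462, dollarsPerWREC=40)
{
  tonnesPerWREC<- lbPerWREC / lbPerKg / 1000   # about 0.38 metric tonnes per WREC
  wrecs<- ceiling(tonnesCO2e / tonnesPerWREC)
  c(WRECs=wrecs, contributionUSD=dollarsPerWREC * wrecs)
}

wrecsForTrip(1.6)   # a hypothetical round trip estimated at 1.6 tonnes CO2e: 5 WRECs, US$200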

I don’t recommend using the carbonfootprint.com offset because, while it could be fine, carbon offsetting programs need constant auditing and checking, and there are some unscrupulous operators out there who use these for greenwashing purposes only. I know New England Wind, though, and these contributions really do get converted into WRECs.

Posted in adaptation, an ignorant American public, an uncaring American public, Anthropocene, development as anti-ecology, E. O. Wilson, environment, evidence, evolution, exponential growth, fragmentation of ecosystems, global warming, greenwashing, Humans have a lot to answer for, Hyper Anthropocene, local self reliance, plastics, population biology, quantitative biology, quantitative ecology, supply chains, sustainability, sustainable landscaping, The Demon Haunted World, the right to be and act stupid, the right to know, the tragedy of our present civilization, the value of financial assets, tragedy of the horizon | Leave a comment

Hogwarts Hymn

Posted in Harry Potter, J K Rowling | Leave a comment

My most political post yet … yeah, but it’s me, and Bill Maher is, most of the time, what I’m down with.

Sorry, but there are distinctions to be made.

Posted in Bill Maher, objective reality | Leave a comment

International climate negotiations, the performance: `Angry and upset’

Climate Adam, who you should follow:

Posted in adaptation, American Association for the Advancement of Science, Anthropocene, carbon dioxide, Carbon Worshipers, climate change, Glen Peters, global warming, Hyper Anthropocene, Kevin Anderson | Leave a comment

Love your home. The place we call home needs love. But love means nothing without action.

Posted in Ørsted, bridge to somewhere, Buckminster Fuller, climate disruption, decentralized electric power generation, decentralized energy, ecological disruption, electricity, green tech, Green Tech Media, investment in wind and solar energy, local generation, solar democracy, solar domination, solar energy, solar power, Spaceship Earth, sustainability, the energy of the people, the green century, tragedy of the horizon, utility company death spiral, wind energy, wind power, zero carbon | Leave a comment