“Getting our heads out of the sand: The facts about sea level rise” (Robert Young)

If current luck holds, North Carolina may well escape the 2013 hurricane season without the widespread damage that has so frequently plagued the fragile coastal region in recent years. Unfortunately, this brief respite is almost certainly only that — a temporary breather.

Experts assure us that the impacts of climate change (including rising oceans and frequent, damaging storms) are sure to remake the coast in myriad ways over the decades to come and will, quite likely, permanently submerge large tracts of real estate.

So, what does our best science predict? And what can and should we do — especially in a state in which policymakers have actually passed a law denying that sea level rise is even occurring?

Dr. Robert Young of Western Carolina University, professor of geology, an accomplished author and a nationally recognized expert on the future of our developed shorelines, explores answers to these and related questions.

NC Policy Watch presents — a Crucial Conversation Featuring Dr. Robert S. Young, professor of geology and Director of the Program for the Study of Developed Shorelines at Western Carolina University.

See their Storm Surge Viewer, especially if you are interested in buying or developing shoreline property.


Time to turn page on natural gas – CommonWealth Magazine


Source: Time to turn page on natural gas – CommonWealth Magazine

Also see this, and this.


Eversource withdraws from the Spectra-Algonquin “Access Northeast” pipeline project

EversourceWithdraws_2016-08-23_145137
(Click on image to see a bigger copy. Use browser Back Button to return to blog.)

Yes!

Now let’s hope the remaining customers for Spectra’s Access Northeast pull out, and FERC denies permission to proceed. Their next meeting is 22nd September 2016.

Lossage

InefficienciesSlide_12

Update, 2016-08-24

National Grid and the rest of the utilities have pulled out of Spectra-Algonquin’s Access Northeast.

More.


“Naïve empiricism and what theory suggests about errors in observed global warming”

A post from one of my favorite statistics-oriented bloggers, Variable Variability, dealing with a subject too casually passed over.

See Naïve empiricism and what theory suggests about errors in observed global warming.


“Understanding Climate Change with Bill Nye”, on Dr Neil deGrasse Tyson’s “Star Talk”

Bill Nye hosts Dr Neil deGrasse Tyson‘s Star Talk Radio, featuring climate change and NASA’s Dr Gavin Schmidt. (See also RealClimate.)


ECS2x, land, sea, and all that

ecs_2016-08-21_135738

from http://dx.doi.org/10.1126/science.1203513

P.S. I wrote more here. Reproduced below …

Practical likelihood functions are very flat-topped, so the idea that a maximum likelihood estimate (MLE) can be confined to a point is a theoretical mirage. See Chapter 3 of S. Konishi, G. Kitagawa, Information Criteria and Statistical Modeling, Springer, 2008. Even if you want to set aside Bayesian considerations, whose priors tend to sharpen the posteriors, the best you can do is expected likelihoods, because likelihoods in practice, just like p-values, are random variables. Accordingly, the MLE is a neighborhood, because a point has probability mass zero.
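To see this concretely, here is a toy illustration of my own (not from Konishi & Kitagawa): for a Normal mean, the log-likelihood is nearly flat close to its maximum, so a whole interval of parameter values is supported almost as well as the point estimate.

```python
import numpy as np

# Toy illustration: the log-likelihood of a Normal mean near its maximum.
rng = np.random.default_rng(42)
x = rng.normal(loc=3.0, scale=1.0, size=100)

# Log-likelihood of N(mu, 1) over a grid, up to an additive constant.
mu_grid = np.linspace(2.0, 4.0, 2001)
loglik = np.array([-0.5 * np.sum((x - mu) ** 2) for mu in mu_grid])

mle = mu_grid[np.argmax(loglik)]
# The "MLE neighborhood": every mu whose log-likelihood is within 0.5
# log-units of the maximum -- a whole interval, not a single point.
near = mu_grid[loglik >= loglik.max() - 0.5]
print(mle, near.min(), near.max())
```

For n = 100 that interval has width about 0.2, and it shrinks only like 1/√n, so in practice the maximum is a neighborhood rather than a point.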

Besides, … the question of multimodality [wasn’t addressed]. The actual equilibrium climate sensitivity is a combination of the densities over oceans and land, each of which has a different distribution and modes. (See https://goo.gl/pB7H24 which is from http://dx.doi.org/10.1126/science.1203513) Accordingly, their combination is (at least) bimodal. Ocean ECS has 4 modes. Land ECS has 2 modes, one slightly higher than the other, the higher being at +3.4°C and the second at about +3°C. Worse, the variance of land ECS is over twice that of the oceans.

Finally, what you should be looking at is the ECS2x over land, not the combined figure. Even granting that one wants to go with the location of the highest mode, that’s +3.4°C.
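As a sketch of why such a combination is multimodal, consider mixing two Normal densities standing in for ocean and land ECS (the modes, weights, and spreads below are illustrative stand-ins of my own, not the paper's values):

```python
import numpy as np

# Mixture sketch: two unimodal densities with separated modes combine
# into a bimodal density. All numbers here are hypothetical placeholders.
def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(0.0, 7.0, 1401)
ocean = normal_pdf(x, 2.0, 0.4)       # stand-in ocean ECS density
land = normal_pdf(x, 3.4, 0.6)        # stand-in land ECS density, wider spread
combined = 0.7 * ocean + 0.3 * land   # weighted mixture of the two

# Count strict interior local maxima of the combined density.
mid = combined[1:-1]
n_modes = int(np.sum((mid > combined[:-2]) & (mid > combined[2:])))
print(n_modes)
```

Even with only one mode per component the combination is bimodal; with four ocean modes and two land modes, as in the actual densities, the combined picture is messier still.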


A litany of climate depression

Bill Nye talks about the show, here.


“Disrupt climate disruption”

From Science Music Videos

And if you have the time, a 52 minute movie …

Power concedes nothing without a demand.

“No leader is coming to save us [from climate disruption].”


Carbon Sinks in Crisis — It Looks Like the World’s Largest Rainforest is Starting to Bleed Greenhouse Gasses

This is the kind of thing that’s expected of a +3°C world, although the idea of it being a threshold phenomenon is a bit unrealistic. Rather, it’s expected that these sinks might, overall, start releasing their sequestered Carbon, one here one year, another there another year. But if the biggest sinks start releasing theirs first, well, this is one of the Climate Surprises the IPCC and the U.S. National Climate Assessment talk about. And they are not at all good.

robertscribbler

Back in 2005, and again in 2010, the vast Amazon rainforest, which has been aptly described as the world’s lungs, briefly lost its ability to take in atmospheric carbon dioxide. Its drought-stressed trees were not growing and respiring enough to, on balance, draw carbon out of the air. Fires roared through the forest, transforming trees into kindling and releasing the carbon stored in their wood back into the air.

These episodes were the first times that the Amazon was documented to have lost its ability to take in atmospheric carbon on a net basis. The rainforest had become what’s called carbon-neutral. In other words, it released as much carbon as it took in. Scientists saw this as kind of a big deal.

This summer, a similar switch-off appears to be happening again in the Amazon. A severe drought is again stressing trees even as it is fanning…

View original post 1,144 more words


An Energy Revolution

Professor Mara Prentiss speaks at Harvard on the possibility of an “energy revolution”:

Update, 2016-08-16

Although I am not a PhD professor like Professor Prentiss, nor am I associated with an institution as esteemed as Harvard University, I disagree with her point regarding the need for natural gas, based upon my studies of the solar energy business over the last two years. I also assuredly hope that natural gas is not the only way to transition to zero Carbon energy, because, from what I know of the climate science, we could be in very, very serious trouble if we need to go through an intermediate CO2-spewing step that persists for another few decades.


Can the City of Boston adapt to and help mitigate climate disruption?

(See the major update at the bottom of this post as well.)

(On “Less Science and More Social Science” at And Then There’s Physics)

And Then There’s Physics is one of my favorite blogs discussing climate disruption and related policy (in my climate blogroll, for instance). There was a recent post regarding another post by a science blogger called Stoat (one William Connolley) on the limitations of science for dictating mitigation and adaptation policy. Read there for context.

But along the way, a commenter, mt (13th August 2016 at 6:25 p.m.), citing the 1979 `Charney Commission report’, suggested that science can and has done little more, even with the IPCC. I composed a Comment which suggested at least one city, Boston, was trying to enlist science in its detailed response and planning.

That Comment apparently did not make it through moderation at ATTP, or got lost through a technical glitch, as sometimes happens. I worked on it a bit, so am reproducing it here instead. (As can be seen by the Comment below, apparently there was a technical glitch, and that Comment has now been posted.)


@mt,

Well, the City of Boston is engaged in a pretty deliberate process to ascertain climate impacts, determine what City policy and planning should dictate (especially with respect to sea level rise and storm surge), and identify needed investments. It is informed by science, beginning with climate projections for Boston, to be followed by three additional reports: an Integrated Vulnerability Assessment, a detailing of Resilience Strategies, and a Final Report and Implementation Roadmap. The last three appear to be late, but there is a hard stop of sorts in the form of a Climate Vulnerabilities & Solutions Symposium on the 15th of September, which I am attending.

Attendees will include representatives from local financial firms, banks, insurers and re-insurers, as well as businesses, utilities, real estate people, government people, NGOs and attorneys. There already was a presentation of the Climate Projections Consensus at which there were many representatives of these stakeholders.

Come September, it will be interesting to see how these groups think about the problem, and where they are landing in terms of a mix of the three basic choices,

  1. wait-and-see, with willingness to take on and deal with damage as it comes,
  2. make some preparations, but basically remain in place, or
  3. prepare to abandon the present location of the City, and begin preparations to assess where to go.

This is in part because, as Bank of England head Mark Carney describes it, “Climate change is an economic problem.” He wants to avoid a Minsky moment.


The economy is a wholly owned subsidiary of the environment, not the other way around.

Gaylord Nelson


It should be noted as well that the City of Boston has volunteered itself to host a United States-China Climate Summit in 2017. Whether that applies additional pressure, as I suspect, or gives the City cover for delay and greenwashing is anyone’s guess. We’ll see in September.

Update, 2016-08-28

It might be slightly premature, but it seems, as of today, that the answer to the rhetorical question posed in the headline-title of this post is “No”: the City of Boston does not know how, or does not want, to adapt to climate change, including sea level rise.

The basis for my conclusions is the recent history of Climate Ready Boston reports:

  1. Spring 2016, “Climate Projections Consensus” (in hand and available)
  2. Coming, Summer 2016, “Integrated Vulnerability Assessment: Assessing the potential impacts of climate change on Boston’s buildings, infrastructure, environmental systems, and communities” (missing in action)
  3. Coming, Summer 2016, “Resilience Strategies: Developing preliminary ideas for projects, policies, and programs to help Boston’s neighborhoods and infrastructure respond to climate change and become more resilient” (missing in action)
  4. Coming, Summer 2016, “Final Report And Implementation Roadmap: Pulling the findings and initiatives together with a roadmap to address major vulnerabilities” (missing in action)

There is a long-planned 5.5-hour “symposium” scheduled for 15th September 2016, titled “Boston’s Climate Vulnerabilities & Solutions Symposium”. Presumably these reports were to be completed in time to inform this symposium. Instead, the agenda for the symposium consists of:

  1. Opening remarks
  2. An overview of Climate Ready Boston
  3. Resilience Interventions In Boston: Existing Buildings, New Construction & District Solutions. This includes the following speakers and panel members:
    • John Cleveland, Director, Boston Green Ribbon Commission
    • John Messervy, Director of Capital & Facility Planning, Partners Healthcare
    • Ben Myers, Sustainability Manager, Boston Properties
    • Jeff Wechsler, Marketing Director-Acquisitions, Tishman Speyer
  4. Refreshments & Vendor Expo
  5. Financing And Policy Solutions For Resilience. This includes the following speakers and panel members:
    • Michael E. Mooney, Chairman, Nutter McClennen & Fish LLP
    • Rebecca Davis, Deputy Director, MAPC
    • John Markowitz, Vice President – Infrastructure Finance, MassDevelopment
    • Sara Myerson, Director of Planning, BRA
  6. Closing Remarks, by Austin Blackmon, Chief of Environment, Energy & Open Space, City of Boston

Note:

  • No major political figures
  • No representatives from large financial firms, or insurance firms. The financial district is located a block or two from the Atlantic Avenue Wharf, and is more or less downhill from it
  • No local property owners from Atlantic Avenue, people who were in attendance at a HUCE presentation and panel discussion on getting Boston ready for climate change. That meeting included discussions of planning to move the City.

I can only speculate why this process is deflating. Even at the time of the HUCE meeting, it was clear Boston was not taking measures for preparedness as much as, say, the City of Cambridge is. Of the three possible responses to sea-level rise, in the absence of any other statement, Boston has made a commitment to wait-and-see and remain-in-place. Unfortunately, this also means that commercial and other development along the Boston waterfront will continue as if nothing is going to happen.

So, in retrospect, the commenter @mt was correct and I was wrong. And Boston is taking the path of the lottery player.
Boston-head-in-sand

I have cancelled my registration for the symposium and will, instead, be attending the 2016 Cleantech Energy Storage Finance Forum that evening.


Repaired R code for Markov spatial simulation of hurricane tracks from historical trajectories

I’m currently studying random walk and diffusion processes and their connections with random fields. I’m interested in this because at the core of dynamic linear models, Kalman filters, and state-space methods there is a random walk in a parameter space. The near-term goal is to understand how to apply these techniques to series of categorical data, and understand the relationship between, say,

W. Li, “Markov chain random fields for estimation of categorical variables”, Mathematical Geology (2007) 39: 321–335. DOI 10.1007/s11004-007-9081-0

and

K. V. Mardia, C. Goodall, E. J. Redfern, F. J. Alonso, “The kriged Kalman filter”, Test, December 1998, 7(2), 217–282. DOI 10.1007/BF02565111

Along the way, I came across a mention of work done by Christophe Denuse-Baillon in his actuarial thesis, which appears to study and describe a method of simulating North Atlantic hurricane tracks using a Markov spatial process calibrated with historical tracks. The idea, essentially, is to make a random walk from a suitably chosen starting point, taking steps conditional upon the relative frequencies of directions taken from the current position. In other words, the Markovian assumption of conditional independence is made, so the next step depends only upon the probabilities of taking a step in directions centered on the current one.

I would not know the details, since Denuse-Baillon’s thesis is in French, but Professor Arthur Charpentier introduced the idea and provided some R code for implementing Denuse-Baillon’s idea on actual hurricane tracks. The thing is, the datasets he drew upon have naming mistakes which need to be repaired before they can be used; his blog post did not identify these, although he did note they existed. More seriously, he did not provide a complete, runnable set of R code to reproduce his results, something which would be quite useful to students of the technique and the problem.

This blog posting remedies that, providing runnable R code for replicating his results (a tarball) and fixing the mistakes in the datasets on the fly. Note that the R code relies upon the XML, maps, ks, and RColorBrewer packages, which also need to be installed.

The general technique is interesting in that it offers a non-parametric, indeed model-free, way of forecasting hurricane tracks.
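For readers who want the gist before unpacking the tarball, here is a toy version of the step-generation idea, in Python rather than R, and entirely my own sketch (the headings, step length, and fabricated “historical” sequences are hypothetical, not Denuse-Baillon’s or Professor Charpentier’s actual scheme):

```python
import numpy as np

# Toy sketch of the Markovian step idea: discretize headings into 8
# compass sectors, estimate P(next heading | current heading) from
# "historical" heading sequences, then simulate a track step by step.
rng = np.random.default_rng(7)
K = 8  # number of discrete headings (45-degree sectors)

# Hypothetical stand-ins for historical track heading sequences.
tracks = [rng.integers(0, K, size=30) for _ in range(50)]

# Count observed heading pairs, with add-one smoothing so every
# transition has positive probability, then normalize rows.
counts = np.ones((K, K))
for t in tracks:
    for a, b in zip(t[:-1], t[1:]):
        counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)

# Random walk: each new heading depends only on the previous one.
pos = np.array([-40.0, 15.0])  # hypothetical (longitude, latitude) start
heading = 6
path = [pos.copy()]
for _ in range(40):
    heading = rng.choice(K, p=P[heading])
    angle = 2.0 * np.pi * heading / K
    pos = pos + 0.5 * np.array([np.cos(angle), np.sin(angle)])
    path.append(pos.copy())
path = np.array(path)
print(path.shape)
```

In a real application the heading sequences would come from the repaired historical datasets, and the transition probabilities could further be conditioned on location, which is what makes the process a spatial one.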

I have reproduced the figures Professor Charpentier showed below:

MarkovHurricanes_tracks_only_2016-08-11_194335

MarkovHurricanes_densityplot_2016-08-11_194817

MarkovHurricanes_startingpoints_2016-08-11_210738

MarkovHurricanes_sim_trajectories_2016-08-11_210825

Larger versions of these figures can be seen by clicking on any one of them, and then using your browser Back Button to return to the blog.

The R code has been cleaned up some, and made a bit more robust.

Robert Grant also remarked upon Professor Charpentier’s work.

Note that if a poor starting point is chosen, the trajectory generation step can take a while.

I also want to make note of the spMC package by Luca Sartore, available via CRAN, which he describes here. Abstract:

Currently, a part of the R statistical software is developed in order to deal with spatial models. More specifically, some available packages allow the user to analyse categorical spatial random patterns. However, only the spMC package considers a viewpoint based on transition probabilities between locations. Through the use of this package it is possible to analyse the spatial variability of data, make inference, predict and simulate the categorical classes in unobserved sites. An example is presented by analysing the well-known Swiss Jura data set.


Living deliberately in Washington, D.C. (courtesy of The Atlantic magazine)

The adventures of Keya Chatterjee and her family living free of Pepco. Courtesy of The Atlantic magazine.


Energy Democracy

I’ve actually written about this before, but John Farrell of the ILSR (the “Institute for Local Self-Reliance”; “Self-Reliance” is a famous Emerson essay, by the way) presents an up-to-date synthesis of developments, incorporating policy as well as Tony Seba-like, Hermann Scheer-like, and Michael Osborne-like insights.

By the way, the dreaded duck curve, which utility wonks and even some IEEE engineers from the PES I’ve listened to worry about (more details here), does not seem to be materializing in two of the strongest renewables markets in the United States.

Update, 2016-08-21

From CleanTechnica:

Distributed Power

It’s a matter of political and philosophical debate, but I agree with the idea that society is generally better off when socioeconomic and political power are distributed. (Granted, a benevolent dictator can be a wonderful gift for a society, but most dictatorships don’t tend to be very benevolent from what I’ve seen.)

While we do live in a somewhat democratic society (in the US, Canada, UK, Europe, Australia, India, Korea, Japan, or wherever you are probably reading from), there’s no doubt that money = power, and people with more money or representing more money have more power in politics and society.

With regard to this matter, we often think of powerful people and companies in the telecommunications, media, banking, and real estate industries. Clearly, though, these aren’t the only ones trying to steer more cash to their executives than to society as a whole.

Of course, with many utilities being regulated monopolies, these are powerful giants as well (no pun intended). Despite the fact that they are regulated, the vast majority of us can’t name the people who regulate them, and there is rampant corruption in the sector. Largely, we don’t even know what they’re doing. I think we typically take utilities for granted and leave their work almost invisible — they’re there, we have to pay them to keep the lights and computers on, someone is watching over them to make sure they don’t fleece us (too much), etc.

A more obvious “enemy of the societal good” is the fossil energy industry. Burning coal and natural gas kills millions of people prematurely every year. We somehow accept burning these fossils as a necessity of modern life (though, given the state of clean technologies like solar and EVs, we no longer should), but we also know that these industries work hard to not clean up their processes and emit less pollution. They lobby government and fight huge wars against regulation with millions and millions of dollars that could have just gone toward protecting more lives from pollution. But hey, what can we do?

To find out, read their article.

RooftopSolarShiftsPower

DivestFromFossilFules

Update: 2016-08-23

From ILSR, again:

I wouldn’t presume to define energy democracy for all those using the term, but I think those of us that use it share these common principles:

  • Energy democracy means both the sources (e.g. solar panels) and ownership of energy generation are distributed widely.
  • Energy democracy means that the management of the energy system be governed by democratic principles (e.g. by a public, transparent, accountable authority) that allows ordinary citizens to have a say. This means that communities that wish greater control over their energy system (via municipalization of utilities, for example) should have minimal barriers to doing so.
  • Energy democracy means that the wide distribution of power generation and ownership, and access to governance of the energy system be equitable by race and socioeconomic status.


“Holy crap – an actual book!”

You’ll find links to Cathy O’Neil’s important book in the Blogroll here, as well as a link to reviews of it.

I have not read it yet: while I have pre-ordered it, it’s not yet available. I have read the reviews, all favorable. I suspect it expresses a lot of concern that statisticians, including myself, have long expressed regarding “data mining” (data dredging) and inferences derived from “big data”: a lack of concern about the quality of the data, forgetting about cleaning, and whether, despite the size of a dataset, there is enough data to warrant a strong estimate or inference in a particular case.

These arguments have been made by many, like Simply Statistics‘ “Why big data is in trouble: they forgot about applied statistics” from 2014. Or Bajorski’s “Applied Statistics Comes to the Rescue of Big Data”. And the wave of emphasis upon pure machine learning methods of late does not help: These obfuscate and confound, making it harder to understand what’s really going on.
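A toy simulation of my own makes the “big data” point concrete: a small simple random sample can beat an arbitrarily large sample whose inclusion depends on the value being measured.

```python
import numpy as np

# Toy illustration: the population mean is 0. A selection-biased sample
# remains biased no matter how large it is; a small random sample does not.
rng = np.random.default_rng(1)
population = rng.normal(0.0, 1.0, size=1_000_000)

# A small, honest simple random sample.
small_random = rng.choice(population, size=100, replace=False)

# "Big data": units with larger values are more likely to be recorded.
inclusion = 1.0 / (1.0 + np.exp(-2.0 * population))
biased_big = population[rng.random(population.size) < inclusion]

print(len(biased_big), small_random.mean(), biased_big.mean())
```

No amount of additional biased data removes the sizable positive offset in the big sample’s mean; only attention to how the data were collected does.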

Anyway, I very much look forward to reading the book!

mathbabe

Yo, everyone! The final version of my book now exists, and I have exactly one copy! Here’s my editor, Amanda Cook, holding it yesterday when we met for beers:

20160809_161608

Here’s my son holding it:

20160809_161558

He’s offered to become a meme in support of book sales.

Here’s the back of the book, with blurbs from really exceptional people:

20160810_074117

In other exciting book news, there’s a review by Richard Beales from Reuters BreakingViews, and it made a list of new releases in Scientific American as well.

Endnote:

I want to apologize in advance for all the book news I’m going to be blogging, tweeting, and otherwise blabbing about. To be clear, I’ve been told it’s my job for the next few months to be a PR person for my book, so I guess that’s what I’m up to. If you come here for ideas and are turned off by cheerleading, feel…

View original post 34 more words


Dramatis personæ: How to do zero Carbon emissions at a residence (Westwood, MA)

IMG_5223
(To see a larger picture for this and all images, click on image and then use browser Back Button to return to blog.)

Our external meter is now reading negative.

The peak reading was 2850 in April 2016; the solar panels were grid-connected on 31 December 2015.
eversource_2016-08-09_090651

Now, and for the last couple of months, our electric bill has been negative:
currentEversource_2016-08-09_090736

This is the trend heading to zero on the meter, together with a calculation of household electricity consumption. We heat with electricity, heat our hot water with electricity, and cook on an induction stove:
2016-08-09_090109
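The arithmetic behind that trend is just net metering, which can be sketched as follows (all kWh figures below are invented for illustration, not our actual readings):

```python
# Net-metering sketch: consumption pushes the meter reading up,
# solar generation pushes it down.  Numbers are invented.

def meter_readings(start, monthly_use, monthly_generation):
    """Running meter readings, month by month."""
    reading, readings = start, []
    for use, gen in zip(monthly_use, monthly_generation):
        reading += use - gen
        readings.append(reading)
    return readings

# A winter-to-summer run: generation overtakes use as days lengthen.
print(meter_readings(2850, [700, 650, 600, 550], [400, 900, 1100, 1200]))
# -> [3150, 2900, 2400, 1750]
```

Once cumulative generation overtakes cumulative use, the reading passes through zero and goes negative, exactly as in the screenshots.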

How?

Air source heat pumps for heating and cooling:
IMG_5201

You can find the latest on these systems here. We have models ASU12RLF, AUU12RLF, ASU9RLF.

IMG_5205

IMG_5208

IMG_5209

IMG_5212

IMG_5217

IMG_5219

IMG_5214

Air source hot water heater (General Electric):
IMG_5233

IMG_5234

And, of course, 29 SunPower PV panels installed by RevoluSun:
pvo_2016-08-09_091422

So far we’ve generated 6.6 megawatt-hours:
IMG_5232

rs_2016-08-09_091522

IMG_5226

IMG_5224

Our inverter:
IMG_5227

And this is the oil heating system we orphaned to do this:
IMG_5235

Posted in Anthropocene, Bloomberg New Energy Finance, bridge to somewhere, Buckminster Fuller, citizenship, clean disruption, climate business, climate disruption, climate economics, decentralized electric power generation, decentralized energy, electricity, electricity markets, energy, energy reduction, energy utilities, fossil fuel divestment, global warming, green tech, grid defection, Hyper Anthropocene, investment in wind and solar energy, local generation, Mark Jacobson, Massachusetts Clean Energy Center, Michael Osborne, microgrids, New England, open data, regime shifts, RevoluSun, Sankey diagram, solar domination, solar energy, solar power, Spaceship Earth, SunPower, sustainability, the energy of the people, the green century, the value of financial assets, Tony Seba, wind energy, wind power, zero carbon | Leave a comment

TOO LATE: “There will be no golden age of [natural] gas”

Last month, Tom Randall at Bloomberg New Energy Finance (“BNEF”) profiled a new forecast which shows costs for zero Carbon energy and energy storage plummeting so fast that coal, oil, and natural gas will begin their terminal decline within a decade.

natural-gas_large

Since 2008, the single most important force in U.S. power markets has been the abundance of cheap natural gas brought about by fracking. Cheap gas has ravaged the U.S. coal industry and inspired talk of a “bridge fuel” that moves the world from coal to renewable energy. It doesn’t look like that’s going to happen.

The costs of wind and solar power are falling too quickly for gas ever to dominate on a global scale, according to BNEF. The analysts reduced their long-term forecasts for coal and natural gas prices by a third for this year’s report, but even rock-bottom prices won’t be enough to derail a rapid global transition toward renewable energy.

“You can’t fight the future,” said Seb Henbest, the report’s lead author. “The economics are increasingly locked in.” The peak year for coal, gas, and oil: 2025.

Wind-Power-Cost-per-Kwh

In addition to the pace of technology improvements in solar, storage, and wind, an estimated US$8 trillion is poised to be dropped into renewable energy, even if, through 2040, people will, dare I say, waste US$2 trillion developing more fossil fuels. By 2027, BNEF estimates, building new wind and solar power will be cheaper than even operating fossil fuel plants. And capacity factors for peaking and standard-generation natural gas plants will be hit by the increasing availability of extremely low cost zero Carbon energy, meaning the capital costs of those fossil fuel plants, spread over fewer and fewer generated kilowatt-hours, will skyrocket. These assets will rapidly be stranded. And this is just on price: it makes no accounting for the impact of Carbon fees, which look increasingly plausible, if only to raise monies to clean up the mess the fossil fuel industry has made of our atmosphere.
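The capacity-factor squeeze can be made concrete with a little arithmetic (the numbers below are illustrative, not BNEF's): a plant's fixed annual capital charge is spread over the kilowatt-hours it actually generates, so the per-kWh capital cost grows in inverse proportion to the capacity factor.

```python
# Illustrative only: how a falling capacity factor inflates the per-kWh
# capital cost of a fossil plant.  All numbers are hypothetical.

def capital_cost_per_kwh(overnight_cost_per_kw, fixed_charge_rate, capacity_factor):
    """Annualized capital cost, spread over the kWh actually generated."""
    annual_capital_charge = overnight_cost_per_kw * fixed_charge_rate  # $/kW-year
    annual_kwh_per_kw = 8760.0 * capacity_factor                       # hours/year * CF
    return annual_capital_charge / annual_kwh_per_kw                   # $/kWh

# A gas plant assumed at $1000/kW overnight cost and a 10% fixed charge rate:
for cf in (0.60, 0.30, 0.15):
    print(f"capacity factor {cf:.0%}: ${capital_cost_per_kwh(1000, 0.10, cf):.3f}/kWh")
```

Halve the capacity factor and the per-kWh capital cost doubles; that is the mechanism by which cheap renewables strand gas plants without ever touching the fuel bill.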
ModulePrice
And, from what I have seen of fossil fuel companies, their infrastructure builders, and their PR people, it couldn’t happen to a nicer bunch of folks. They deserve it. No tears here. A new kind of climate justice.

But, as far as climate disruption goes, people should not get complacent. Even if all this happens on this schedule, it won’t happen fast enough to prevent serious disruption of life and business due to climate effects. Accordingly, we need to push. And if the public is convinced to waste its share of that US$2 trillion on fossil fuels and their infrastructure, that could seriously slow us down in making the change we need to make.

160802.215.2313.n18

Taking CO2 out of the atmosphere once it’s there is not easy, and it is very expensive.

Important Update

The Massachusetts Supreme Judicial Court today roundly denied Republican Governor Baker, his Department of Public Utilities, Eversource, National Grid, and pipeline companies (notably Spectra Energy) the vehicle whereby ratepayers would have been charged to pay for construction of additional natural gas capacity into Massachusetts for the generation of electricity.

My favorite part of today’s SJC decision, SJC-12051 and SJC-12052, ENGIE GAS & LNG LLC vs. DEPARTMENT OF PUBLIC UTILITIES (also Conservation Law Foundation vs. Department of Public Utilities), emphasis added:

Perhaps most importantly, however, the department’s order would reexpose ratepayers to the very types of risks that the Legislature sought to protect them from when it enacted the restructuring act. Both the DOER and the department noted that gas-fired generating businesses are unwilling to assume the risks associated with long-term gas pipeline capacity contracts because there “is no means by which they can” assure recovery of those contract costs. Shifting that risk onto the electric ratepayers of the Commonwealth, however, is entirely contrary to the risk-allocation design of the restructuring act. Equally unavailing is the department’s finding that the order does not contravene the policy embodied in the restructuring act because it does not allow the use of ratepayer funds to construct a power plant. D.P.U. 15-37, at 27. As prior decisions by this court and the department make clear, power plant construction is only one aspect of the electric generation market, and in enacting the restructuring act, the Legislature sought to separate all aspects of generation from all aspects of distribution. See, e.g., D.T.E. 98-13, at 4; D.T.E. 98-84, at 1. Moreover, the department itself has recognized that fuel procurement and planning is an integral component of the generation business, as evidenced by its exemption of electric distribution companies from § 69I. Indeed, by some estimations, fuel-related costs constitute seventy-five per cent of a natural gas-fired plant’s generation costs. 3 World Scientific Handbook of Energy 72 (G.M. Crawley ed., 2013). Accordingly, prior to the enactment of the restructuring act, the department required electric companies to consider both the type and amount of fuel they would use to generate power when they calculated whether they could supply enough electricity to match expected demand. 
We agree with the plaintiffs that if the restructuring act does not allow electric distribution companies to finance investments in electric generation, it cannot be reasonably interpreted to permit those companies to invest in infrastructure unrelated to electric distribution service. Accordingly, we reject the department’s reasoning. See Cardin v. Royal Ins. Co. of Am., 394 Mass. 450, 456-457 (1985) (agency’s interpretation of statute “hardly persuasive where [it] violates the language and policy of the statute,” [quotation and citation omitted]).

The department’s interpretation of the statute as permitting electric distribution companies to shift the entire risk of the investment to the ratepayers is unreasonable, as it is precisely this type of shift that the Legislature sought to preclude through the restructuring act. Contrast D.P.U. 12-77, at 28 (Mar. 15, 2013) (“The legislation restructured the electric industry in the state by providing incentives to investor-owned electric distribution companies to divest their generating assets and by adopting a competitive market structure for the generation and purchase of electricity. This restructuring shifted the risks of generation development from consumers to generators, who are better positioned to manage those risks”).

Our interpretation of the restructuring act is supported by the Legislature’s own actions since the law’s enactment. That is, where the Legislature has sought to override the risk allocation policy of the act, it has done so expressly. First, in 2008, through enactment of the Green Communities Act, St. 2008, c. 169, the Legislature directed electric distribution companies to seek proposals from renewable energy developers, and, if they received reasonable proposals, to enter into ratepayer-backed long-term contracts to buy the renewable power. See St. 2008, c. 169, § 83. The Legislature concluded that such contracts were necessary to “facilitate the financing of renewable energy generation facilities.” Alliance to Protect Nantucket Sound, Inc. v. Department of Pub. Utils. (No. 1), 461 Mass. 166, 168 (2011). Importantly, in enacting the Green Communities Act, the Legislature explicitly provided the department with the authority to review and approve the ratepayer-backed renewable energy contracts. St. 2008, c. 169, § 83 (“[a]ll proposed contracts shall be subject to the review and approval of the department of public utilities”).

The Green Communities Act represents a legislatively created exception to the restructuring act’s general prohibition on electric distribution companies owning generation assets. To facilitate promotion of renewable energy in the Commonwealth, the Legislature allowed each distribution company to construct, own, and operate twenty-five megawatts of solar energy before January 1, 2009, and 50 megawatts after January 1, 2010. St. 2008, c. 169, § 58. Section 58 further provided that an electric distribution company had to obtain prior approval for cost recovery from the department in order to recover construction costs of a solar generation facility. Id. Although the statute has since been amended, it continues to provide an express, limited exemption from the restructuring act. See St. 2012, c. 209, § 17.
…
Here, the department’s stated motive in issuing the order is to correct a perceived failure of market-based incentives to encourage wholesale generators to contract for adequate pipeline capacity. However, its means of doing so, namely by reallocating risk onto the ratepayers, is clearly prohibited by legislative policy. Thus, no matter how salutary the department may claim its policy aims to be, its order contravenes the fundamental policy embodied in the restructuring act and cannot stand. See Utility Air Regulatory Group v. Environmental Protection Agency, 134 S. Ct. 2427, 2446 (2014) (agency authority to interpret ambiguities in enabling statute “does not include a power to rewrite clear statutory terms to suit its own sense of how the statute should operate”); Wakefield Teachers Ass’n v. School Comm. of Wakefield, 431 Mass. 792, 802 (2000) (fundamental policy decisions are province of Legislature, and not coordinate branches of government).

Update, 2016-08-28: The beginning of the end

From Bloomberg New Energy Finance:
TheBeginningOfTheEnd--image27-e1429976056747

Posted in adaptation, American Petroleum Institute, Anthropocene, Bloomberg, Bloomberg New Energy Finance, BNEF, bridge to nowhere, Buckminster Fuller, Chevron, citizenship, clean disruption, climate change, climate disruption, climate economics, climate justice, decentralized electric power generation, decentralized energy, disruption, distributed generation, Ecology Action, economics, efficiency, electricity, electricity markets, energy, energy storage, energy utilities, engineering, explosive methane, Exxon, false advertising, forecasting, fossil fuel divestment, fossil fuels, fracking, geoengineering, global warming, green tech, greenhouse gases, grid defection, Gulf Oil, Hermann Scheer, Hyper Anthropocene, investment in wind and solar energy, ISO-NE, Joseph Schumpeter, leaving fossil fuels in the ground, Mathematics and Climate Research Network, methane, microgrids, natural gas, petroleum, pipelines, rate of return regulation, rationality, reason, reasonableness, regime shifts, risk, solar domination, solar energy, solar power, Standard Oil of California, stranded assets, supply chains, sustainability, the energy of the people, the green century, the right to be and act stupid, the right to know, the stack of lies, the tragedy of our present civilization, the value of financial assets, utility company death spiral, wind energy, wind power, zero carbon | 1 Comment

A model of an electrical grid: A vision

Many people seem to view the electrical grid of the future as being much like the present one. I think a lot about networks because of my job, and I especially think a lot about network topologies, although primarily concerning the Internet.

Presently, most utilities’ electrical grids are hub-and-spoke topologies, like Boston’s somewhat wanting subway system: there is a center, distribution is fed from it, and generation feeds it from an array of generators who bid to supply it and are paid for doing so. The whole is overseen by independent system operators (ISOs) like ISO-NE. Sometimes there isn’t enough generation and one ISO needs to borrow from the next, but that is rare and, in this market-based system, expensive. Bidding is done day-ahead.

What I envision is a completely different topology.

The equilibrium state of such a grid or network would consist of spatially separated clusters, roughly proportional to population, with some cleaving along political boundaries. Each cluster would have its own generation sources — wind, solar, storage — and, of course, its own demand. Roughly half the time (or more!) each cluster gets all the power it needs from its own generation. The rest of the time, the cluster draws power from available power in nearby clusters, coordinated as a super-cluster. The great majority of the time, say 90+%, each super-cluster is completely self-sufficient. There may be community wind or solar generation at the super-cluster level which ties the clusters together. It is entirely possible that the distribution lines and substations at the super-cluster level are owned by an entity not associated with what we would identify as a “utility”. At best (or worst) it would be like a municipal utility or energy cooperative. (These already exist in Massachusetts.) They maintain the distribution network. While they are a kind of monopoly, like utilities, they have a smaller population to serve and, accordingly, need to answer for the level of service they do or do not provide. Their board might even be elected.

When a super-cluster fails to provide for its own energy, it can either request it from neighboring super-clusters, or it can approach what remains of the present-day utility, the vestige of a hub or grid. Such an entity would be economically much smaller than today’s utilities, having sold off most of its assets to super-clusters, but its role would remain. Its political influence would be accordingly smaller.

There is also a much reduced need for an ISO in this model. Rather than relying on a centrally planned and centrally controlled manager of energy, the cluster system can be mathematically designed and implemented, via control theory, to be self-balancing and self-regulating. Accordingly, there is little need for command decisions, although there continues to be a need for monitoring and reporting to national authorities and agencies.
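The balancing rule such a design implies can be sketched in a few lines (a toy dispatch with invented numbers, not a control-theoretic design): each cluster serves its own demand first, deficits are covered by surplus within the super-cluster, and only the residue falls back on the vestigial central hub.

```python
# Toy dispatch for the clustered grid sketched above.  Hypothetical
# numbers; a real design would be dynamic and control-theoretic.

def dispatch(clusters):
    """clusters: list of (generation, demand) pairs in one super-cluster.
    Returns (hub_import, surplus) after local and intra-super-cluster netting."""
    deficits = sum(max(d - g, 0.0) for g, d in clusters)
    surpluses = sum(max(g - d, 0.0) for g, d in clusters)
    shared = min(deficits, surpluses)      # neighbors cover what they can
    hub_import = deficits - shared         # only this reaches the old hub
    surplus = surpluses - shared           # left for export or storage
    return hub_import, surplus

# Three clusters: two generate more than they use, one runs short.
print(dispatch([(12.0, 10.0), (8.0, 6.0), (5.0, 8.0)]))   # -> (0.0, 1.0)
```

In this example the short cluster is made whole entirely by its neighbors, and the hub sees no demand at all, which is the point of the topology.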

Naturally, transitioning from the present central system to such a clustered approach will have its bumps and pitfalls. I think this will be technology and price driven, and there will be entrepreneurs who will create the local storage and digital energy overlay networks to make this happen. It would probably go easier if there were an oversight agency or agencies. New York State appears to be trying to do something like that, but I’m skeptical that Massachusetts, for instance, can, since its governance is tied up and torn between a number of fierce vested interests.

I bet New York will get there first, and I bet New York’s economy will be much the better for it, at least relative to Massachusetts’s.

Finally, such a system, hierarchically decomposed, will be far more robust and resilient than the present one, if much less profitable, because it depends less upon extended supply chains. Its price volatility will be a fraction of the volatility in the present system, offering less reward for arbitrage. It will also be vastly more efficient in its use of energy. There will be no need for natural gas pipelines. These are all good things.

This is the kind of “grid defection” or, rather, grid transformation most policy leaders don’t imagine.

(Below, some gratuitously included clips from the Canettes Blues Band, literally, a group of LHC physicists who play the blues.)

Posted in abstraction, American Meteorological Association, anomaly detection, Anthropocene, Bloomberg New Energy Finance, BNEF, Boston, bridge to somewhere, Buckminster Fuller, Canettes Blues Band, clean disruption, climate business, climate economics, complex systems, corporate supply chains, decentralized electric power generation, decentralized energy, demand-side solutions, differential equations, distributed generation, efficiency, EIA, electricity, electricity markets, energy, energy reduction, energy storage, energy utilities, engineering, extended supply chains, green tech, grid defection, Hermann Scheer, Hyper Anthropocene, investment in wind and solar energy, ISO-NE, Kalman filter, kriging, Lawrence Berkeley National Laboratory, leaving fossil fuels in the ground, Lenny Smith, local generation, marginal energy sources, Massachusetts Clean Energy Center, Mathematics and Climate Research Network, mesh models, meteorology, microgrids, networks, New England, New York State, open data, organizational failures, pipelines, planning, prediction markets, public utility commissions, PUCs, rate of return regulation, rationality, reason, reasonableness, regime shifts, regulatory capture, resiliency, risk, Sankey diagram, smart data, solar domination, solar energy, solar power, Spaceship Earth, spatial statistics, state-space models, statistical dependence, statistics, stochastic algorithms, stochastics, stranded assets, supply chains, sustainability, the energy of the people, the green century, the value of financial assets, thermodynamics, time series, Tony Seba, utility company death spiral, wave equations, wind energy, wind power, zero carbon | Leave a comment

Bayesian blocks via PELT in R

The Bayesian blocks algorithm of Scargle, Jackson, Norris, and Chiang has an enthusiastic user community in astrostatistics, in data mining, and among some in machine learning. It is a dynamic programming algorithm (see VanderPlas, referenced below) and, so, exhibits optimality when used without performance-related shortcuts. There is an implementation by Scargle and others in MATLAB, and a popular version exists in the astropy and astroML modules for Python. Scargle introduced the idea around 2001 and has repeatedly improved it. I searched for a comparable implementation in R, the programming language I use most heavily, and could not find one.

In time, however, I discovered the work of Killick, Fearnhead, and Eckley, led to it by an oceanographer colleague. They build upon the work of Jackson, Scargle, Barnes, Arabhi, Alt, Gioumousis, Gwin, Sangtrakulcharoen, Tan, and Tsai, but find a pruning step which lets the algorithm achieve the same result in linear time. Their algorithm, known as PELT, has been incorporated into two R packages on CRAN, changepoint and changepoint.np, and has found use in oceanography and other areas. Still, a one-for-one substitute for the Scargle-Jackson-Norris-Chiang Bayesian blocks was missing, and it is the purpose of this post to remedy that.

  • R. Killick, P. Fearnhead, and I. A. Eckley, “Optimal detection of changepoints with a linear computational cost”, Journal of the American Statistical Association, December 2012, 107(500), DOI: 10.1080/01621459.2012.737745
  • R. Killick, I. A. Eckley, K. Ewans, P. Johnson, “Detection of changes in the characteristics of oceanographic time-series using changepoint analysis”, Ocean Engineering, 2010, 37(13), DOI: 10.1016/j.oceaneng.2010.04.009
  • R. Killick, I. A. Eckley, “changepoint: An R package for changepoint analysis”, Journal of Statistical Software, June 2014, 58(3), DOI: 10.18637/jss.v058.i03
  • B. Jackson, J. D. Scargle, D. Barnes, S. Arabhi, A. Alt, P. Gioumousis, E. Gwin, P. Sangtrakulcharoen, L. Tan, T. T. Tsai, “An algorithm for optimal partitioning of data on an interval”, IEEE Signal Processing Letters, February 2005, 12(2), DOI: 10.1109/LSP.2001.838216
  • J. D. Scargle, J. P. Norris, B. Jackson, J. Chiang, “Studies in astronomical time series analysis. VI. Bayesian Block representations”, The Astrophysical Journal, 20 February 2013, 764(167), DOI: 10.1088/0004-637X/764/2/167
  • J. VanderPlas, “Dynamic programming in Python: Bayesian Blocks”, Pythonic Perambulations, 12 September 2012, a blog entry
  • bayesian_blocks, a component of astropy.stats, the astropy module for Python
  • 4.1 Bayesian Blocks: Histograms the right way, a component of 4. Unsupervised Learning: Density Estimation, the astroML module for Python
  • J. D. Scargle, “Bayesian Blocks in two or more dimensions: Image segmentation and cluster analysis”, arXiv:math/0111128v1

Jake VanderPlas is one of the astroML authors and, specifically, the author of the Bayesian blocks implementation therein. He offers a description of the Scargle algorithm, explaining it as an instance of a dynamic programming code pattern. He also offers a clever mixture distribution consisting of five Cauchy distributions for illustrating his code, a mixture I have adopted for illustrating this, my implementation of Bayesian blocks in R. (R code is at the link.)
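For readers who want the mechanics without installing anything, here is a minimal pure-Python sketch of the underlying O(N²) dynamic program, using Scargle's fitness for event (point) data. It is not the R implementation of this post (that is at the link), and it omits the PELT pruning that makes the real thing linear-time:

```python
import math

def bayesian_blocks(t, ncp_prior=4.0):
    """Bayesian blocks for event (point) data, after Scargle et al. (2013),
    via the O(N^2) dynamic program VanderPlas describes.  `ncp_prior`
    penalizes each additional change point.  Assumes distinct samples."""
    t = sorted(t)
    n = len(t)
    # Candidate block edges: the ends, plus midpoints between samples.
    edges = [t[0]] + [0.5 * (t[i] + t[i + 1]) for i in range(n - 1)] + [t[-1]]
    best, last = [0.0] * n, [0] * n
    for r in range(n):
        # Fitness of a final block spanning cells k..r, for every k.
        fit_max, k_max = -math.inf, 0
        for k in range(r + 1):
            width = edges[r + 1] - edges[k]
            count = r - k + 1
            fit = count * (math.log(count) - math.log(width)) - ncp_prior
            if k > 0:
                fit += best[k - 1]
            if fit > fit_max:
                fit_max, k_max = fit, k
        best[r], last[r] = fit_max, k_max
    # Walk back through the change points to recover block edges.
    change_points, r = [], n
    while r > 0:
        change_points.append(last[r - 1])
        r = last[r - 1]
    return [edges[i] for i in reversed(change_points)] + [edges[n]]
```

Feeding it, say, a sparse run of events followed by a dense burst returns block edges with a boundary near where the event rate changes; the PELT trick then prunes the inner loop over `k` without changing the answer.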

Comparing its performance to that of a standard histogramming facility, truehist in the MASS package, we see the following:
bb_histogram_comparison_2016-08-01_195029
(Click on figure to see a larger image, then use browser Back Button to return to blog.)

The result from VanderPlas’ implementation can be seen at his blog, just above the Conclusion at its bottom.

But VanderPlas’ mixture can also serve as a series for another test. Bayesian blocks has been proposed as a signal representation too, one which can be used for compression as well as for specifying features. The implementation here, based upon the Killick-Fearnhead-Eckley PELT, can do that as well, as can be seen in the figure below:
bb_segmentation_of_series_2016-08-01_195518
(Click on figure to see a larger image, then use browser Back Button to return to blog.)

The code is available from my Google space as a gzip‘d R file. Please feel free to use it for anything you like. Not only would I be happy to answer questions about it, I’d also like to know what kinds of things you are using it for, and to receive suggestions for improvements. Please comment below!

I am also intending to generalize the code so it can be used for multidimensional Bayesian blocks, as Scargle has suggested (in J. D. Scargle, “Bayesian Blocks in two or more dimensions: Image segmentation and cluster analysis”, arXiv:math/0111128v1), but, of course, based upon the Killick-Fearnhead-Eckley PELT. My particular interest is cluster-finding.

Posted in American Statistical Association, AMETSOC, anomaly detection, astrophysics, Cauchy distribution, changepoint detection, engineering, geophysics, multivariate statistics, numerical analysis, numerical software, numerics, oceanography, population biology, population dynamics, Python 3, quantitative biology, quantitative ecology, R, Scargle, spatial statistics, square wave approximation, statistics, stepwise approximation, time series, Woods Hole Oceanographic Institution | Leave a comment

“Full-depth Ocean Heat Content” reblog

This is a re-blog of an excellent post at And Then There’s Physics, titled Full-depth OHC or, expanded, “full-depth ocean heat content”.

Since my holiday is now over, I thought I might briefly comment on a recent paper by Cheng et al., called Observed and simulated full-depth ocean heat-content changes for 1970–2005. John Abraham, o…

Source: Full-depth OHC

Posted in Anthropocene, climate, climate change, climate data, climate disruption, climate models, computation, differential equations, ensembles, environment, fluid dynamics, forecasting, geophysics, global warming, greenhouse gases, Hyper Anthropocene, Lorenz, Mathematics and Climate Research Network, model comparison, NOAA, oceanography, physics, science, statistics, theoretical physics, thermodynamics, time series | Leave a comment

Richard Somerville, UCSD, Scripps: “The science is becoming more widely accepted”

By Richard Somerville, emeritus professor of oceanography at the Scripps Institution of Oceanography. See the site he helps build and run regarding climate change communication.

Somerville_IPCC_summary_2016-07-24_154819

Posted in adaptation, American Meteorological Association, AMETSOC, Anthropocene, atmosphere, citizenship, civilization, climate, climate change, climate disruption, environment, forecasting, global warming, meteorology, oceanography, Principles of Planetary Climate, science, Scripps Institution of Oceanography, the right to know, the tragedy of our present civilization, University of California, zero carbon | Leave a comment

“Stochastic Parameterization: Towards a new view of weather and climate models”

Judith Berner, Ulrich Achatz, Lauriane Batté, Lisa Bengtsson, Alvaro De La Cámara, Hannah M. Christensen, Matteo Colangeli, Danielle R. B. Coleman, Daan Crommelin, Stamen I. Dolaptchiev, Christian L.E. Franzke, Petra Friederichs, Peter Imkeller, Heikki Järvinen, Stephan Juricke, Vassili Kitsios, François Lott, Valerio Lucarini, Salil Mahajan, Timothy N. Palmer, Cécile Penland, Mirjana Sakradzija, Jin-Song Von Storch, Antje Weisheimer, Michael Weniger, Paul D. Williams, Jun-Ichi Yano, “Stochastic Parameterization: Towards a new view of weather and climate models”, Bulletin of the American Meteorological Society, published online 19 July 2016.

Abstract

Stochastic parameterizations — empirically derived, or based on rigorous mathematical and statistical concepts — have great potential to increase the predictive capability of next generation weather and climate models.

The last decade has seen the success of stochastic parameterizations in short-term, medium-range and seasonal forecasts: operational weather centers now routinely use stochastic parameterization schemes to better represent model inadequacy and improve the quantification of forecast uncertainty. Developed initially for numerical weather prediction, the inclusion of stochastic parameterizations not only provides better estimates of uncertainty, but it is also extremely promising for reducing longstanding climate biases and relevant for determining the climate response to external forcing.

This article highlights recent developments from different research groups which show that the stochastic representation of unresolved processes in the atmosphere, oceans, land surface and cryosphere of comprehensive weather and climate models (a) gives rise to more reliable probabilistic forecasts of weather and climate and (b) reduces systematic model bias.

We make a case that the use of mathematically stringent methods for the derivation of stochastic dynamic equations will lead to substantial improvements in our ability to accurately simulate weather and climate at all scales. Recent work in mathematics, statistical mechanics and turbulence is reviewed, its relevance for the climate problem demonstrated, and future research directions outlined.
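By way of a toy illustration of the general idea (and not any of the schemes reviewed in the paper), one can multiply a small model's deterministic tendencies by a random factor, in the spirit of stochastically perturbed parameterization tendencies, and run an ensemble; the ensemble spread is an estimate of forecast uncertainty that a single deterministic run cannot supply:

```python
import math, random

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0,
                  noise=0.0, rng=None):
    """One Euler step of Lorenz '63, with the tendencies multiplied by
    (1 + noise * e), e ~ N(0,1): a crude multiplicative perturbation."""
    x, y, z = state
    dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
    factor = 1.0 + noise * rng.gauss(0.0, 1.0) if rng else 1.0
    return (x + dt * factor * dx, y + dt * factor * dy, z + dt * factor * dz)

def ensemble_spread(n_members=20, n_steps=500, noise=0.05, seed=42):
    """Std. deviation of x across an ensemble after n_steps: a measure
    of forecast uncertainty generated by the stochastic scheme itself."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_members):
        s = (1.0, 1.0, 1.0)
        for _ in range(n_steps):
            s = lorenz63_step(s, noise=noise, rng=rng)
        finals.append(s[0])
    mean = sum(finals) / len(finals)
    return math.sqrt(sum((v - mean) ** 2 for v in finals) / len(finals))

print(f"ensemble spread in x after 5 model-time units: {ensemble_spread():.2f}")
```

With `noise=0` all members coincide and the spread is zero; with small multiplicative noise the chaotic dynamics amplify the perturbations into a usable uncertainty estimate, which is the elementary version of what operational stochastic schemes do.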

And five related papers, from another field:

Posted in biology, climate models, complex systems, convergent cross-mapping, data science, dynamical systems, ecology, Ethan Deyle, Floris Takens, George Sughihara, Hao Ye, likelihood-free, Lorenz, mathematics, meteorological models, model-free forecasting, physics, population biology, population dynamics, quantitative biology, quantitative ecology, Scripps Institution of Oceanography, state-space models, statistical dependence, statistics, stochastic algorithms, stochastic search, stochastics, Takens embedding theorem, time series, Victor Brovkin | 4 Comments

Natural gas: The Zaphod Beeblebrox of energy

Amber Lin at The Bulletin of the Atomic Scientists describes the two-headed character of natural gas plants needed to implement “natural gas as a bridge fuel”, and sketches the stark reality proponents of that argument are embracing if they are serious about using natural gas, whether for electricity or heating, to reduce greenhouse gas emissions.

The basic fact is that in order to serve as a proper “bridge”, natural gas infrastructure would need to be decommissioned by 2050, including ceasing flows of the gas through the elaborate pipelines which criss-cross the United States. That’s because emission limits for CO2 dictated by Nature cannot be met otherwise: 450 ppm CO2, just 40 ppm higher than where we are now, corresponds to the widely accepted +2°C warming limit. And, as meeting that is unlikely, if we want to limit warming to +3°C, 650 ppm is the overall limit, and +3°C brings us into a highly uncertain, dangerous, and eventually ice-free world. In particular, we might lose control of a portion of the warming process, since large natural stores of CO2 are quite likely to be breached and begin leaking at those temperatures.

The Presidential commission on the matter also sketched the key problem with using a “bridge fuel” mechanism to reach targets like this, namely, “A slow start leads to a crash finish”: after a late start, the abandonment of fossil fuels and their infrastructure must be pursued much more quickly than if we start early. I daresay none of the proposals for new natural gas generation have incorporated operating lifetimes which abruptly end in 2050, or depreciation schedules which reflect that. In fact, the new Massachusetts Salem Harbor gas-powered electricity generator has a lifetime up through 2080.

mixedZG

Amber Lin tells how there are really two incompatible kinds of natural gas plants for electricity generation:

When constructing a new natural gas power plant, there are two options: a combined cycle or an open cycle. A combined-cycle power plant produces electricity with relatively high efficiency and low carbon emissions: When the gas burns, it heats and compresses air to spin a turbine and power a generator. A heat recovery system captures waste heat, which is routed to a nearby steam turbine to generate even more power. Combined-cycle plants have low operating costs, but because high capital costs must be offset, these plants are built to produce baseload power—available 24 hours a day. Open-cycle gas turbine plants lack the steam cycle, so their thermal efficiency is much lower, and their carbon emissions per unit of electricity generated are slightly higher. Their running costs are much higher than a combined-cycle plant, but they have a much lower start-up cost, so they are often built as “peakers,” plants that run only to support other power infrastructure during hours of high demand or when solar or wind isn’t available.

Considering the two choices in the larger context of natural gas as a “transition fuel,” a dilemma appears: To build the bridge, combined-cycle is what is needed—a consistent, efficient, power source that can effectively replace coal. But for a combined-cycle natural gas plant to be economically feasible, it would typically need 15 to 20 years to make up for start-up costs, and even longer to become profitable. This means that a combined-cycle plant built in 2016 would break even no sooner than 2031, and would have to run for several more decades to be a worthwhile investment. Levi’s 2030 limit for peak emissions, and roughly 2050 limit for zero emissions, translate to major fossil fuel reductions after 2030. Owners and backers, however, will not want to shut down gas plants that are just beginning to generate a profit. Thus, building combined-cycle plants in 2016 without an explicit understanding of their necessarily temporary nature—and with no financial incentives for early closures in the future—defeats the purpose of natural gas as a “transition fuel.”

Why not focus on open-cycle plants instead? While “peakers” make sense as backups for future renewable energy sources, they don’t make sense right now. In the current infrastructure, they can only run for a couple hundred hours a year before they cost more than they can earn; this is not nearly enough to displace coal. Combined-cycle plants can help build the bridge but cannot close it, and open-cycle plants can help close the bridge but cannot build it. Neither type of plant is both economically feasible in the long run and powerful enough to meet today’s demand while cutting emissions in time to mitigate climate change. However, when natural gas is branded as a “transition fuel” in politics and in popular media, this crucial detail is rarely mentioned.

(Emphasis added by blog author.)
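The peaker-versus-baseload economics in the quoted passage can be sketched as a classic screening curve: annualized capital cost plus running cost, as a function of hours run per year. The cost figures below are purely illustrative assumptions of mine, not actual plant data, but they show why a low-capex, high-fuel-cost open-cycle plant wins only at low utilization:

```python
# Screening-curve sketch: annual cost per kW of capacity as a function
# of hours run per year. All numbers are illustrative assumptions.

def annual_cost_per_kw(capex_per_kw, lifetime_years, variable_cost_per_kwh, hours):
    """Annualized capital cost plus running cost for `hours` of operation."""
    return capex_per_kw / lifetime_years + variable_cost_per_kwh * hours

def cheaper_plant(hours):
    # Assumed: combined cycle has high capex but low fuel cost per kWh;
    # open cycle is the reverse.
    cc = annual_cost_per_kw(1000.0, 30, 0.035, hours)  # combined cycle
    oc = annual_cost_per_kw(700.0, 30, 0.060, hours)   # open cycle
    return "combined" if cc < oc else "open"

for hours in (200, 1000, 4000, 8000):
    print(hours, cheaper_plant(hours))
```

With these made-up numbers the crossover sits at a few hundred hours a year, consistent with the “couple hundred hours” figure for peakers quoted above.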

So natural gas plants are the Zaphod Beeblebrox of electricity generation: two-headed and duplicitous. Their purpose is to distract from the true goal of natural gas infrastructure expansion, which is to postpone the day when fossil fuel assets are stranded by government action to mitigate climate change or, as is increasingly plausible, by a tax on their Carbon.

Luckily Arthur’s Betelgeusean friend, Ford Prefect, a roving researcher for that illustrious interstellar travel almanac The Hitchhiker’s Guide to the Galaxy, was more of an optimist. Ford saw silver linings where Arthur saw only clouds, and so between them they made one prudent space traveller, unless their travels led them to the planet Junipella, where the clouds actually did have silver linings. Arthur would have doubtless steered the ship straight into the nearest cloud of gloom, and Ford would have almost certainly attempted to steal the silver, which would have resulted in the catastrophic combustion of the natural gas inside the lining. The explosion would have been pretty, but as a heroic ending it would lack a certain something, i.e. a hero in one piece.

(An extract from The Hitchhiker’s Guide to the Galaxy.)

Posted in adaptation, Anthropocene, atmosphere, Bloomberg, Bloomberg New Energy Finance, BNEF, bridge to nowhere, bridge to somewhere, carbon dioxide, Carbon Tax, Carbon Worshipers, citizenship, civilization, climate, climate change, climate disruption, climate economics, consumption, decentralized electric power generation, decentralized energy, distributed generation, electricity, electricity markets, energy, energy utilities, explosive methane, fossil fuel divestment, fossil fuels, fracking, gas pipeline leaks, global warming, greenhouse gases, greenwashing, Hyper Anthropocene, investment in wind and solar energy, leaving fossil fuels in the ground, Massachusetts, Massachusetts Clean Energy Center, methane, natural gas, networks, petroleum, pipelines, planning, politics, public utility commissions, PUCs, rate of return regulation, rationality, reason, reasonableness, regulatory capture, Sankey diagram, solar domination, stranded assets, supply chains, the energy of the people, the green century, the right to be and act stupid, the right to know, the tragedy of our present civilization, the value of financial assets, zero carbon | 1 Comment

The Presidential betting markets

Someone blatantly misrepresented the U.S. Presidential election betting markets in a Google+ comment thread tonight, so I wanted to bring the actual numbers forward here.

See the latest odds and assessments from the prediction markets.

Done.


No doubt some supporters of Trump will argue “God is on our side, and so these heathen markets cannot be correct”. I’ll bet.

Update, 2016-07-24

Current odds on Betfair.

Posted in forecasting, investing, politics, prediction markets, rationality, reasonableness, statistics | Leave a comment

Now, if we could only say the same thing about Massachusetts …

Massachusetts is supposed to be a Blue State.

Massachusetts is supposed to be concerned about the environment, full of tree-hugging eco-weenies (like myself!), and sprouting solar panels from every other rooftop.

Massachusetts is supposed to have aggressive support for zero Carbon energy, including incentives, SRECs, and so on.

But facts are different.

59% of Massachusetts electricity comes from explosive methane (“natural gas” to those of you who prefer industry adverts). This is a potent greenhouse gas which, on a 20-year timeframe, is roughly 90x worse than CO2 in climate-disrupting radiative forcing. (See https://667-per-cm.net/about if you have doubts.) Natural gas ain’t granola, and the calculations which suggest it is better for the environment are, in my opinion, whacked and bupkis. Set aside upstream impacts from fracking: Not all methane is burnt when it goes up your chimney for heating, nor in generating plants. There are big leaks throughout the Boston metropolitan area which the utilities will fix “if they are dangerous”, but they don’t consider greenhouse gas emissions dangerous. And we all know we have to transition off of fossil fuels, for our own good as the coastal state we are, for the moral good of the planet and the people on it, and, not least, for the recently affirmed requirements of the Global Warming Solutions Act (“GWSA”). The Union of Concerned Scientists says we are getting overdependent upon natural gas. And the comparison with coal as a benefit is the logical fallacy of “the worst negates the bad”.
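To see why the leak rate matters so much, here is a back-of-envelope CO2-equivalent comparison over a 20-year horizon. I use a 20-year methane GWP of 86, close to the ~90 figure cited above; the per-kWh combustion, fuel, and leakage figures are illustrative assumptions, not measurements:

```python
# Back-of-envelope CO2-equivalent comparison of gas generation on a
# 20-year horizon. GWP, emission, and leakage figures are assumptions.

GWP20_CH4 = 86.0   # assumed 20-year global warming potential of methane

def gas_co2e_per_kwh(combustion_co2=0.40, fuel_kg_per_kwh=0.15, leak_rate=0.03):
    """kg CO2e per kWh: combustion CO2 plus leaked (unburnt) methane,
    weighted by its 20-year GWP."""
    leaked_ch4 = fuel_kg_per_kwh * leak_rate
    return combustion_co2 + leaked_ch4 * GWP20_CH4

coal_co2e_per_kwh = 0.95   # assumed, combustion only

print(round(gas_co2e_per_kwh(), 3))                           # modest leaks already close the gap
print(gas_co2e_per_kwh(leak_rate=0.05) > coal_co2e_per_kwh)   # True
```

The point: at plausible leak rates, the leaked-methane term rivals the combustion term, which is why, on a 20-year view, gas’s supposed advantage over coal can vanish entirely.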

That’s quite different from Texas. Yes, Texas. Home of cowboy boots, and guns, and Spectra Energy.


  1. Texas holds the record for all-time wind energy production.
  2. The benefits of wind are estimated at $3.3 billion annually.
  3. Texas was the first US state to reach 10,000 megawatts of wind power generating capacity.
  4. Texas was one of the first U.S. states to require that a certain amount of electricity come from renewable energy sources. (As was Massachusetts, to its credit.)
  5. Texas wind power is cheaper than fossil fuels.
  6. The Texas wind industry employs more than 24,000 workers.

Without going really big on offshore wind and solar, Massachusetts could end up being just a bunch of chumps. And I often wonder if Spectra Energy isn’t trying to dump their explosive methane here because people at home know better. Or it could be that Massachusetts citizens are hypocrites, claiming to be for something until it affects their own back yards. Or it could be that Massachusetts leadership has had $100,000 spent on them, just in 2016.


Posted in Anthropocene, Bloomberg New Energy Finance, bridge to nowhere, bridge to somewhere, Buckminster Fuller, Cape Wind, Carbon Worshipers, citizenship, clean disruption, climate business, corruption, decentralized electric power generation, decentralized energy, demand-side solutions, destructive economic development, distributed generation, economics, electricity, electricity markets, energy, energy utilities, explosive methane, fossil fuel divestment, fossil fuels, gas pipeline leaks, Green Tea Coalition, greenhouse gases, Hyper Anthropocene, investment in wind and solar energy, ISO-NE, Joseph Schumpeter, leaving fossil fuels in the ground, local generation, MA, Mark Jacobson, Massachusetts, Massachusetts Clean Energy Center, Massachusetts Interfaith Coalition for Climate Action, Michael Osborne, natural gas, New England, Nikola Tesla, pipelines, politics, public utility commissions, PUCs, rate of return regulation, rationality, reason, reasonableness, regulatory capture, risk, Sankey diagram, solar energy, solar power, Spaceship Earth, supply chains, Texas, the energy of the people, the green century, the tragedy of our present civilization, the value of financial assets, Tony Seba, wind energy, wind power, zero carbon | 1 Comment

David Spiegelhalter on ‘how to spot a dodgy statistic’

In this political season, it’s useful to brush up on rhetorical skills, particularly ones involving numbers and statistics, or what John Allen Paulos called numeracy. Professor David Spiegelhalter has written a guide to some of these tricks. Read the whole thing. Highlights, though, of devices used to produce statistics which aren’t quite right (that is, wrong):

  • Use a real number, but change its meaning
  • Make the number look big (but not too big)
  • Casually imply causation from correlation
  • Choose your definitions carefully
  • Use total numbers rather than proportions (or whichever way suits your argument)
  • Don’t provide any relevant context
  • Exaggerate the importance of a possibly illusory change
  • Prematurely announce the success of a policy initiative using unofficial selected data
  • If all else fails, just make the numbers up
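One of these devices, “use total numbers rather than proportions,” is easy to demonstrate with a toy calculation; the city and town figures below are entirely hypothetical:

```python
# A Spiegelhalter device in action: totals versus proportions.
# All figures are hypothetical.

city = {"accidents": 500, "population": 1_000_000}
town = {"accidents": 40,  "population": 20_000}

# Headline using totals: "City has over 12x more accidents than town!"
print(city["accidents"] / town["accidents"])        # 12.5

# The proportions tell the opposite story: the town's rate is higher.
city_rate = city["accidents"] / city["population"]
town_rate = town["accidents"] / town["population"]
print(town_rate / city_rate)                        # the town's rate is 4x the city's
```

Same data, opposite headlines, depending entirely on which form suits your argument.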

David Spiegelhalter is the Winton Professor of the Public Understanding of Risk at the University of Cambridge and president-elect of the Royal Statistical Society. Among many other things, he’s an advocate for expressing life risks as micromorts.
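A micromort is a one-in-a-million chance of death, and converting raw risk figures into micromorts makes risks comparable across activities. A minimal sketch, with purely hypothetical numbers:

```python
# Converting an activity's risk into micromorts (1 micromort = a
# one-in-a-million chance of death). Figures are hypothetical.

def micromorts(deaths, exposures):
    """Deaths per exposure, expressed in micromorts."""
    return (deaths / exposures) * 1_000_000

# Hypothetical: 100 deaths over 500 million person-trips.
print(micromorts(100, 500_000_000))   # about 0.2 micromorts per trip
```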

Posted in abstraction, anemic data, Bayes, Bayesian, chance, citizenship, civilization, corruption, Daniel Kahneman, disingenuity, Donald Trump, education, games of chance, ignorance, maths, moral leadership, obfuscating data, open data, perceptions, politics, rationality, reason, reasonableness, rhetoric, risk, sampling, science, sociology, statistics, the right to know | Leave a comment

Rushing the +2 degree Celsius boundary

I made a comment on Google+ pertaining to a report of a recent NOAA finding.

Enjoy.

But remember that COP21 boundary is equivalent to 450 ppm CO2.
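The arithmetic connecting 450 ppm to the +2°C boundary is simple if you take equilibrium warming to scale with the logarithm of the concentration ratio, using a 280 ppm preindustrial baseline and the commonly cited equilibrium sensitivity of about 3°C per doubling. That sensitivity is, of course, itself uncertain:

```python
import math

# Equilibrium warming scales with the log of the CO2 concentration ratio.
# Baseline and sensitivity are the commonly cited values, assumed here.

def equilibrium_warming(ppm, baseline_ppm=280.0, sensitivity_per_doubling=3.0):
    """Equilibrium temperature rise, in degrees Celsius, for a CO2 level."""
    return sensitivity_per_doubling * math.log2(ppm / baseline_ppm)

print(round(equilibrium_warming(450.0), 2))   # about 2.05 degrees C
```

So 450 ppm lands essentially on the +2°C boundary, which is why COP21’s temperature target and the 450 ppm figure are two statements of the same limit.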

Posted in adaptation, AMETSOC, Anthropocene, atmosphere, Bill Nye, bridge to nowhere, carbon dioxide, Carbon Tax, Carbon Worshipers, citizenship, civilization, clean disruption, climate, climate disruption, COP21, corporate litigation on damage from fossil fuel emissions, differential equations, disruption, distributed generation, Donald Trump, ecology, El Nina, El Nino, energy, energy reduction, engineering, environment, environmental law, Epcot, explosive methane, forecasting, fossil fuel divestment, fossil fuels, geophysics, global warming, greenhouse gases, greenwashing, Hyper Anthropocene, investment in wind and solar energy, IPCC, local generation, Mark Jacobson, Martyn Plummer, microgrids, Miguel Altieri, philosophy, physical materialism, R, resiliency, Ricky Rood, risk, Sankey diagram | Leave a comment

BOYCOTT natural gas, American or otherwise

It’s one thing to oppose pipelines and continued use of fossil fuels, but there is little as effective as a boycott of the key product. This is certainly not a new idea. (I don’t do Facebook. See this 2001 article as well.) So if you want to nudge in the direction of renewables, please consider boycotting natural gas. If you want to save money in the long term, please consider leaving natural gas. Natural gas and other fossil fuel prices are inherently volatile; complaints of their being too high at times are really complaints about this volatility. Renewable energy produces electricity at the same price decade after decade.

Natural gas ain’t granola. Despite company advertisements to the contrary, drilling and fracking natural gas wells and building their associated infrastructure, including pipelines, compressor stations, and piping and metering stations, are invasive, disruptive, expensive, and harmful to people, the environment, and the climate. Methane, the chief component of natural gas, is many times more powerful a greenhouse gas than CO2, and even at the burning end, combustion of natural gas is not complete, so there is leakage there too, even if the raw chemistry of the components that are burnt is much cleaner than coal’s. Moreover, gas leaks from pipelines at nearly every step along the way, and especially in distribution networks near homes. And don’t think that, because you hear reassuring things from utilities and gas companies and engineers, there’s safety there. It may be out of sight, but the political process and the Natural Gas Act of 1938 rig the federal system against all opponents of natural gas, from cities and towns down to localities, homeowners, and farmers.

The co-constituents of natural gas alongside methane are carcinogenic and powerfully harmful to human lungs and breathing. Even the odorants which are added to facilitate detection of leaks are themselves harmful.

  • The easiest way is to design your new home with solar PV, energy efficiency, and electric heating/cooling in mind. Induction stoves are wonderful devices, bringing most of the benefits of gas stoves and energy efficiency to an all-electric footprint. Many new homes, particularly large ones, have excellent roofs and yards for solar PV arrays. Our home has appreciable tree shading, especially in summer, but we solved that by oversizing the array we installed, so it generates like crazy during the parts of the day and year when we do see unimpeded Sun. And those puffy cumulus clouds are truly awesome helpers.
  • Consider refitting your home and getting off natural gas. We were never on natural gas, but we once did heat our home with oil-fired forced hot water and got our hot water that way. Now, our home is zero Carbon, since we heat and cool with ductless minisplits and have an electric heat-pump hot water heater. We even have an electric, battery-powered lawnmower. Payback times are better each year. (Our solar array will pay for itself in 7 years, this being in Massachusetts.) Your state may have an incentive program, and, from what I have studied, it is a win in any case: You’ll just not earn as much back as people do in states that have strong support of solar PV, like New York. The town of Minster, OH went big on solar despite their state’s punitive measures upon solar owners.
  • If you cannot afford solar, or your roof or yard is shielded from Sun, or you live in an apartment, consider the Relay Power community solar program, and switch over from natural gas to at least one ductless minisplit system. Relay Power is available to anyone in the Eversource electricity region in Massachusetts, and there are community solar programs elsewhere in the United States. Check yours.
  • Make your house more efficient! The less heat you need, the less gas you use. If you have a gas-powered clothes dryer, seriously consider drying your clothes by hanging them on racks on your deck or from a clothesline. Dampness in a house is not good for the house, and clothes smell better and feel better, in my opinion, when dried outdoors. If you use a gas oven, consider getting a much smaller countertop electric oven. In our experience, most cooking for singles and couples does not need the big oven. Most of these small ovens are big enough to bake pizza. We have a Breville Convection Smart Oven and we love it.
  • Support your local environmental organization and political action committee to push your state governor and legislature for better energy policy, one which advocates for zero Carbon energy. The more of this there is, the cheaper electric power becomes in your state, and the lower the price of natural gas, even if you continue to use it.
  • Prohibit reimbursements like the Pipeline Tax. Utilities and gas companies are for-profit corporations. They should be able to get loans to build infrastructure from the private sector. They don’t need taxpayers to share the risk.
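For what it’s worth, the 7-year payback I quoted for our own array above is just simple-payback arithmetic. The inputs below are illustrative assumptions, not our actual numbers; real paybacks depend on incentives, SRECs, and local electricity rates:

```python
# Simple-payback sketch for a rooftop solar array. All inputs are
# illustrative assumptions, not actual project figures.

def payback_years(install_cost, annual_kwh, price_per_kwh, annual_incentives=0.0):
    """Years until cumulative savings equal the installed cost."""
    annual_value = annual_kwh * price_per_kwh + annual_incentives
    return install_cost / annual_value

# Hypothetical: $14,000 net install, 8,000 kWh/yr, $0.20/kWh, $400/yr SRECs.
print(round(payback_years(14_000, 8_000, 0.20, 400), 1))   # about 7 years
```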

Natural gas … WE DON’T WANT YOUR PIPELINE. We don’t want your damn gas.

Posted in Anthropocene, atmosphere, Bloomberg New Energy Finance, BNEF, bridge to nowhere, Carbon Worshipers, citizenship, civilization, climate change, climate disruption, climate economics, climate justice, decentralized electric power generation, decentralized energy, denial, destructive economic development, distributed generation, ecology, economics, electricity, electricity markets, energy, environment, explosive methane, fossil fuel divestment, fossil fuels, fracking, gas pipeline leaks, global warming, greenhouse gases, greenwashing, Hyper Anthropocene, ignorance, ISO-NE, leaving fossil fuels in the ground, methane, natural gas, New England, politics, public utility commissions, rationality, regulatory capture, Sankey diagram, the energy of the people, the problem of evil, the right to be and act stupid, the stack of lies, the tragedy of our present civilization, zero carbon, ``The tide is risin'/And so are we'' | Leave a comment

Mark Carney: Why are financial regulators and central bank governors looking at climate?

http://www.cbc.ca/i/caffeine/syndicate/?mediaId=725874755644

“We don’t want a Minsky moment about climate.”

Update, 2016-07-19

Interesting that Carney talks about “stabilizing at a temperature” when emissions are stabilized using a Carbon tax. He agrees with a Carbon tax, but he seems to have his science wrong: I did not get the impression he understands that, to stabilize at any temperature, Carbon emissions need to go to zero. In his world, I wonder, does that mean a price on Carbon needs to go to infinity? From my perspective, there is an implicit ceiling on the Carbon price, and that is the realistic price per tonne to extract a tonne of Carbon from the atmosphere. Perhaps it would be somewhat more, since it’s not just about extracting this tonne of Carbon, but the next one, and another, and more. But, still, there is a kind of ceiling.
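The ceiling argument can be put in arithmetic form: a Carbon tax never needs to exceed the cost of pulling a tonne of CO2 back out of the atmosphere, because at that price extraction becomes the cheaper compliance option. The $600/tonne extraction cost below is an illustrative assumption of mine, not a forecast:

```python
# The Carbon-price "ceiling" argument: the tax is effectively capped by
# the cost of extracting Carbon from the air. Extraction cost is assumed.

def effective_carbon_price(tax_per_tonne, extraction_cost_per_tonne=600.0):
    """The binding price per tonne: the tax, capped by the air-capture cost."""
    return min(tax_per_tonne, extraction_cost_per_tonne)

print(effective_carbon_price(100.0))    # tax binds: 100.0
print(effective_carbon_price(2000.0))   # ceiling binds: 600.0
```

So the price need not go to infinity for emissions to go to zero; it only needs to reach the extraction cost.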

Posted in adaptation, Anthropocene, Bloomberg, Bloomberg New Energy Finance, BLUE, central banks, civilization, climate, climate business, climate change, climate disruption, climate economics, climate education, climate justice, corporate litigation on damage from fossil fuel emissions, corporate supply chains, demand-side solutions, ecology, economics, education, environment, false advertising, finance, fossil fuel divestment, fossil fuels, global warming, greenwashing, grid defection, insurance, investing, Joseph Schumpeter, liberal climate deniers, local generation, organizational failures, rate of return regulation, rationality, reasonableness, solar domination, Spaceship Earth, stranded assets, sustainability, the right to know, the value of financial assets, zero carbon | Leave a comment

JASA demands code and data be supplied as a condition of publication

The Journal of the American Statistical Association (“JASA”) has announced in this month’s Amstat News that, effective 1st September 2016, it “… will require code and data as a minimum standard for reproducibility of statistical scientific research.” Trends were heading this way, but it is excellent to see a major journal insisting upon it as standard practice.

There appear to be some weasel words allowing publications having “proprietary data” to move forward, insisting upon code nevertheless. I can only imagine that publications opting for that path will be seen as less established, solid, or compelling.

Posted in American Association for the Advancement of Science, American Statistical Association, citizen science, engineering, ethics, evidence, new forms of scientific peer review, numerical software, planning, rationality, reasonableness, resiliency, science, statistics, stochastic algorithms, testing, the right to know | Leave a comment