Happy Earth Day: Doubt climate change? Just be patient and watch

And know that, because of our collective inaction and the momentum already built up in the climate system, even if we were to fix everything immediately, things will get steadily worse for two to four decades after we stop.

Don’t like those odds? Then stop, now.

Some of them were angry,
At the way the Earth was abused
By the men who learned how to forge her beauty into power.
And they struggled to protect her from them, only to be confused
By the magnitude of her fury in the final hour
When the sand was gone and the time arrived …

Remember, the amount of human emissions in the atmosphere has doubled since 1992, about the time the leaders of the world claimed to begin taking this seriously.

Posted in adaptation, American Association for the Advancement of Science, an ignorant American public, an uncaring American public, climate change, fossil fuel divestment, global warming, Humans have a lot to answer for, Hyper Anthropocene, science, The Demon Haunted World, the right to be and act stupid, the right to know, the tragedy of our present civilization, tragedy of the horizon | Leave a comment

One of the best moments of my week …

Falcon Heavy from SpaceX delivering Arabsat, and landing 3 for 3 …

This is the triumph of Mathematics and physical reality over all the other crap and nonsense we hear about.

This is what will always win first, despite anything else, in the long term.

Posted in science | Leave a comment

12 km Burgess extension

Saturday’s run, warm weather, about 20℃. One stop for water. 12.1 km in 1h40m. I was slower than my mean pace because it was warm.

Up 130m altitude and back.

(Click image for view of interactive version of run map in new tab. Click on “speed” tab to see speed, etc.)
Posted in running, Westwood | Leave a comment

On the Ministry of Silly Walks : Brexit

The John Cleese reference came from this week’s treatment of comic self-deprecation in the UK post-Brexit, in The New Yorker.

Posted in Brexit, bridge to nowhere, Britain, populism, the right to be and act stupid, the right to know, UK, unreason | Leave a comment

Five Thirty Eight podcast: `Can Statistics solve gerrymandering?`

Great podcast, featuring Professor and geometer Moon Duchin, Nate Silver, and Galen Druke. If the link doesn’t work, listen from here or below:

https://fivethirtyeight.com/player/politics/26382150/

Professor Duchin has written extensively on this:

Posted in FiveThirtyEight, Nate Silver, point pattern analysis, politics, statistics | Leave a comment

Overleaf: `#FuturePub London returned to a full house!’, 10 April 2019

I have switched from a basic desktop MiKTeX setup to Overleaf for most of my day-to-day \LaTeX needs.

They recently had a FuturePub session in London.

I’m enthusiastic about their capability and degree of support, especially in their documentation.

Posted in collaboration, LaTeX, Overleaf, ShareLaTeX | Leave a comment

Dan Fleisch says you don’t know the power of … the Dark Side Tensors

This is a fun motivating lecture:

See also his A Student’s Guide to Vectors and Tensors, with related podcasts. It’s available on Kindle, by the way. (Save some trees.)

eigenchris has another series of lectures on Tensors.

Posted in Dan Fleisch, physics, tensors | Leave a comment

I just chose to support Climate Adam!

I just chose to support ClimateAdam.

You can, too!

One of my many favorite videos by Climate Adam:

Here’s another:

Why is supporting Climate Adam and talking about climate so important?

Here’s one reason why:

Backing this up:

Posted in American Association for the Advancement of Science, climate, climate change, climate education, ClimateAdam, geophysics, global warming, public education, sustainability, the right to know | Leave a comment

October 2013 retrospective … Karl Rábago on ‘Talk Solar’ podcast, regarding value of solar generation

In October of 2013, Karl Rábago was interviewed on the Talk Solar podcast by Beth Bond of Decatur, GA. This was shortly after the first version of the Value of Solar report was issued by IREC. Listen to it below:

This presentation is particularly valuable for people, like municipal regulators, who sometimes interact with, purchase from, and make decisions affected by utility company practices, yet do not know where many of those practices come from, nor how much of a business-model disruption distributed PV generation and storage represents.

Mr Rábago’s deep history with renewable rate setting and PV generation is summarized here.

‘Wärtsilä introduces new hybrid solar PV and storage solution’

(Image courtesy of Wärtsilä, and you can read more about the above solution here.)

Readers may notice the PV farm in the figure above was placed in a sparsely treed area, resulting in trees being cut down. An interesting discussion might ensue, either here in the comments or in a future post, if there is enough interest, regarding the costs and benefits of substituting large scale PV farms for a relatively undisturbed natural ecosystem.

Another interesting point is whether losing ground to zero Carbon energy generation is indirectly a cost of failing to address and cost in climate change. Companies and people look to things like Carbon pricing and Carbon taxes as the most direct effects, or to increases in rates to pay for zero Carbon incentives. On the other hand, the degree to which zero Carbon energy is, hands down, the best long term bet an energy investor or consumer can make (see below) means companies which ignore this, particularly utilities, have a lot to lose.

Posted in Beth Bond, Bloomberg New Energy Finance, bridge to somewhere, Buckminster Fuller, CleanTechnica, climate disruption, climate economics, decentralized electric power generation, decentralized energy, distributed generation, ecomodernism, ecopragmatism, energy storage, energy utilities, engineering, investment in wind and solar energy, investments, Joseph Schumpeter, Karl Ragabo, microgrids, public utility commissions, regulatory capture, resiliency, RevoluSun, solar democracy, solar domination, solar energy, solar power, Sonnen community, Stewart Brand, stranded assets, Talk Solar, the energy of the people, the green century, Tony Seba, utility company death spiral | Leave a comment

Repeat of Long Mill 1, on a moderately warm day

(Click on map to be taken to my Ride with GPS site where you can interact with the route display.)

I am, by the way, steadily changing my displays to present data in Metric Units rather than English Units. I began with temperatures, and now I’m moving on to distances and speeds. I want to get good enough to have a sense of how far, say, 12 km is without converting to miles or feet.

Posted in Massachusetts, Nature, running | Leave a comment

Weekend break: Theme for Earth Day

By John Williams:

PBD

Posted in agroecology, Aldo Leopold, American Association for the Advancement of Science, American Statistical Association, an uncaring American public, argoecology, biology, Botany, Buckminster Fuller, climate, David Suzuki, dynamical systems, E. O. Wilson, earth, Earth Day, ecological disruption, ecological services, Ecological Society of America, ecology, Ecology Action, ecomodernism, ecopragmatism, ecopragmatist, Eli Rabett, environment, Equiterre, evolution, fragmentation of ecosystems, global warming, green tech, greenhouse gases, greenwashing, invasive species, investing, investment in wind and solar energy, investments, Lawrence Berkeley National Laboratory, Lotka-Volterra systems, marine biology, Mathematics and Climate Research Network, microbiomes, NOAA, oceans, Peter del Tredici, Peter Diggle, Pharyngula, physical materialism, quantitative biology, quantitative ecology, rate of return regulation, scientific publishing, Spaceship Earth, statistical dependence, Stefan Rahmstorf, Tamino | Leave a comment

Still a climate hawk, and appreciate all my climate friends: To the climate deniers, the greenwashers, the liberal environmental opportunists, and the environmental purists who will never compromise …

“Not ready to make nice” (Dixie Chicks)

I stick by my friends in these hard times:

Losing Earth: The decade we almost stopped climate change.

Posted in American Association for the Advancement of Science, American Statistical Association, Anthropocene, Bayesian, climate change, climate disruption, climate economics, climate grief, coastal investment risks, ecological disruption, ecological services, ecomodernism, ecopragmatism, engineering, environment, flooding, global warming, Grant Foster, Humans have a lot to answer for, Hyper Anthropocene, investment in wind and solar energy, investments, Joseph Schumpeter, Mathematics and Climate Research Network, mathematics education, personal purity, population biology, quantitative biology, quantitative ecology, regulatory capture, risk, riverine flooding, sampling without replacement, Scituate, secularism, shorelines, solar democracy, solar domination, solar energy, Solar Freakin' Roadways, solar power, SolarPV.tv, Spaceship Earth, statistical dependence, SunPower, the energy of the people, the green century, the tragedy of our present civilization, the value of financial assets, tragedy of the horizon, Unitarian Universalism, unreason, utility company death spiral, UU Needham, Wally Broecker, Walt Disney Company, Woods Hole Oceanographic Institution, ``The tide is risin'/And so are we'' | Leave a comment

Another reason why the future of Science and STEM education in the United States is cloudy

From Nature‘s “Universities spooked by Trump order tying free speech to grants“, with the subheading “White House policy will require universities to certify that they protect free speech to remain eligible for research funding”, comes this chilling news:

US President Donald Trump signed an executive order on 21 March that requires universities to certify that they protect free speech, or risk losing federal research funds.

Public institutions will have to certify that they are following free-speech protections laid out in the First Amendment of the US Constitution, and private institutions must promise to follow their stated policies on free speech, a White House official told reporters on 21 March.

The order applies to 12 research agencies, including the National Institutes of Health, the National Science Foundation, the Department of Energy and NASA. It affects only money for research, not financial aid for students.

“We’re dealing with billions and billions and billions of dollars,” Trump said in a speech just before signing the order. “Taxpayer dollars should not subsidize anti-First Amendment institutions.” He said that the order was the first in a series of steps that his administration intends to take to “defend students’ rights”.

Clearly, this is an attempt to magnify the pseudo-standard of “fair and balanced” so badly invoked in media to elevate unsubstantiated and illogical claims from scientifically illiterate and innumerate minorities to the status of powerful political voices. Witness the collective disposition of climate change.

Worse, though, it is another step in the encroachment of a hitherto economically unsuccessful populist world view, one which amounts to large scale sour grapes, upon the Success Centers of United States culture. These are overwhelmingly Blue, self-made, urban, and diverse, even if they still allocate their wealth unfairly. It is an extended exercise in spiting oneself, for, without these technologies, military safety and economic success won’t continue.

But, people aren’t going to wait for that to be rectified in some hypothetical future — and probably Democratic — administration. This is a dynamic business world, and people seek their own comfortable surroundings and fortunes.

And, so, there is a Brain Drain beginning from the United States to elsewhere. (This is also known as human capital flight.) At first, it was limited to the rejection of, or imposition of discomfort upon, brilliant and ingenious technical entrepreneurs from India and Pakistan and China, who wanted nothing better than to come to what once was the haven and incubator for free enterprise and free ideas, and to found a fortune. But, now, even the best and the brightest of American-born citizens, young bright minds and spirits who know how to succeed, are beginning to see the rest of the world as more inviting and accommodating, and are making the hard choice to uproot and go, to emigrate.

I applaud them for their foresight. The idea of blind loyalty despite cultural sins and political idiocy is itself idiotic. It is not living, it is a self-deprecating religion.

And, so, I was not at all surprised that Nature also carried an extended article chronicling how five scientists had wrestled with the idea of moving to another country to improve their futures.

See:

Posted in American Association for the Advancement of Science, American Mathematical Society, American Statistical Association, an ignorant American public, an uncaring American public, anti-intellectualism, anti-science, climate change, Commonwealth of Massachusetts, emigration, European Union, mathematics, science, United States | Leave a comment

Long Mill 1, a run

(Click on map to be taken to my Ride with GPS site where you can interact with the route display.)
Posted in running, Westwood | Leave a comment

Result of our own fiddling: Bob Watson and climate risk

https://sms.cam.ac.uk/media/746045

Professor Bob Watson, University of East Anglia, presents the summary risk, climate change:

The question is not whether the Earth’s climate will change in response to human activities, but when, where and by how much. Human activities are changing the Earth’s climate and further human-induced climate change is inevitable. Indeed the climate of the next few decades will be governed by past emissions. The most adverse consequences of human-induced climate change will be in developing countries and poor people within them. Climate change threatens to bring more suffering to the one billion people who already go to bed hungry every night and the approximately 2 billion people exposed to insect-borne diseases and water scarcity. Sea level rise threatens to displace tens of millions of people in deltaic areas and low-lying small island states. Climate change will undermine the ability of many poor people to escape poverty and the long-term sustainable economic development of some countries. Hence, climate change is not only an environmental issue, but a development and security issue. The challenge is to limit the magnitude and rate of human-induced climate change, and simultaneously reduce the vulnerability of socio-economic sectors, ecological systems and human health to current and projected climate variability by integrating climate concerns into local and national economic planning. Technological options for reducing greenhouse gas emissions cost-effectively over the next few decades already exist. However, the required transition to a very low carbon economy (a reduction in global emissions by at least 50% by 2050) will require a technological evolution in the production and use of energy, energy sector reform, appropriate pricing policies and behavior change, coupled with a more sustainable agricultural sector and reduced deforestation. This transition to a low-carbon economy must be achieved while improving access to affordable energy in developing countries, which is critical for economic growth and poverty alleviation, and while ensuring adequate affordable and nutritious food. The challenge is to negotiate a long-term (up to 2050) global regulatory framework that is equitable with common but differentiated responsibilities and has intermediate targets that can reduce greenhouse emissions to a level that limits the increase in global mean surface temperature to 2C above pre-industrial levels. While this goal has been widely accepted, the current rate of growth in emissions globally, coupled with a failure in Copenhagen to agree to stringent targets to reduce emissions, makes this goal extremely difficult, hence the world needs to be prepared to adapt to a 4C warmer world.

Posted in Anthropocene, attribution, carbon dioxide, Carbon Worshipers, catastrophe modeling, climate, climate change, climate data, climate disruption, climate economics, climate education, climate grief, climate justice, ecological disruption, ecology, Ecology Action, environment, global blinding, global warming, greenhouse gases, greenwashing, meteorology, National Center for Atmospheric Research, non-parametric model, Principles of Planetary Climate, radiative forcing, reasonableness, science, solar democracy, solar domination, solar energy, Solar Freakin' Roadways, solar power, SolarPV.tv, Solpad, Sonnen community, Spaceship Earth, stranded assets, sustainability, the energy of the people, the green century, the tragedy of our present civilization, the value of financial assets, tragedy of the horizon, utility company death spiral, water, wind energy, wind power | Leave a comment

Welcome to snowy New England … Bad place for solar PV, right?

And this is ISO-NE who, as little as three years back, were highly sceptical that anything other than additional natural gas generation could supply the ever increasing electrical power needs of the region, particularly with the withdrawal of generation from oil, coal, and nuclear sources scheduled for the period.

Oh. So, perhaps, maybe, Professor Tony Seba nailed it right on all along. What a concept.

Welcome to New England. Bad place for solar PV, right? So why can’t you make it work, Texas, or South Carolina, or Florida, or Georgia, or North Carolina, or Arizona? What are you dumb or something?

Hat tip to S&P Global for the original article.

Posted in American Solar Energy Society, Amory Lovins, Arnold Schwarzennegger, Bloomberg New Energy Finance, bridge to somewhere, clean disruption, CleanTechnica, climate economics, Commonwealth of Massachusetts, corporations, decentralized energy, destructive economic development, distributed generation, ecological disruption, economic trade, economics, ecopragmatism, ecopragmatist, engineering, entrpreneurs, green tech, Green Tech Media, grid defection, investment in wind and solar energy, ISO-NE, Joseph Schumpeter, Massachusetts Clean Energy Center, rate of return regulation, reworking infrastructure, rights of the inhabitants of the Commonwealth, Sankey diagram, solar democracy, solar domination, solar energy, solar power, SolarPV.tv, Sonnen community, Spaceship Earth, technology, the energy of the people, the green century, the right to be and act stupid, the right to know, the tragedy of our present civilization, the value of financial assets, UNFCCC, Unitarian Universalism, unreason, utility company death spiral, Wally Broecker, wishful environmentalism, Woods Hole Oceanographic Institution, zero carbon | Leave a comment

One of the happiest two hours I’ve spent in months: A Professor Tony Seba update

From end of 2018:

from alianza FiiDEMAC.

And, indeed, it was one of the most uplifting two hours I’ve recently spent. I have long been an admirer of Professor Tony Seba. I have read his books. This was an update on how he now sees the world.

As someone who embraces the legal logic of the Juliana v United States lawsuit, I do not have much confidence in politics being able to mitigate climate disruption. Both political parties in the United States have been repeatedly warned of the consequences of continuing the policy of mining and emitting, and of its inevitable disruptions of the climate. And, while, technically, United States emissions have plateaued, this is a result of our collectively exporting our manufacturing emissions to China.

So, politically, efforts to mitigate climate change, in the United States but also more broadly across the OECD, have been an abysmal failure. How depressing. And the death throes of the so-called Green New Deal do not inspire.

I have stated my problems with matters as they are. (Context.) I am pessimistic that the last branch of the United States government will intervene appropriately. They haven’t shown enthusiasm.

And, as I made clear in my statement, this is not a cause for despair. There will be a response. Unfortunately, because of the abdication of interest and concern in the matter, firstly on the part of the general public, the displacements in jobs, social equity, and wealth which will inevitably occur through that collective lack of engagement will be painful. Nevertheless, this disruption will happen, since economics, at least in OECD countries, is primary.

Climate change will be mitigated, perhaps a bit late, and probably with an incredible loss of present wealth, because of bad bets on the part of the wealthy. I really do not find any reason to sympathize with them. I believe the less privileged won’t be impacted any more than they usually are, and, in the dissolution of wealth which will inevitably occur, they may have opportunities they did not have previously. In any case, the presumed omniscience on the part of the Haves over the Have Nots in United States society should be destroyed in concept, although the ignorance of some publics regarding our present leadership gives me some pause in this conclusion.

In any event, I feel this change is inexorable, not, as Professor Tony Seba repeatedly emphasizes, because of do-good environmental policies, but because the time of zero Carbon energy and smart distribution of it via computation has arrived.

And, frankly, as uncharitable as the opinion might seem, I have zero commiseration with those who opposed the advance of such zero Carbon energy, whether that means they lose their jobs, lose their investments, or cannot provide for their offspring. For they are the reason why, after more than 20 years of knowing about climate change, we have collectively done nothing and, in the process, thrown doubt at Science and Engineering and Mathematics. They deserve no sympathy, and no consideration. Let them be a lesson.

It is also notable that the electorate should be highly cautious of urgings on the part of fossil fuel interests, including extraction companies as well as their supporters, to reimburse them for losses relating to this disruption. There is ample evidence they saw what was coming and chose to oppose it rather than adapting to it. That was a choice. That was their right. But they should not be given a penny because they chose wrongly. There is nothing more fundamental to free market capitalism than the principle that those who make bad bets should bear the full cost of making those bad bets.

Posted in an ignorant American public, an uncaring American public, anti-intellectualism, anti-science, being carbon dioxide, bridge to somewhere, climate business, climate change, climate disruption, climate economics, climate education, corporations, Cult of Carbon, decentralized energy, distributed generation, ecomodernism, economics, ecopragmatism, ecopragmatist, electricity, entrpreneurs, extended producer responsibility, extended supply chains, Exxon, global warming, Green New Deal, Humans have a lot to answer for, Hyper Anthropocene, investing, investment in wind and solar energy, investments, Joseph Schumpeter, Juliana v United States, leaving fossil fuels in the ground, local generation, local self reliance, Mark Jacobson, Neill deGrasse Tyson, politics, science, solar democracy, solar domination, solar energy, solar power, supply chains, sustainability, temporal myopia, the energy of the people, the green century, the tragedy of our present civilization, the value of financial assets, Tony Seba, trading, tragedy of the horizon, utility company death spiral, wishful environmentalism, zero carbon | 1 Comment

“Ridiculously well-designed rockets”, not to mention some seriously awesome Mathematics

I’m just amazed by the quality of their control systems, understated in the video, but which are absolutely critical to success.

For more technical details, see:

B. Açıkmeşe, J. M. Carson III, L. Blackmore, “Lossless convexification of nonconvex control bound and pointing constraints of the soft landing optimal control problem“, IEEE Transactions on Control Systems Technology, 21(6), November 2013.
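For the curious, here is a sketch of the core idea as I understand it from that line of work (the notation is simplified and mine, not the paper’s). The soft landing problem includes a thrust-magnitude constraint

0 < \rho_{1} \le \|\mathbf{T}_{c}(t)\| \le \rho_{2}

which is nonconvex because of the lower bound. Introducing a slack variable \Gamma(t) and replacing the constraint with

\|\mathbf{T}_{c}(t)\| \le \Gamma(t), \quad \rho_{1} \le \Gamma(t) \le \rho_{2}

with \Gamma(t) entering the fuel cost, yields a convex (second-order cone) problem. The “lossless” part is the result that an optimal solution of the relaxed problem satisfies \|\mathbf{T}_{c}(t)\| = \Gamma(t), so nothing is given up by the relaxation, and the guidance problem can be solved reliably and quickly with convex programming.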

Posted in control theory, controls theory, convex problems, differential equations, relaxation methods, rocket science, SpaceX | Leave a comment

Macros in R

via Macros in R

See also:

  • The gtools package of R which enables these.
  • There’s a description and motivation beginning on page 11 of an (old: 2001) R News issue.

They have been around a long time, but I haven’t tried them.

I will.
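In the meantime, here is a minimal sketch of what these macros look like, closely following the setNA example used in the gtools documentation and that R News piece (the data frame here is made up):

library(gtools)

# defmacro() builds a macro: its arguments are substituted into expr and the
# expression is evaluated in the caller's frame, so the data frame passed in
# is modified in place.
setNA <- defmacro(df, var, values, expr = {
  df$var[df$var %in% values] <- NA
})

dat <- data.frame(a = c(1, 2, 3, 99), b = c(4, 99, 5, 6))
setNA(dat, a, 99)
dat   # the 99 sentinel in column a is now NA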

lambda

Quote | Posted on by | Leave a comment

Temperatures, Summers, Germany, ≈ 50.5N to 57.5N latitude

(Click on figure for larger image and use browser Back Button to return to blog.)

Hat tip to Gregor Aisch, Adam Pearce, and Steve Hoey, and sourced from the mashup dataset and visuals by Lisa Charlotte Rost.

Mr Aisch’s innovation was to use Loess regression for the display. Loess is one of a set of local regression methods. I personally prefer p-spline smoothing (penalized spline regression).
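For readers who want to see the difference, here is a minimal sketch in R comparing the two smoothers on synthetic data (this is not the Aisch/Rost analysis; the data and parameters are made up):

set.seed(1)
year <- 1880:2020
temp <- 0.01 * (year - 1880) + 0.2 * sin((year - 1880) / 8) + rnorm(length(year), sd = 0.3)

# Local regression (loess), as in the original graphic
fit.loess <- loess(temp ~ year, span = 0.3)

# Penalized (p-) spline smoothing via mgcv
library(mgcv)
fit.ps <- gam(temp ~ s(year, bs = "ps"), method = "REML")

plot(year, temp, pch = 20, col = "grey60",
     xlab = "year", ylab = "synthetic summer temperature anomaly")
lines(year, predict(fit.loess), col = "blue", lwd = 2)
lines(year, predict(fit.ps), col = "red", lwd = 2)
legend("topleft", legend = c("loess", "p-spline (mgcv)"),
       col = c("blue", "red"), lwd = 2, bty = "n")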

In contrast, Boston is 42.4N, Bangor, Maine is 44.8N, and Montreal, Quebec, Canada is 45.5N.

See also article in Quartz.

Posted in Anthropocene, climate change, data presentation, data science, data visualization, digital art, drought, Germany, global warming, loess, p-spline, penalized spline regression | Leave a comment

“Rising seas erode $15.8 billion in home value from Maine to Mississippi”

From the First Street Foundation‘s press release, with selected figures below. This is based upon the methods described in:

S. A. McAlpine, J. R. Porter, “Estimating recent local impacts of Sea-Level Rise on current real-estate losses: A housing market case study in Miami-Dade, Florida“, (open access) Population Research and Policy Review, December 2018, 37(6), 871–895.

(Click on image to see larger figure and use browser Back Button to return to blog.)

(Click on image to see larger figure and use browser Back Button to return to blog.)
Posted in Anthropocene, catastrophe modeling, coastal communities, coastal investment risks, Commonwealth of Massachusetts, EBC-NE, flooding, floods, Hyper Anthropocene, risk, riverine flooding | Leave a comment

Procrustes tangent distance is better than SNCD

I’ve written two posts here on using a Symmetrized Normalized Compression Divergence or SNCD for comparing time series. One introduced the SNCD and described its relationship to compression distance, and the other applied the SNCD to clustering days at a high school based upon patterns of electricity consumption.

Having good tools for making such comparisons is important, because such bases for clustering and exploration are useful when examining large datasets, like the hydrological datasets I’ve previously described. I am also finally getting around to doing something with these datasets, a project I put off because of my commitments to climate activism over the last few years.

Despite my earlier enthusiasm for SNCD as a tool for series comparisons, it turns out there is a better measure, something called Procrustes tangent distance (“PTD”). I discovered this in the second edition of a book by I. L. Dryden and K. V. Mardia, called Statistical Shape Analysis, with Applications in R (2016) and through related literature and scholarship. A key paper is

J. T. Kent, K. V. Mardia, “Shape, Procrustes tangent projections and bilateral symmetry“, Biometrika, 2001, 88(2), 469-485 (with correction).

PTD is superior because it and related efforts reduce shape comparisons like that of two time series to ordinary multivariate analysis. (See pertinent book by Mardia, J. Kent, and J. Bibby as well.) For purposes of statistical analysis, it’s difficult to get better than that.

This is an outcome of a problem area dubbed Generalized Procrustes Analysis (“GPA”), and arises in applications where biological shapes need to be matched, such as bivalve shells. It also arises in archaeological work where automated methods for matching shards of pottery are engaged. These techniques and problems have deep connections to differential geometry and have engaged other great minds besides Mardia, Dryden, and Kent. PTD may not be the last word. In particular,

C. P. Klingenberg, L. R. Monteiro, “Distances and directions in multidimensional shape spaces: Implications for morphometric applications“, Systematic Biology, 54(4), 1 August 2005, 678–688

reviewed some criticisms of PTD, along with discussion by Dryden and Mardia, with others.

My application is more modest than the general multidimensional shapes problem, being limited strictly to two dimensions, where some of these complications do not arise.

Unfortunately, the details of defining the Procrustes tangent distance are involved. Procrustes analysis begins with the consideration of k m-dimensional landmarks and proceeds to the recovery of a rotation-invariant shape, obtained by maximizing the trace of a product, \text{tr}(\mathbf{A} \mathbf{Q}), involving a symmetric landmarks distance matrix, \mathbf{A}, and a rotation matrix, \mathbf{Q}, over all \mathbf{Q}. The value of the trace and the maximizing rotation are found using the SVD, and that is also used in the practical construction of the PTD.
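As a small illustration of that last point, here is a minimal R sketch (mine; it ignores the detail of restricting to proper rotations, that is, excluding reflections) of how the SVD yields both the maximizing rotation and the value of the trace:

# With M = t(A2) %*% A1 and SVD M = U D V', the trace tr(M Q) over orthogonal Q
# is maximized at Q = V U', and the maximum equals the sum of the singular values.
procrustes.rotation <- function(A1, A2) {
  M <- t(A2) %*% A1
  sv <- svd(M)
  list(Q.hat = sv$v %*% t(sv$u), alpha = sum(sv$d))
}

# Example: two 5-point configurations in the plane
A1 <- matrix(rnorm(10), ncol = 2)
A2 <- matrix(rnorm(10), ncol = 2)
procrustes.rotation(A1, A2)$alpha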

The next step is a linearization by constructing a tangent space, namely the Procrustes tangent space, and an associated tangent matrix, \mathbf{T}, which is constructed as follows. Let \mathbf{A}_{1}, \mathbf{A}_{2} be two k-by-m landmark matrices. Recall these are landmark coordinates in m dimensions, and there are k of them. Find the maximum over rotation matrices \mathbf{Q} of

\text{tr}(\mathbf{A}_{2}^{\top}\mathbf{A}_{1}\mathbf{Q}) = \alpha

Call that maximum point \hat{\mathbf{Q}}. Then

\mathbf{T} = \mathbf{A}_{1} \hat{\mathbf{Q}} - \alpha \mathbf{A}_{2}

and this can be re-expressed, after some algebra, as

\mathbf{A}_{1} = (\cos{(\rho)}\, \mathbf{A}_{2} + \mathbf{T})\hat{\mathbf{Q}}^{\top}

Because of an implicit constraint on \alpha, \rho turns out to be a bounded, non-negative Riemannian distance between \mathbf{A}_{1} and \mathbf{A}_{2} and their shapes. While the equation above could be solved using non-linear minimization, there are more direct approaches sketched in Kent and Mardia. Moreover, my calculations of PTD are obtained by calls to the function procGPA from the shapes package offered by I. L. Dryden.

The article by Klingenberg and Monteiro cited above also gives a qualitative overview.

The insight for applicability to time series comes from this sketch:

Applying the PTD to unique pairs of edges results in:

Note however that the traces in the picture could just as well be three different time series. Accordingly, the PTD for shapes also yields distances between time series.
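A minimal sketch of that idea in R (the mapping of each series to a k-by-2 landmark configuration is purely illustrative, and I use procdist from the shapes package here for a single pair, whereas the full analysis above uses procGPA):

library(shapes)

# Two toy "time series", each treated as a k x 2 configuration of landmarks
# with columns (time, value)
k <- 50
tt <- seq(0, 2 * pi, length.out = k)
series.1 <- cbind(tt, sin(tt))
series.2 <- cbind(tt, sin(2 * tt))

# Riemannian (Procrustes) shape distance between the two configurations
procdist(series.1, series.2, type = "Riemannian")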

Does this generalize, however? Do the distances continue to make sense even when the series differ in other ways?

Consider

(Click image to see a larger figure, and use browser Back Button to return to blog.)

In the labeling atop each, the “L” factor is inversely proportional to slope, except for the zero case, which denotes zero slope. In the same labeling, the “W” factor is inversely proportional to frequency.

What does the PTD produce as distances among these? Note that the larger the number in the following figure, the farther away the cases are:

(Click image to see a larger figure, and use browser Back Button to return to blog.)

The distances show that irrespective of slope, the PTD is picking up ripple trains with the same frequency. Some are annotated.

Note that these distances have been multiplied by 100 to get them into a range where they register well in the plot. What this means is that PTD considers all the cases pretty close to one another in shape. Nevertheless, it is capable of good discriminations.

What does SNCD do with the same 16 cases?

(Click image to see a larger figure, and use browser Back Button to return to blog.)

In short, the divergences are very difficult to reconcile with any pattern of similarity. Even shorter, SNCD butchered it.

Code for calculating these figures and results is available in my Google repository.

Finally, I have repeated the analysis of high school electricity consumption clustering with PTD and found it gave nearly identical results to the use of SNCD.

Posted in data science, dependent data, descriptive statistics, divergence measures, hydrology, Ian Dryden, information theoretic statistics, J.T.Kent, Kanti Mardia, non-parametric statistics, normalized compression divergence, quantitative ecology, R statistical programming language, spatial statistics, statistical series, time series | Leave a comment

“Unpleasant surprises in the greenhouse” (in memoriam, Professor Wallace Broecker)

These are excerpts from a 1987 paper by Professor Wallace Broecker, widely acknowledged to be one of the greatest climate scientists and oceanographers in the last century.



Posted in science | Leave a comment

One possible way to do small, modular nuclear power

Featured in Science Magazine today, NuScale Power, a spinout from Oregon State University, is planning simpler, smaller, safer gang-lashable nuclear reactors, with a trial in the early 2020s. A schematic is shown below.

As I’ve noted here elsewhere, the reason why conventional nuclear reactor designs have a negative learning curve is that the industry did not turn nuclear reactors into commodities, taking advantage of large scale replication.

Despite the unhappiness some have with nuclear power, it is clear that a good solution to most of its ills, including cost and rollout time, would be a godsend for providing the massive amounts of electrical power we need to electrify the entire United States and the world.

I continue to argue that those who oppose such developments on some kind of principle do not understand or appreciate the desperate situation with respect to climate change in which we have placed ourselves, and the soon-to-be-realized consequences.

Posted in Anthropocene, climate, climate business, climate change, climate disruption, electricity, global warming, Hyper Anthropocene, modular nuclear power, nuclear power, zero carbon | 2 Comments

Legacy

It should be noted that exponential growth is a plank in the theoretical framework of modern Ecology. See

L. Pásztor, Z. Botta-Dukát, G. Magyar, T. Czárán, G. Meszéna, Theory-Based Ecology: A Darwinian approach, 2016.

Dr Suzuki points out that, objectively, people are big animals, and the total biomass on Earth due to human beings is quite large. We are also large in terms of our demands upon the natural world, and, in fact, each one of us consumes many times more than the world’s natural carrying capacity for us. This is possible because of technology, and fossil fuels.
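As a toy illustration of that plank (the numbers are made up and purely illustrative), compare unconstrained exponential growth with logistic growth against a carrying capacity:

r <- 0.03       # per-step growth rate
K <- 1e4        # carrying capacity
steps <- 300
N.exp <- N.log <- numeric(steps)
N.exp[1] <- N.log[1] <- 100
for (t in 2:steps) {
  N.exp[t] <- N.exp[t - 1] * (1 + r)                                   # exponential
  N.log[t] <- N.log[t - 1] + r * N.log[t - 1] * (1 - N.log[t - 1]/K)   # logistic
}
matplot(cbind(N.exp, N.log), type = "l", lty = 1, col = c("red", "blue"),
        xlab = "time step", ylab = "population size", log = "y")
legend("topleft", legend = c("exponential", "logistic (carrying capacity K)"),
       col = c("red", "blue"), lty = 1, bty = "n")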

Posted in David Suzuki, ecology, exponential growth, quantitative biology, quantitative ecology | Leave a comment

Professor Kevin Anderson: “Climate’s holy trinity”

24th January 2019, Oxford, England, UK

Appalling failure:

Who is responsible:

Yeah, it’s us.

Posted in climate, climate change, climate disruption, climate grief, global blinding, global warming, Kevin Anderson | Leave a comment

On bag bans and sampling plans

Plastic bag bans are all the rage. It’s not the purpose of this post to take a position on the matter. Before you do, however, I’d recommend checking out this:

and especially this:

and the Woods Hole Oceanographic Institution has many articles about plastics in the oceans.

Good modern governance means having evidence-based decisions. So, if a bag ban of any kind, or a bag tax of any kind is going to be imposed, it makes sense to assess how much and what kinds of use of bags are prevalent before the ban or tax, and how this changes after the ban or tax. This kind of thing used to need to be done with professional surveyors and statisticians. But with the availability of online datasets, access to the experience of others, widely available and open-source computing, and new survey technology and methods, expensive professional options aren’t the only way this can be done. Professional surveyors tend to argue otherwise. But, facts are, you can learn a lot by using Google Earth and Google Maps these days.

Surveys are designed around answering specific questions. If the objective is to estimate how many bags of one kind or another are being consumed per week in a town or county, that’s one question. If the objective is to estimate how many people regularly choose paper over plastic, or bring-their-own-bags, that’s an entirely different question. The governance and the group need to choose what’s important to them.

Surveys are also designed around the skillsets of the people involved in conducting them. With a volunteer organization, it is important that the procedure be something they can readily be trained in, and I say “trained” because no survey can do without training, however simple.

Surveys also ought to be easy on the surveyors, especially if they are volunteers. The requirements of when they need to be on station oughtn’t be so onerous that they might not arrive on time, or not show up at all, and, worse, misrepresent to the group what happened. So, for instance, even if there are shoppers using bags in a store at 6:00 a.m., it’s probably not going to get covered well if a sampling plan were to require it.

Surveys also ought to be easy to explain to those who want to know how they were done. Along with this, it is critically important that, as part of an analysis of the primary quantities of interest, like plastic bags used per week, the survey’s contribution to overall uncertainty is quantified.

All that noted, there is a lot interested and committed citizens can do to gather data like this, and interpret results. This can be important, whether or not a town or county governance considers its findings as inputs. It can serve as a check on their result. It can also serve as a check on their budget, raising the question of why they paid some expensive professional organization to do something when something good enough for the purpose could have been had much cheaper. That said, any old surveying or sampling technique which appears to be good enough isn’t good enough. That is, there is some training and learning involved.

Returning to the bag ban matter, use of bags is key to the project. As with any policy, if a regulation is imposed and there is no evidence it helps or has untoward consequences, it ought to be revoked. To do that means measuring a baseline, and then measuring after the regulation is in place. It probably is a good idea to measure a couple of times after the regulation is in place. In statistics and engineering, this general approach is called A/B testing, which is explained better here. As mentioned above, how one gets counts — they are nearly always counts — and then analyzes them depends very much on the question being asked.
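To make the A/B idea concrete, here is a minimal sketch (the counts are entirely made up) of one way before-and-after counts might be compared, if the question is the share of shoppers choosing plastic:

# Hypothetical counts of shoppers observed choosing plastic bags,
# before and after an ordinance
plastic  <- c(before = 412, after = 187)
shoppers <- c(before = 890, after = 905)

# Two-sample test of equal proportions, with a confidence interval
# for the change in the plastic-choosing share
prop.test(plastic, shoppers)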

But, in the case of bags, there’s the really important question of where and when to sample. In this case, I’m setting aside bags given out in stores other than grocery stores. And, in this case, I’m using the example of my home town, Westwood, Massachusetts.

Counting noses or bags assumes there’s a sampling frame in hand. In Westwood’s case, the concern is the population of residents or visitors frequenting local grocery stores. And in Westwood’s case, there’s a desire to count people and their preferences for bags, whether plastic, paper, some mix, or whether they bring their own bags and some mix, as well as size of order. So this means counting people.

There are three grocery stores in Westwood: Roche Brothers in Islington, Wegman’s at University Station, and Lambert’s. There are convenience and other stores which sell small amounts of groceries, but these were assumed to show behaviors like those exhibited by the populations of the three majors. But, still, surveying these stores either demands deep cooperation on the part of their owners, an outrageous commitment on the part of volunteers, or a sampling scheme that is constructed with knowledge of who goes where when. Where to find such a thing?

Google.

Most substantial grocery store entries on Google Maps now present a bar chart of when they are most frequently visited. It looks like this:

Now, I’ve discovered that that dashed line is a fixture marking a certain number of visits per hour. It is constant for a given store across days of the week. And it is at least roughly consistent across stores in an area. This is great. This was helped, in part, by a visit to one of the stores by a volunteer to take data for a half hour. She was collecting data for me, and was also trying out a data collection form and seeing how difficult or easy it was to get the kind of data that was pertinent. Doing this is an excellent idea.

But, wait, you say: These aren’t numbers. It’s a bar chart.

Digitizing. I learned this when I took courses in Geology. An amazing amount of data is recovered by digitizing figures in scientific journals. Why not Google?

There are several digitizing applications out there. I’ve tried a couple and, so far, I like WebPlotDigitizer best. So, I did. Digitize, that is. How?

Here I’ve marked, by hand, two points on the bar chart, attempting to ascertain the height of the dashed line in pixels from the baseline. Note the original images aren’t produced to the same resolution or size, so it’s important to calibrate each one. In the upper right you can see WebPlotDigitizer‘s close-up of the place where the cursor is. That’s a little hard to see, since there’s so much real estate there, but here’s a close-up of the bottom:

And here’s a close-up of the upper right, a close-up of a close-up:

The completed digitization looks like this:

and results in a .csv file which looks partly like:

There look to be extra points in the digitization, which I’ll explain. It is important to note that the code I reference later which is available to the public demands digitization be done in this style. That code has no other documentation. I don’t give a recipe. That said, it’s not difficult to figure out.

The first point I take is the baseline, not in any of the bars of the bar plot. The next point I take is on the dashed upper score. I then do two bars, taking the baseline of the bar, and the upper horizontal of the bar. The rest of the bars have only their upper horizontal marked. The point is to get a good estimate of the baseline, obtained as an average of three baseline observations, the initial outside of bars, and then from two bars. There is one observation on the upper score, and then there are observations of the upper horizontals for the rest of the bars.

The heights of the bars can be estimated from the difference between the reading for their upper bars and the estimate of the baseline as the mean of three observations. These can be divided by the distance between the estimate of baseline and the upper score in order to calculate a portion of the range to upper score.

Note that because these are pixel coordinates, the ordinate values of the observations higher up on the bar plot are lower in coordinates than, say, the baselines. This is because distances on the ordinate are measured (ultimately) as pixels from the top of the image. Accordingly, some distance calculations need to have their signs reversed.
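To make that arithmetic concrete, here is a simplified sketch (the file name and the exact ordering of the digitized rows are my assumptions for illustration; the repository code is the authoritative version):

# A WebPlotDigitizer export: one (x, y) pixel pair per row
pts <- read.csv("roche_monday.csv", header = FALSE,
                col.names = c("x.pix", "y.pix"))

# Assumed marking order: row 1 the baseline outside the bars, row 2 the dashed
# upper score, rows 3-4 the base and top of the first bar, rows 5-6 the base
# and top of the second bar, remaining rows the tops of the other bars
baseline    <- mean(pts$y.pix[c(1, 3, 5)])   # baseline as a mean of three readings
upper.score <- pts$y.pix[2]

# Pixel y is measured from the top of the image, so heights come from
# subtracting the bar-top readings from the baseline (the sign reversal)
bar.tops <- pts$y.pix[c(4, 6, 7:nrow(pts))]
heights  <- (baseline - bar.tops) / (baseline - upper.score)
round(heights, 2)   # each bar as a fraction of the dashed-line level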

The accompanying R code reads in these .csv files and then extracts the heights for each of the hours in a day. There is a system for the Westwood case, which you can understand by reading the code, where each of the separate stores’ files is assimilated into a single corresponding matrix of scores versus hour of day and day of week.

In the end what’s in hand is a matrix of values proportional to numbers of visits to the stores. Calibration and actual counts have indicated that a value of unity corresponds to about 140 visitors.

Now that traffic to stores is available, or at least something proportional to traffic, it is a matter of constructing a sampling plan. A plan which samples in proportion to traffic makes the most sense. This is equivalent to sampling time intervals where the probability of selecting an interval is proportional to the estimated traffic in the interval. For this study, the surveyors expressed a desire not to be surveying more than 60-90 minutes at a time. I settled for 60 minutes. So the question became one of finding a set of samples of individual hours for a store, weighted by the probability of traffic.

The melt.array function of the R reshape2 package was handy here, and I was able to use the sampling-without-replacement of the R built-in sample to achieve the appropriate selection. The volunteers had a strict constraint on the total number of times they wanted to visit stores. The code in the R file generateSamplingPlan.R produces several options, based upon the setting of the N.stage1 variable. They also did not want to survey before 9:00 a.m. or after 10:00 p.m.
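A minimal sketch of that step (the array, its dimensions, and the variable names other than N.stage1 are made up for illustration; generateSamplingPlan.R is the authoritative version):

library(reshape2)

# traffic: a store x day x hour array of values proportional to visits
set.seed(42)
traffic <- array(runif(3 * 7 * 24), dim = c(3, 7, 24),
                 dimnames = list(store = c("RocheBros", "Wegmans", "Lamberts"),
                                 day   = c("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"),
                                 hour  = 0:23))

slots <- melt(traffic, value.name = "traffic")   # one row per (store, day, hour)
slots <- subset(slots, hour >= 9 & hour <= 21)   # hour-long slots from 9:00 a.m., ending by 10:00 p.m.

N.stage1 <- 12   # total store visits the volunteers agreed to
picked <- slots[sample(nrow(slots), size = N.stage1, prob = slots$traffic), ]
picked[order(picked$store, picked$day, picked$hour), ]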

The result is a sampling plan which looks a little like this:

The code and data supporting this post are available in a repository. Note that it is live, and exists to support an ongoing project, so there is no promise of stability. Note, however, that it is subject to Google’s version control system.

So, what happens after the regulation or ordinance is adopted? What’s the sampling plan to find out how things are going?

At first it seems that simply repeating the days and times would be the best. On the other hand, remember that the sampling plan designed was intended to expose data collection to as representative a set of people as could be had given the constraints on that sampling. So, in principle, it shouldn’t harm at all to generate new sampling plans with the same constraints, ones which, invariably, will give other times and days. They are all vehicles for getting at what the population prefers.

Posted in bag bans, citizen data, citizen science, Commonwealth of Massachusetts, Ecology Action, evidence, Google, Google Earth, Google Maps, goverance, lifestyle changes, microplastics, municipal solid waste, oceans, open data, planning, plastics, politics, pollution, public health, quantitative ecology, R, R statistical programming language, reasonableness, recycling, rhetorical statistics, sampling, sampling networks, statistics, surveys, sustainability | 1 Comment

A lagomorph has an idea which might save the world

Eli, who offers a clever and consistent consumption-based accounting scheme.

  1. Consumption-based Carbon accounting: Does it have a future?
  2. Consumption-based accounting of CO2 emissions

Aside | Posted on by | Leave a comment

“Renewables are set to penetrate the global energy system more quickly than any fuel in history” (BP, 2019 Energy Outlook)

Selections from BP Energy Outlook: 2019 edition:

In the ET scenario, the costs of wind and solar power continue to decline significantly, broadly in line with their past learning curves.

To give a sense of the importance of technology gains in supporting renewables, if the speed of technological progress was twice as fast as assumed in the ET scenario, other things equal, this would increase the share of renewables in global power by around 7 percentage points by 2040 relative to the ET scenario, and reduce the level of CO2 emissions by around 2 Gt.

The impact of these faster technology gains is partly limited by the speed at which existing power stations are retired, especially in the OECD.

If, in addition to faster technological gains, policies or taxes double the rate at which existing thermal power stations are retired relative to the ET scenario, the reduction in emissions is doubled.

This suggests that technological progress without other policy intervention is unlikely to be sufficient to decarbonize the power sector over the Outlook. The ‘Lower carbon power’ scenario described below considers a package of policy measures aimed at substantially decarbonizing the global power sector.

The extent to which the global power sector decarbonizes over the next 20 years has an important bearing on the speed of transition to a lower-carbon energy system.

In the ET scenario, the carbon intensity of the power sector declines by around 30% by 2040. The alternative ‘Lower-carbon power’ (LCP) scenario considers a more pronounced decarbonization of the power sector.

This is achieved via a combination of policies. Most importantly, carbon prices are increased to $200 per tonne of CO2 in the OECD by 2040 and $100 in the non-OECD – compared with $35-50 in OECD and China (and lower elsewhere) in the ET scenario.

Carbon prices in the LCP scenario are raised only gradually to avoid premature scrapping of productive assets.

There is one gloomy projection. Despite the progress on the world scene,

The share of renewables in the US fuel mix grows from 6% today to 18% by 2040.

If that were to come true, in the context of these other changes, it is possible the United States would be regarded as a pariah state and have economic sanctions imposed upon it. But … these projections have several built-in assumptions. Recall, BP is a bit like the U.S. Energy Information Administration and the international IEA in that they are established bureaucracies of forecasters. Both the EIA and the IEA have systematically underestimated the acceleration in solar and wind adoption over the last decade.

Also, it is telling that BP attributes the slowness with which wind and solar energy displace fossil fuel generation to the capital costs of retiring existing generation and replacing it. There are two points here.

First, the incremental capital costs for substituting solar+wind+storage for the same unit of fossil fuel energy are much smaller, as long as the accounting is done correctly. In particular, the costs to society are not just the generating plant, but the capital infrastructure needed to mine and bring the fuel to the point of combustion. There are also tremendous Sankey losses associated with Carnot cycle energy production. (See also.) That’s wasted money.

Second, the BP analysis clearly assumes the market and business structure for providing such energy remains intact. That assumption is big, one akin to assuming that there will always be a Sears and always be a Kodak. If, in fact, there are energy sources available at much lower costs per kWh or BTU, the market isn’t going to care about the sunk costs of existing players. It will go around them, and they will either seek government subsidies to remain intact, or economically die.

So, the “pariah state” outcome for the United States is too gloomy. I, instead, see a United States whose economic productivity might be increasingly assaulted by challenges from climate change, including impacts to personal wealth and, so, unwillingness to consume at rates comparable to before, direct damage to productive capacity, including extensive damage to supply chains within the country and to basic infrastructure that permits people to get to their jobs, and costs in insurance and of doing business. But, I also see a hunger for cheaper everything, especially energy, and a thriving market willing to supply that with wind and solar and storage, widely distributed, overcoming zoning and other objections because many people have abandoned suburbs due to affordability and proximity to work, and because the cost of energy from zero Carbon sources is so much lower, roughly a tenth of the comparable cost from fossil fuel sources.

It’s one thing to be a zealot for fossil fuels. It’s something else to pass up paying only 10% of the cost of something in order to pursue that zealotry.

Posted in Anthropocene, being carbon dioxide, Bloomberg New Energy Finance, BNEF, BP, bridge to somewhere, Carbon Tax, clean disruption, CleanTechnica, climate change, climate disruption, corporate citizenship, corporate litigation on damage from fossil fuel emissions, decentralized electric power generation, decentralized energy, ecomodernism, economic trade, ecopragmatist, fossil fuel divestment, fossil fuel infrastructure, global warming, Hyper Anthropocene, investing, investment in wind and solar energy, investments, local generation, local self reliance, solar democracy, solar domination, solar energy, solar power, the energy of the people, the green century, the right to be and act stupid, the right to know, the value of financial assets, Tony Seba, tragedy of the horizon, utility company death spiral, wind energy, wind power, zero carbon | Leave a comment

Tit-For-Tat in Repeated Prisoner’s Dilemma: President Donald Trump creates the Green New Deal

Jonathan Zasloff at Legal Planet offers “Donald Trump creates the Green New Deal”. The closing excerpt:

But what goes around comes around. A President Harris, or Warren, or Booker, etc. etc. can just as easily declare a National Emergency on Climate Change — one that would have a far better factual predicate than Trump’s patently false border emergency — and he or she will have a lot more money to move around. After all, a lot of the climate crisis is about infrastructure, and if the relevant statute allows the President to move money from one project to another, then it is very easy to do that. Or the $100 billion that DOD has for national security emergencies: given that both the Pentagon and the heads of the national intelligence agencies have already said that climate represents a serious national security challenge, it’s not a hard legal lift (assuming intellectually honest and consistent judges, which of course we cannot). This fund must be for a military purpose, and a smarter, more energy efficient energy grid could do the trick.

It’s no way to run a democracy. But Trump and the GOP have made it clear that they do not believe in democracy, and as Robert Axelrod demonstrated years ago in his classic book The Evolution of Cooperation, the best strategy in repeat-player games to facilitate cooperation is playing Tit-For-Tat.

See also Generous Tit-For-Tat.
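For readers who have not met Axelrod’s result, here is a toy sketch (mine, not from either post) of the repeated Prisoner’s Dilemma and the Tit-For-Tat strategy:

# Standard one-shot payoffs: T = 5, R = 3, P = 1, S = 0
payoff <- function(me, them) {
  if (me == "C" && them == "C") 3
  else if (me == "C" && them == "D") 0
  else if (me == "D" && them == "C") 5
  else 1
}

tit.for.tat   <- function(their.history) if (length(their.history) == 0) "C" else tail(their.history, 1)
always.defect <- function(their.history) "D"

play <- function(s1, s2, rounds = 200) {
  h1 <- character(0); h2 <- character(0); score <- c(0, 0)
  for (i in seq_len(rounds)) {
    m1 <- s1(h2); m2 <- s2(h1)
    score <- score + c(payoff(m1, m2), payoff(m2, m1))
    h1 <- c(h1, m1); h2 <- c(h2, m2)
  }
  score
}

play(tit.for.tat, tit.for.tat)     # mutual cooperation: both score well
play(tit.for.tat, always.defect)   # Tit-For-Tat loses one round, then retaliates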

Update, 2019-02-18

Dan Farber writes on “National Security, Climate Change, and Emergency Declarations” at Legal Planet that:

If the Supreme Court upholds Trump, it will have to uphold an emergency declaration for climate change.

One reason why it would be hard for the Supreme Court to overturn a climate change declaration is that some attributes of climate change and immigration are similar. Both issues involve the country’s relations with the outside world, an area where presidential powers are strong. But it isn’t as if we suddenly found out about border crossings or climate change. Given these similarities, it would be very difficult for the conservative majority to explain why it was deferring to the President in one case but not the other.

The only major difference actually cuts strongly in favor of an emergency declaration for climate change: The U.S. government has already classified climate change as a serious threat to national security, and it is a threat that is getting stronger daily. Recent science indicates that climate action is even more urgent than we thought.

Trump’s stated justification in his proclamation is that “the problem of large-scale unlawful migration through the southern border is long-standing, and despite the executive branch’s exercise of existing statutory authorities, the situation has worsened in certain respects in recent years.” Climate change, too, is a “longstanding problem,” and it certainly has gotten worse despite the effort of the executive branch (Obama) to address the problem. Federal agencies, as well as Congress, have made it clear that climate is a serious threat to our nation.

Posted in climate change, game theory, global warming, Green New Deal | Leave a comment