## You can, too!

One of my many favorite videos by Climate Adam:

Here’s another:

Here’s one reason why:

Backing this up:

## October 2013 retrospective … Karl Rábago on ‘Talk Solar’ podcast, regarding value of solar generation

In October 2013, Karl Rábago was interviewed on the Talk Solar podcast by Beth Bond of Decatur, GA. This was shortly after the first version of the Value of Solar report was issued by IREC. Listen to it below:

This presentation is particularly valuable for people, such as municipal regulators, who sometimes interact with, purchase from, and make decisions affected by utility company practices, yet do not know where many of those practices come from, or how much of a business-model disruption distributed PV generation and storage poses.

#### ‘Wärtsilä introduces new hybrid solar PV and storage solution’

###### (Image courtesy of Wärtsilä, and you can read more about the above solution here.)

Readers may notice the PV farm in the figure above was placed in a sparsely treed area, resulting in trees being cut down. An interesting discussion might ensue, either here in the comments or in a future post, if there is enough interest, regarding the costs and benefits of substituting large scale PV farms for a relatively undisturbed natural ecosystem.

Another interesting question is whether losing ground to zero Carbon energy generation is indirectly a cost of failing to address and price in climate change. Companies and people look to Carbon pricing and Carbon taxes as the most direct effects, or to rate increases that pay for zero Carbon incentives. On the other hand, to the degree that zero Carbon energy is, hands down, the best long term bet an energy investor or consumer can make (see below), companies which ignore this, particularly utilities, have a lot to lose.

## Repeat of Long Mill 1, on a moderately warm day

###### (Click on map to be taken to my Ride with GPS site where you can interact with the route display.)

I am, by the way, steadily changing my displays to present data in Metric Units rather than English Units. I began with temperatures, and now I’m moving on to distances and speeds. I want to get good enough to have a sense of how far, say, 12 km is without converting to miles or feet.

## “Not ready to make nice” (Dixie Chicks)

I stick by my friends in these hard times:

## Another reason why the future of Science and STEM education in the United States is cloudy

From Nature’s “Universities spooked by Trump order tying free speech to grants”, with the subheading “White House policy will require universities to certify that they protect free speech to remain eligible for research funding”, comes this chilling news:

US President Donald Trump signed an executive order on 21 March that requires universities to certify that they protect free speech, or risk losing federal research funds.

Public institutions will have to certify that they are following free-speech protections laid out in the First Amendment of the US Constitution, and private institutions must promise to follow their stated policies on free speech, a White House official told reporters on 21 March.

The order applies to 12 research agencies, including the National Institutes of Health, the National Science Foundation, the Department of Energy and NASA. It affects only money for research, not financial aid for students.

“We’re dealing with billions and billions and billions of dollars,” Trump said in a speech just before signing the order. “Taxpayer dollars should not subsidize anti-First Amendment institutions.” He said that the order was the first in a series of steps that his administration intends to take to “defend students’ rights”.

Clearly, this is an attempt to magnify the pseudo-standard of “fair and balanced”, so badly invoked in media, elevating unsubstantiated and illogical claims from scientifically illiterate and innumerate minorities to the status of powerful political voices. Witness the collective treatment of climate change.

Worse, though, it is another encroachment by a hitherto economically unsuccessful populist world view, one which amounts to large-scale sour grapes, upon the Success Centers of United States culture. These are overwhelmingly Blue, self-made, urban, and diverse, even if they still allocate their wealth unfairly. It is an extended exercise in spiting oneself, for, without these technologies, military safety and economic success won’t continue.

But, people aren’t going to wait for that to be rectified in some hypothetical future — and probably Democratic — administration. This is a dynamic business world, and people seek their own comfortable surroundings and fortunes.

And, so, there is a Brain Drain beginning from the United States to elsewhere. (This is also known as human capital flight.) At first, it was limited to the rejection, or the discomfiting, of brilliant and ingenious technical entrepreneurs from India and Pakistan and China, who thought nothing better than to come to what once was the haven and incubator for free enterprise and free ideas, and to found a fortune. But now even the best and brightest of native-born Americans, young bright minds and spirits who know how to succeed, are beginning to see the rest of the world as more inviting and accommodating, and are making the hard choice to uproot, to go, to emigrate.

I applaud them for their foresight. The idea of blind loyalty despite cultural sins and political idiocy is itself idiotic. It is not living, it is a self-deprecating religion.

And, so, I was not at all surprised that Nature also carried an extended article chronicling how five scientists had wrestled with the idea of moving to another country to improve their futures.

See:

## Result of our own fiddling: Bob Watson and climate risk

https://sms.cam.ac.uk/media/746045

Professor Bob Watson, University of East Anglia, presents the summary risk, climate change:

The question is not whether the Earth’s climate will change in response to human activities, but when, where and by how much. Human activities are changing the Earth’s climate and further human-induced climate change is inevitable. Indeed the climate of the next few decades will be governed by past emissions. The most adverse consequences of human-induced climate change will be in developing countries and poor people within them. Climate change threatens to bring more suffering to the one billion people who already go to bed hungry every night and the approximately 2 billion people exposed to insect-borne diseases and water scarcity. Sea level rise threatens to displace tens of millions of people in deltaic areas and low-lying small island states. Climate change will undermine the ability of many poor people to escape poverty and the long-term sustainable economic development of some countries. Hence, climate change is not only an environmental issue, but a development and security issue. The challenge is to limit the magnitude and rate of human-induced climate change, and simultaneously reduce the vulnerability of socio-economic sectors, ecological systems and human health to current and projected climate variability by integrating climate concerns into local and national economic planning. Technological options for reducing greenhouse gas emissions cost-effectively over the next few decades already exist. However, the required transition to a very low carbon economy (a reduction in global emissions by at least 50% by 2050) will require a technological evolution in the production and use of energy, energy sector reform, appropriate pricing policies and behavior change, coupled with a more sustainable agricultural sector and reduced deforestation. 
This transition to a low-carbon economy must be achieved while improving access to affordable energy in developing countries, which is critical for economic growth and poverty alleviation, and while ensuring adequate affordable and nutritious food. The challenge is to negotiate a long-term (up to 2050) global regulatory framework that is equitable with common but differentiated responsibilities and has intermediate targets that can reduce greenhouse emissions to a level that limits the increase in global mean surface temperature to 2°C above pre-industrial levels. While this goal has been widely accepted, the current rate of growth in emissions globally, coupled with a failure in Copenhagen to agree to stringent targets to reduce emissions, makes this goal extremely difficult, hence the world needs to be prepared to adapt to a 4°C warmer world.

## Welcome to snowy New England … Bad place for solar PV, right?

And this is ISO-NE which, as recently as three years ago, was highly sceptical that anything other than additional natural gas generation could supply the ever-increasing electrical power needs of the region, particularly with the withdrawal of oil, coal, and nuclear generation scheduled for the period.

Oh. So, perhaps, maybe, Professor Tony Seba had it right all along. What a concept.

Welcome to New England. Bad place for solar PV, right? So why can’t you make it work, Texas, or South Carolina, or Florida, or Georgia, or North Carolina, or Arizona? What, are you dumb or something?

Hat tip to S&P Global for the original article.

## One of the happiest two hours I’ve spent in months: A Professor Tony Seba update

From end of 2018:

from alianza FiiDEMAC.

And, indeed, it was one of the most uplifting two hours I’ve recently spent. I have long been an admirer of Professor Tony Seba. I have read his books. This was an update on how he now sees the world.

As someone who embraces the legal logic of the Juliana v United States lawsuit, I do not have much confidence in politics being able to mitigate climate disruption. Both political parties in the United States have been repeatedly warned of the consequences of continuing the policy of mining and emitting and their inevitable disruptions of the climate. And, while, technically, United States emissions have plateaued, this is a result of our collective exporting our manufacturing emissions to China.

So, politically, efforts to mitigate climate change, in the United States, but not only in the United States, but also in the OECD, have been an abysmal failure. How depressing. And the death throes of the so-called Green New Deal do not inspire.

I have stated my problems with matters as they are. (Context.) I am pessimistic that the last branch of the United States government will intervene appropriately. They haven’t shown enthusiasm.

And, as I made clear in my statement, this is not a cause for despair. There will be a response. Unfortunately, by the abrogation of interest and concern on the part, firstly, of the general public in the matter, the displacements in jobs, social equity, and wealth which will inevitably occur by their collective lack of engagement will be painful. Nevertheless, this disruption will happen, since economics, at least in OECD countries, are primary.

Climate change will be mitigated, perhaps a bit late, and probably with an incredible loss of present wealth, because of bad bets on the part of the wealthy. I really do not find any reason to sympathize with them. I believe the less privileged won’t be impacted any more than they usually are, and, in the dissolution of wealth which will inevitably occur, they may have opportunities they did not have previously. In any case, the presumed omniscience on the part of the Haves over the Have Nots in United States society should be destroyed in concept, although the ignorance of some publics regarding our present leadership gives me some pause in this conclusion.

In any event, I feel this change is inexorable, not, as Professor Tony Seba repeatedly emphasizes, because of do-good environmental policies, but because the time of zero Carbon energy and smart distribution of it via computation has arrived.

And, frankly, as uncharitable as the opinion might seem, I have zero commiseration with those who opposed the advance of such zero Carbon energy, whether that means they lose their jobs, lose their investments, or cannot provide for their offspring. They are the reason why, after more than 20 years of knowing about climate change, we have collectively done nothing, and, in the process, thrown doubt at Science and Engineering and Mathematics. They deserve no sympathy, and no consideration. Let them be a lesson.

It is also notable that the electorate should be highly cautious of urgings on the part of fossil fuel interests, including extraction companies as well as their supporters, to reimburse them for losses relating to this disruption. There is ample evidence they saw what was coming and chose to oppose it rather than adapting to it. That was a choice. That was their right. But they should not be given a penny because they chose wrongly. There is nothing more fundamental to free market capitalism than the principle that those who make bad bets should bear the full cost of making those bad bets.

## “Ridiculously well-designed rockets”, not to mention some seriously awesome Mathematics

I’m just amazed by the quality of their control systems, understated in the video, but absolutely critical to success.

For more technical details, see:

B. Açıkmeşe, J. M. Carson III, L. Blackmore, “Lossless convexification of nonconvex control bound and pointing constraints of the soft landing optimal control problem”, IEEE Transactions on Control Systems Technology, 21(6), November 2013.

## Macros in R

### via Macros in R

• The gtools package of R, which enables these.
• There’s a description and motivation beginning on page 11 of an (old: 2001) R News issue.

They have been around a long time, but I haven’t tried them.

I will.
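A minimal sketch of what such a macro looks like, following the classic `setNA` example from that R News article (assuming the gtools package is installed):

```r
library(gtools)

# defmacro() substitutes its arguments textually into the body before
# evaluation, so the macro can assign into the caller's own data frame --
# something an ordinary function cannot do without extra machinery.
setNA <- defmacro(df, var, values, expr = {
  df$var[df$var %in% values] <- NA
})

d <- data.frame(x = c(1, -99, 3))
setNA(d, x, -99)   # the -99 sentinel in d$x becomes NA, in d itself
```

Because the substitution is textual, `df$var` really becomes `d$x` at the call site, which is the whole point of a macro over a function here.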

## Temperatures, Summers, Germany, ≈ 50.5N to 57.5N latitude

###### (Click on figure for larger image and use browser Back Button to return to blog.)

Hat tip to Gregor Aisch, Adam Pearce, and Steve Hoey, and sourced from the mashup dataset and visuals by Lisa Charlotte Rost.

Mr Aisch’s innovation was to use Loess regression for the display. Loess is one of a set of local regression methods. I personally prefer p-spline smoothing (penalized spline regression).
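For the curious, here is a minimal sketch contrasting the two smoothers on synthetic data; the mgcv package and its `"ps"` basis stand in for p-spline smoothing (this is an illustration, not Mr Aisch’s code):

```r
library(mgcv)

# Noisy synthetic series.
set.seed(1)
x <- seq(0, 10, length.out = 200)
y <- sin(x) + rnorm(200, sd = 0.3)

# Local regression (loess) versus a penalized spline fit.
fit.loess <- loess(y ~ x, span = 0.3)
fit.ps    <- gam(y ~ s(x, bs = "ps"), method = "REML")

smooth.loess <- predict(fit.loess)  # fitted values at the design points
smooth.ps    <- as.numeric(predict(fit.ps))
```

The p-spline version chooses its smoothing penalty by REML, where loess leaves the `span` to the analyst.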

In contrast, Boston is 42.4N, Bangor, Maine is 44.8N, and Montreal, Quebec, Canada is 45.5N.

## “Rising seas erode $15.8 billion in home value from Maine to Mississippi”

From the First Street Foundation’s press release, with selected figures below. This is based upon the methods described in:

S. A. McAlpine, J. R. Porter, “Estimating recent local impacts of Sea-Level Rise on current real-estate losses: A housing market case study in Miami-Dade, Florida”, (open access) Population Research and Policy Review, December 2018, 37(6), 871–895.

###### (Click on image to see larger figure and use browser Back Button to return to blog.)

###### (Click on image to see larger figure and use browser Back Button to return to blog.)

## Procrustes tangent distance is better than SNCD

I’ve written two posts here on using a Symmetrized Normalized Compression Divergence or SNCD for comparing time series. One introduced the SNCD and described its relationship to compression distance, and the other applied the SNCD to clustering days at a high school based upon patterns of electricity consumption. Having good tools for making such comparisons is important, because such bases for clustering and exploration are useful when examining large datasets, like the hydrological datasets I’ve previously described. I am also finally getting around to doing something with these datasets, a project I put off because of my commitments to climate activism over the last few years.

Despite my earlier enthusiasm for SNCD as a tool for series comparisons, it turns out there is a better measure, something called Procrustes tangent distance (“PTD”). I discovered this in the second edition of a book by I. L. Dryden and K. V. Mardia, called Statistical Shape Analysis, with Applications in R (2016), and through related literature and scholarship. A key paper is J. T. Kent, K. V. Mardia, “Shape, Procrustes tangent projections and bilateral symmetry”, Biometrika, 2001, 88(2), 469–485 (with correction).
PTD is superior because it and related efforts reduce shape comparisons, like that of two time series, to ordinary multivariate analysis. (See the pertinent book by Mardia, J. Kent, and J. Bibby as well.) For purposes of statistical analysis, it’s difficult to get better than that. This is an outcome of a problem area dubbed Generalized Procrustes Analysis (“GPA”), which arises in applications where biological shapes need to be matched, such as bivalve shells. It also arises in archaeological work where automated methods for matching shards of pottery are engaged. These techniques and problems have deep connections to differential geometry and have engaged other great minds besides Mardia, Dryden, and Kent.

PTD may not be the last word. In particular, C. P. Klingenberg, L. R. Monteiro, “Distances and directions in multidimensional shape spaces: Implications for morphometric applications”, Systematic Biology, 54(4), 1 August 2005, 678–688, reviewed some criticisms of PTD, along with discussion by Dryden and Mardia, with others. My application is more modest than the general multidimensional shapes problem, being limited strictly to two dimensions, where some of these complications do not arise.

Unfortunately, the details of defining the Procrustes tangent distance are involved. Procrustes analysis begins with the consideration of $k$ $m$-dimensional landmarks and proceeds to the recovery of a rotationally invariant shape, obtained by maximizing the trace of a product, $\text{tr}(\mathbf{A} \mathbf{Q})$, involving a symmetric landmarks distance matrix, $\mathbf{A}$, and a rotation matrix, $\mathbf{Q}$, over all $\mathbf{Q}$. The value of the trace and the maximizing rotation are found using the SVD, and that is also used in the practical construction of the PTD. The next step is a linearization by constructing a tangent space, namely the Procrustes tangent space, and an associated tangent matrix, $\mathbf{T}$, which is constructed as follows.
Let $\mathbf{A}_{1}, \mathbf{A}_{2}$ be two $k$-by-$m$ landmark matrices. Recall these are landmark coordinates in $m$ dimensions and there are $k$ of them. Find the maximum over rotation matrices $\mathbf{Q}$ of

$\text{tr}(\mathbf{A}_{2}^{\top}\mathbf{A}_{1}\mathbf{Q}) = \alpha$

and call the maximizing point $\hat{\mathbf{Q}}$. Then

$\mathbf{T} = \mathbf{A}_{1} \hat{\mathbf{Q}} - \alpha \mathbf{A}_{2}$

and this can be re-expressed, after some algebra, as

$\mathbf{A}_{1} = (\cos(\rho)\, \mathbf{A}_{2} + \mathbf{T})\, \hat{\mathbf{Q}}^{\top}$

Because of an implicit constraint on $\alpha$, identifying it with $\cos(\rho)$, $\rho$ turns out to be a bounded, non-negative Riemannian distance between $\mathbf{A}_{1}$ and $\mathbf{A}_{2}$ and their shapes. While the equation above could be solved using non-linear minimization, there are more direct approaches sketched in Kent and Mardia. Moreover, my calculations of PTD are obtained by calls to the function procGPA from the shapes package offered by I. L. Dryden. The article by Klingenberg and Monteiro cited above also gives a qualitative overview.

The insight for applicability to time series comes from this sketch:

Applying the PTD to unique pairs of edges results in:

Note, however, that the traces in the picture could just as well be three different time series. Accordingly, the PTD for shapes also yields distances between time series.

Does this generalize, however? Do the distances continue to make sense even when the series differ in other ways? Consider:

###### (Click image to see a larger figure, and use browser Back Button to return to blog.)

In the labeling atop each, the “L” factor is inversely proportional to slope, except for the zero case, which denotes zero slope. Likewise, the “W” factor is inversely proportional to frequency. What does the PTD produce as distances among these?
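As a concrete illustration of the mechanics (not the code from my repository), Procrustes distances between series treated as two-dimensional landmark configurations can be obtained with the shapes package; the series here are synthetic:

```r
library(shapes)

# Three synthetic series, each represented as 50 landmarks in 2 dimensions:
# (time, value) pairs.
t  <- seq(0, 2 * pi, length.out = 50)
s1 <- cbind(t, sin(t))             # base ripple
s2 <- cbind(t, sin(2 * t))         # doubled frequency
s3 <- cbind(t, sin(t) + 0.1 * t)   # same frequency, mild added slope

# Riemannian (rho) shape distances between pairs of configurations.
d12 <- procdist(s1, s2, type = "Riemannian")
d13 <- procdist(s1, s3, type = "Riemannian")
# One might expect d13 < d12: a mild slope perturbs shape less than
# doubling the ripple frequency.
```

The Riemannian distance is bounded above by $\pi/2$, consistent with $\rho$ as described above.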
Note that the larger the number in the following figure, the farther away the cases are:

###### (Click image to see a larger figure, and use browser Back Button to return to blog.)

The distances show that, irrespective of slope, the PTD is picking up ripple trains with the same frequency. Some are annotated. Note that these distances have been multiplied by 100 to get them into a range where they register well in the plot. What this means is that PTD considers all the cases pretty close to one another in shape. Nevertheless, it is capable of good discriminations.

What does SNCD do with the same 16 cases?

###### (Click image to see a larger figure, and use browser Back Button to return to blog.)

In short, the divergences are very difficult to reconcile with any pattern of similarity. Even shorter, SNCD butchered it.

Code for calculating these figures and results is available in my Google repository. Finally, I have repeated the analysis of high school electricity consumption clustering with PTD and found it gave nearly identical results to use of SNCD.

## “Unpleasant surprises in the greenhouse” (in memoriam, Professor Wallace Broecker)

These are excerpts from a 1987 paper by Professor Wallace Broecker, widely acknowledged to be one of the greatest climate scientists and oceanographers of the last century.

## One possible way to do small, modular nuclear power

Featured in Science Magazine today, NuScale Power, a spinout from Oregon State University, is planning simpler, smaller, safer, gang-lashable nuclear reactors, with a trial in the early 2020s. A schematic is shown below.

As I’ve noted here elsewhere, the reason why conventional nuclear reactor designs have a negative learning curve is that the industry did not turn nuclear reactors into commodities, taking advantage of large scale replication.
Despite the unhappiness some have with nuclear power, it is clear that a good solution to most of its ills, including cost and rollout time, would be a godsend for providing the massive amounts of electrical power we need to electrify the entire United States and the world. I continue to argue that those who oppose such developments on some kind of principle do not understand or appreciate the desperate situation with respect to climate change in which we have placed ourselves, and the soon-to-be-realized consequences.

## Legacy

It should be noted that exponential growth is a plank in the theoretical framework of modern Ecology. See L. Pásztor, Z. Botta-Dukát, G. Magyar, T. Czárán, G. Meszéna, Theory-Based Ecology: A Darwinian approach, 2016.

Dr Suzuki points out that, objectively, people are big animals, and the total biomass on Earth due to human beings is quite large. We are also large in terms of our demands upon the natural world, and, in fact, each one of us consumes many times more than the world’s natural carrying capacity for us. This is possible because of technology, and fossil fuels.

## Professor Kevin Anderson: “Climate’s holy trinity”

## 24th January 2019, Oxford, England, UK

Appalling failure. Who is responsible? Yeah, it’s us.

## On bag bans and sampling plans

Plastic bag bans are all the rage. It’s not the purpose of this post to take a position on the matter. Before you do, however, I’d recommend checking out this:

and especially this:

Good modern governance means having evidence-based decisions. So, if a bag ban of any kind, or a bag tax of any kind, is going to be imposed, it makes sense to assess how much and what kinds of use of bags are prevalent before the ban or tax, and how this changes after the ban or tax. This kind of thing used to need to be done with professional surveyors and statisticians.
But with the availability of online datasets, access to the experience of others, widely available and open-source computing, and new survey technology and methods, expensive professional options aren’t the only way this can be done. Professional surveyors tend to argue otherwise. But, facts are, you can learn a lot by using Google Earth and Google Maps these days.

Surveys are designed around answering specific questions. If the objective is to estimate how many bags of one kind or another are being consumed per week in a town or county, that’s one question. If the objective is to estimate how many people regularly choose paper over plastic, or bring their own bags, that’s an entirely different question. The governance and the group need to choose what’s important to them.

Surveys are also designed around the skillsets of the people involved in conducting them. With a volunteer organization, it is important that the procedure be something they can readily be trained in, and I say “trained” because no survey can do without training, however simple.

Surveys also ought to be easy on the surveyors, especially if they are volunteers. The requirements of when they need to be on station oughtn’t be so onerous that they might not arrive on time, or not show up at all, and, worse, misrepresent to the group what happened. So, for instance, even if there are shoppers using bags in a store at 6:00 a.m., it’s probably not going to get covered well if a sampling plan were to require it.

Surveys also ought to be easy to explain to those who want to know how they were done. Along with this, it is critically important that, as part of an analysis of the primary quantities of interest, like plastic bags used per week, the survey’s contribution to overall uncertainty is quantified.

All that noted, there is a lot interested and committed citizens can do to gather data like this, and interpret results.
This can be important whether or not a town or county governance considers its findings as inputs. It can serve as a check on their result. It can also serve as a check on their budget: why did they pay some expensive professional organization to do something when something good enough for the purpose could have been had much cheaper?

That said, any old surveying or sampling technique which appears to be good enough isn’t good enough. That is, there is some training and learning involved.

Returning to the bag ban matter, use of bags is key to the project. As with any policy, if a regulation is imposed and there is no evidence it helps, or it has untoward consequences, it ought to be revoked. To do that means measuring a baseline, and then measuring after the regulation is in place. It probably is a good idea to measure a couple of times after the regulation is in place. In statistics and engineering, this general approach is called A/B testing, which is explained better here.

As mentioned above, how one gets counts — they are nearly always counts — and then analyzes them depends very much on the question being asked. But, in the case of bags, there’s the really important question of where and when to sample. In this case, I’m setting aside bags given out in stores other than grocery stores. And, in this case, I’m using the example of my home town, Westwood, Massachusetts.

Counting noses or bags assumes there’s a sampling frame in hand. In Westwood’s case, the concern is the population of residents or visitors frequenting local grocery stores. And in Westwood’s case, there’s a desire to count people and their preferences for bags, whether plastic, paper, some mix, or whether they bring their own bags, as well as size of order. So this means counting people.

There are three grocery stores in Westwood: Roche Brothers in Islington, Wegmans at University Station, and Lambert’s.
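As a sketch of the A/B comparison mentioned above, applied to counts: assuming bag counts per surveyed hour behave roughly like Poisson counts, a two-sample rate test does the job. The counts and hours below are made up purely for illustration:

```r
# Hypothetical survey totals: 420 plastic bags over 10 surveyed hours before
# a ban, 130 bags over 10 surveyed hours after. A two-sample Poisson rate
# test asks whether the underlying rate genuinely changed.
before.count <- 420; before.hours <- 10
after.count  <- 130; after.hours  <- 10

ab <- poisson.test(c(before.count, after.count),
                   T = c(before.hours, after.hours))
ab$p.value    # very small here: strong evidence the rate dropped
ab$estimate   # estimated rate ratio, before relative to after
```

Real data would also need the survey-design uncertainty folded in, as discussed above; this test covers only the counting noise.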
There are convenience and other stores which sell small amounts of groceries, but these were assumed to show behaviors which would be exhibited by the populations of the three majors.

But, still, surveying these stores either demands deep cooperation on the part of their owners, an outrageous commitment on the part of volunteers, or a sampling scheme that is constructed with knowledge of who goes where and when. Where to find such a thing? Google. Most substantial grocery store entries on Google Maps now present a bar chart of when they are most frequently visited. It looks like this:

Now, I’ve discovered that that dashed line is a fixture marking a certain number of visits per hour. It is constant for a given store across days of the week. And it is at least roughly consistent across stores in an area. This is great.

This was helped, in part, by a visit to one of the stores by a volunteer to take data for a half hour. She was collecting data for me, and was also trying out a data collection form, seeing how difficult or easy it was to get the kind of data that was pertinent. Doing this is an excellent idea.

But, wait, you say: These aren’t numbers. It’s a bar chart.

Digitizing. I learned this when I took courses in Geology. An amazing amount of data is recovered by digitizing figures in scientific journals. Why not Google? There are several digitizing applications out there. I’ve tried a couple and, so far, I like WebPlotDigitizer best.

So, I did. Digitize, that is. How? Here I’ve marked, by hand, two points on the bar chart, attempting to ascertain the height of the dashed line in pixels from the baseline. Note the original images aren’t produced to the same resolution or size, so it’s important to calibrate each one. In the upper right you can see WebPlotDigitizer’s close-up of the place where the cursor is.
That’s a little hard to see, since there’s so much real estate there, but here’s a close-up of the bottom:

And here’s a close-up of the upper right, a close-up of a close-up:

The completed digitization looks like this:

and results in a .csv file which looks partly like:

There look to be extra points in the digitization, which I’ll explain. It is important to note that the code I reference later, which is available to the public, demands digitization be done in this style. That code has no other documentation. I don’t give a recipe. That said, it’s not difficult to figure out.

The first point I take is the baseline, not in any of the bars of the bar plot. The next point I take is on the dashed upper score. I then do two bars, taking the baseline of the bar and the upper horizontal of the bar. The rest of the bars have only their upper horizontal marked. The point is to get a good estimate of the baseline, obtained as an average of three baseline observations: the initial one outside of the bars, and then two from bars. There is one observation on the upper score, and then there are observations of the upper horizontals for the rest of the bars.

The heights of the bars can be estimated from the difference between the reading for their upper horizontals and the estimate of the baseline as the mean of three observations. These can be divided by the distance between the estimate of baseline and the upper score in order to calculate a portion of the range to upper score. Note that because these are pixel coordinates, the ordinate values of the observations higher up on the bar plot are lower in coordinates than, say, the baselines. This is because distances on the ordinate are measured (ultimately) as pixels from the top of the image. Accordingly, some distance calculations need to have their signs reversed.

The accompanying R code reads in these .csv files and then extracts the heights for each of the hours in a day.
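The pixel arithmetic just described can be sketched as follows; the coordinates are made-up stand-ins for an actual digitization file:

```r
# Hypothetical pixel y-coordinates from a digitized bar chart. Pixel y grows
# downward from the top of the image, so the baseline has the LARGEST y and
# bar heights come out with reversed sign.
baseline.obs <- c(480, 481, 479)   # three baseline readings
upper.score  <- 200                # the dashed reference line
bar.tops     <- c(430, 380, 300, 260, 310, 400)  # one reading per hour

baseline <- mean(baseline.obs)

# Heights in pixels (sign reversed), then expressed as a fraction of the
# baseline-to-upper-score range.
heights  <- baseline - bar.tops
fraction <- heights / (baseline - upper.score)
```

Each `fraction` is then directly comparable across images, since every image is calibrated against its own baseline and upper score.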
There is a system for the Westwood case, which you can understand by reading the code, where each of the separate stores’ files is assimilated into a single corresponding matrix of scores versus hour of day and day of week. In the end, what’s in hand is a matrix of values proportional to numbers of visits to the stores. Calibration and actual counts have indicated that a value of unity corresponds to about 140 visitors.

Now that traffic to stores is available, or at least something proportional to traffic, it is a matter of constructing a sampling plan. A plan which is proportional to traffic makes the most sense. This is equivalent to sampling time intervals where the probability of electing an interval is proportional to the estimated traffic in the interval.

For this study, the surveyors expressed a desire not to be surveying more than 60-90 minutes at a time. I settled for 60 minutes. So the question became one of finding a set of samples of individual hours for a store weighted by the probability of traffic. The melt.array function of the R reshape2 package was handy here, and I was able to use the sampling-without-replacement of the R built-in sample to achieve the appropriate election.

The volunteers had a strict constraint on the total number of times they wanted to visit stores. The code in the R file generateSamplingPlan.R produces several options, based upon the setting of the N.stage1 variable. They also did not want to survey before 9:00 a.m. or after 10:00 p.m. The result is a sampling plan which looks a little like this:

The code and data supporting this post are available in a repository. Note that it is live, and exists to support an ongoing project, so there is no promise of stability. Note, however, that it is subject to Google’s version control system.

So, what happens after the regulation or ordinance is adopted? What’s the sampling plan to find out how things are going? At first it seems that simply repeating the days and times would be best.
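The traffic-weighted election described above can be sketched like this, with a made-up traffic matrix standing in for the digitized store scores:

```r
library(reshape2)

# Hypothetical traffic scores: hours 9:00-21:00 by day of week.
set.seed(7)
traffic <- matrix(runif(13 * 7, 0.2, 1.5), nrow = 13, ncol = 7,
                  dimnames = list(hour = 9:21,
                                  day = c("Mon", "Tue", "Wed", "Thu",
                                          "Fri", "Sat", "Sun")))

# melt() flattens the matrix into one row per (hour, day) cell.
cells <- melt(traffic, value.name = "score")

# Elect N survey hours without replacement, with probability of election
# proportional to estimated traffic in the interval.
N <- 10
picked <- cells[sample(nrow(cells), N, prob = cells$score),
                c("hour", "day")]
```

Busy hours are thereby over-represented in the plan in proportion to how many shoppers they expose to the count.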
On the other hand, remember that the sampling plan was designed to expose data collection to as representative a set of people as could be had given the constraints on that sampling. So, in principle, it shouldn’t hurt at all to generate new sampling plans with the same constraints, ones which, invariably, will give other times and days. They are all vehicles for getting at what the population prefers.

## A lagomorph has an idea which might save the world

Eli, who offers a clever and consistent consumption-based accounting scheme.

## “Renewables are set to penetrate the global energy system more quickly than any fuel in history” (BP, 2019 Energy Outlook)

Selections from the BP Energy Outlook: 2019 edition:

In the ET scenario, the costs of wind and solar power continue to decline significantly, broadly in line with their past learning curves. To give a sense of the importance of technology gains in supporting renewables, if the speed of technological progress was twice as fast as assumed in the ET scenario, other things equal, this would increase the share of renewables in global power by around 7 percentage points by 2040 relative to the ET scenario, and reduce the level of CO2 emissions by around 2 Gt.

The impact of these faster technology gains is partly limited by the speed at which existing power stations are retired, especially in the OECD. If, in addition to faster technological gains, policies or taxes double the rate at which existing thermal power stations are retired relative to the ET scenario, the reduction in emissions is doubled. This suggests that technological progress without other policy intervention is unlikely to be sufficient to decarbonize the power sector over the Outlook. The ‘Lower carbon power’ scenario described below considers a package of policy measures aimed at substantially decarbonizing the global power sector.
The extent to which the global power sector decarbonizes over the next 20 years has an important bearing on the speed of transition to a lower-carbon energy system. In the ET scenario, the carbon intensity of the power sector declines by around 30% by 2040. The alternative ‘Lower-carbon power’ (LCP) scenario considers a more pronounced decarbonization of the power sector. This is achieved via a combination of policies. Most importantly, carbon prices are increased to $200 per tonne of CO2 in the OECD by 2040 and $100 in the non-OECD, compared with $35-50 in the OECD and China (and lower elsewhere) in the ET scenario.

Carbon prices in the LCP scenario are raised only gradually to avoid premature scrapping of productive assets.
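Two quantitative ideas in the excerpt above, learning-curve cost declines and a gradually ramped carbon price, can be made concrete with a toy sketch. The 20% learning rate, the 2025 start year, and the linear ramp below are illustrative assumptions, not BP’s model; only the $35 and $200 per tonne endpoints come from the excerpt:

```python
import math

def wright_cost(c0, cum0, cum, learning_rate):
    """Wright's-law cost curve: each doubling of cumulative deployed capacity
    cuts unit cost by `learning_rate` (e.g. 0.20 = 20% cheaper per doubling)."""
    b = -math.log2(1.0 - learning_rate)
    return c0 * (cum / cum0) ** (-b)

def carbon_price(year, start_year=2025, end_year=2040, p_start=35.0, p_end=200.0):
    """Linear ramp from a ~$35/tCO2 ET-scenario level to the $200/tCO2 LCP
    level by 2040: 'raised only gradually', clamped outside the ramp."""
    frac = min(max((year - start_year) / (end_year - start_year), 0.0), 1.0)
    return p_start + frac * (p_end - p_start)

# Example: at a 20% learning rate, quadrupling cumulative capacity (two
# doublings) brings unit cost to 0.8 ** 2 = 64% of its starting level.
cost_after_quadrupling = wright_cost(100.0, 1.0, 4.0, 0.20)
```

The point of the sketch is only the shapes: costs fall as a power law of cumulative deployment, while the policy lever rises linearly so that existing plants are not prematurely scrapped.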

There is one gloomy projection. Despite the progress on the world scene,

The share of renewables in the US fuel mix grows from 6% today to 18% by 2040.

If that were to come true, in the context of these other changes, it is possible the United States would be regarded as a pariah state and have economic sanctions imposed upon it. But … these projections have several built-in assumptions. Recall, BP is a bit like the U.S. Energy Information Administration (EIA) and the International Energy Agency (IEA) in that they are established bureaucracies of forecasters. Both the EIA and the IEA have systematically underestimated the acceleration in solar and wind adoption over the last decade.

Also, it is telling that BP attributes the slowness with which wind and solar displace fossil fuel generation to the capital costs of retiring existing generation and replacing it. There are two points here.

First, the incremental capital costs for substituting solar+wind+storage for the same unit of fossil fuel energy are much smaller, as long as the accounting is done correctly. In particular, the costs to society are not just the generating plant, but the capital infrastructure needed to mine and bring the fuel to the point of combustion. There are also tremendous losses, plainly visible on Sankey diagrams, associated with Carnot-cycle energy production. (See also.) That’s wasted money.
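A back-of-envelope illustration of those losses, using an assumed round-number thermal efficiency of 35% (actual plants vary, and this ignores transmission losses on both sides):

```python
# A typical fossil thermal plant converts roughly a third of the fuel's
# primary energy into electricity; the remainder exits as waste heat,
# the fat grey arrows on a Sankey diagram.
def delivered_electricity(primary_mwh_thermal, plant_efficiency=0.35):
    """Electric energy delivered per unit of primary fuel energy burned."""
    return primary_mwh_thermal * plant_efficiency

primary = 100.0                            # MWh of fuel energy mined and shipped
electric = delivered_electricity(primary)  # ~35 MWh of electricity out
wasted = primary - electric                # ~65 MWh rejected as heat
```

Wind and solar have no combustion step, so both the fuel-supply capital (mines, pipelines, trains) and this waste-heat stream drop out of the accounting entirely.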

Second, the BP analysis clearly assumes the market and business structure for providing such energy remains intact. That assumption is big, one akin to assuming there will always be a Sears and always be a Kodak. If, in fact, there are energy sources available at much lower costs per kWh or BTU, the market isn’t going to care about the sunk costs of existing players. It will go around them, and they will either seek government subsidies to remain intact, or economically die.

So, the “pariah state” outcome for the United States is too gloomy. I, instead, see a United States whose economic productivity might be increasingly assaulted by challenges from climate change: impacts to personal wealth and, so, an unwillingness to consume at rates comparable to before; direct damage to productive capacity, including extensive damage to supply chains within the country and to the basic infrastructure that permits people to get to their jobs; and rising costs of insurance and of doing business. But I also see a hunger for cheaper everything, especially energy, and a thriving market willing to supply that with wind, solar, and storage, widely distributed, overcoming zoning and other objections, both because many people will have abandoned suburbs over affordability and proximity to work, and because the cost gap is so huge: energy from zero Carbon sources at a tenth of the comparable cost from fossil fuel sources.

It’s one thing to be a zealot for fossil fuels. It’s something else to pass up paying only 10% of the cost of something in order to sustain that zealotry.

## Tit-For-Tat in Repeated Prisoner’s Dilemma: President Donald Trump creates the Green New Deal

Jonathan Zasloff at Legal Planet offers “Donald Trump creates the Green New Deal”. The closing excerpt:

But what goes around comes around. A President Harris, or Warren, or Booker, etc. etc. can just as easily declare a National Emergency on Climate Change — one that would have a far better factual predicate than Trump’s patently false border emergency — and he or she will have a lot more money to move around. After all, a lot of the climate crisis is about infrastructure, and if the relevant statute allows the President to move money from one project to another, then it is very easy to do that. Or the $100 billion that DOD has for national security emergencies: given that both the Pentagon and the heads of the national intelligence agencies have already said that climate represents a serious national security challenge, it’s not a hard legal lift (assuming intellectually honest and consistent judges, which of course we cannot assume). This fund must be for a military purpose, and a smarter, more energy efficient energy grid could do the trick.

It’s no way to run a democracy. But Trump and the GOP have made it clear that they do not believe in democracy, and as Robert Axelrod demonstrated years ago in his classic book The Evolution of Cooperation, the best strategy in repeat-player games to facilitate cooperation is playing Tit-For-Tat.
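For readers who have not met Axelrod’s tournaments, the strategy is easy to state in code. The payoff matrix below is the standard textbook one, and this small sketch is illustrative rather than a reproduction of Axelrod’s actual setup:

```python
# A minimal iterated Prisoner's Dilemma. Payoffs use the canonical values
# (T, R, P, S) = (5, 3, 1, 0); "C" = cooperate, "D" = defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first move; thereafter copy the opponent's last move."""
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Run a repeated game, returning the two cumulative scores."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b
```

In this toy run, Tit-For-Tat loses only the opening round to a pure defector (9 points to the defector’s 14 over ten rounds), while two Tit-For-Tat players cooperate throughout and collect 30 apiece; that reciprocity, retaliating once and forgiving immediately, is the sense in which the strategy “facilitates cooperation.”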

### Update, 2019-02-18

Dan Farber writes on “National Security, Climate Change, and Emergency Declarations” at Legal Planet that:

If the Supreme Court upholds Trump, it will have to uphold an emergency declaration for climate change.

One reason why it would be hard for the Supreme Court to overturn a climate change declaration is that some attributes of climate change and immigration are similar. Both issues involve the country’s relations with the outside world, an area where presidential powers are strong. But it isn’t as if we suddenly found out about border crossings or climate change. Given these similarities, it would be very difficult for the conservative majority to explain why it was deferring to the President in one case but not the other.

The only major difference actually cuts strongly in favor of an emergency declaration for climate change: The U.S. government has already classified climate change as a serious threat to national security, and it is a threat that is getting stronger daily. Recent science indicates that climate action is even more urgent than we thought.

Trump’s stated justification in his proclamation is that “the problem of large-scale unlawful migration through the southern border is long-standing, and despite the executive branch’s exercise of existing statutory authorities, the situation has worsened in certain respects in recent years.” Climate change, too, is a “longstanding problem,” and it certainly has gotten worse despite the effort of the executive branch (Obama) to address the problem. Federal agencies, as well as Congress, have made it clear that climate is a serious threat to our nation.

## “What’s new with recycling”

spoke in Norwell, at the South Shore Natural Science Center, a couple of weeks ago:

## “Is the Green New Deal’s ambition smart policy?”

Ann Carlson is the Shirley Shapiro Professor of Environmental Law and the co-Faculty Director of the Emmett Institute on Climate Change and the Environment at UCLA School of Law. Writing at Legal Planet, she takes on assessing the Green New Deal, admitting she is “conflicted about a proposal that seems untethered to what is actually achievable.” She begins:

At the heart of the Green New Deal — which demands slashing U.S. carbon emissions by 2030 by shifting to 100 percent clean energy — is a major conundrum. Even the most enthusiastic proponents of ambitious climate policy don’t believe the goals are achievable, technologically let alone politically. Stanford Professor Mark Z. Jacobson, for example, among the most ardent advocates for decarbonizing the electricity grid completely, believes that we can achieve 100 percent renewable energy by 2050, three decades after the Green New Deal’s target date. Ernie Moniz, the former Secretary of Energy under President Obama, laments that he “cannot see how we could possibly go to zero carbon in a 10-year time frame.” A number of columnists have noted that the Green New Deal will never become law because of its expense, its political impracticability and its technological infeasibility. And yet, the Green New Deal has attracted huge public support, the endorsement of all of the 2020 Democratic candidates for President, and a large number of Senators and members of Congress. It promises to mobilize a generation of young activists to work to solve the existential crisis of their lives.

Read on. She’s more optimistic than it sounds, although, I think Professor Carlson is realistic.

I remarked in a comment:

I wish the GND proponents well, too, although I worry about a couple of things.

First, the comparison with other environmental programs, while inspiring, is a little inappropriate. There has never been a problem of this scale, nor one whose amplification is so thoroughly integrated with the daily comforts of affluent humans. Fossil fuels do have high energy densities, and that can be convenient.

Also, related to this, benefits do not accrue if we simply cease emitting. We have a timetable, but Nature will not scrub the harmful materials on any reasonable human timetable, and conditions at the moment we succeed at achieving zero emissions will persist for centuries. The alternative, artificial removal of atmospheric CO2, is both horrifically expensive (multiples of the 2014 Gross World Product at present prices) and pursuit of the technology has been explicitly rejected by GND proponents. (They’ve ruled out advanced nuclear technologies, too.)
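The “multiples of Gross World Product” claim is back-of-envelope arithmetic. The figures below (500 GtCO2 to remove, US$600 per tonne for direct air capture, roughly US$78 trillion for 2014 GWP) are assumed round numbers for illustration only, not quantities from the post:

```python
# Rough arithmetic behind "multiples of Gross World Product". Both the
# cumulative excess CO2 to draw back down and the per-tonne removal cost
# are highly uncertain; these are assumed round figures.
GWP_2014 = 78e12           # ~US$78 trillion, a commonly cited 2014 figure
excess_co2_tonnes = 500e9  # assumed: ~500 GtCO2 to remove
cost_per_tonne = 600.0     # assumed: high-end direct-air-capture cost, US$/tCO2

total_cost = excess_co2_tonnes * cost_per_tonne  # US$300 trillion
multiples_of_gwp = total_cost / GWP_2014         # roughly 3.8x one year of GWP
```

Even at an optimistic US$100 per tonne, the same removal still runs about two-thirds of a full year’s GWP, so the order of magnitude of the conclusion is not very sensitive to the cost assumption.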

Second, without policy which is “tethered to what is actually achievable”, GND suggests the bar is lower than it actually is and could, in itself, both present a moral hazard and make people think climate change is not being mitigated purely for reasons of politics and greed. (This is in bounds because the rejection of negative emissions technology is done because it, too, could be a kind of moral hazard.) Sure, those are involved, but it is also true people don’t like the things that a GND-style solution, or a Professor Mark Z Jacobson solution entail. In my opinion, their choice is silly, but people are people.

Third, aspirational, engineering-free solutions to big, big problems are likely to founder, because they won’t assess and contain their own complications, particularly if they are rushed. An uncoordinated rollout of zero Carbon energy won’t only trash pieces of the grid, which will have repercussions for the less well off and people of color, but could also exacerbate climate conditions and regional weather. Large scale plantings, for example, of Jatropha curcas, thought to be a way of doing rapid CO2 drawdown and producing biodiesel oils, could change albedo in the wrong direction for the arid regions it loves and, indeed, could do itself in if the same regions transform into tropics. Uncoordinated rollouts of wind farms will affect weather system energies. That’s no reason not to do it, but it needs to be studied and thought through.

Fourth, there is (still) a substantial education component needed, one done in a manner that evades the impression climate change-fixing proponents are pulling their punches. For if byproducts of climate change are severe enough to move people into action, and gets them to accept sacrifices needed to do so, then they probably will expect to see improvements once these changes are made. The science says that expectation is unreasonable, because of the inertia of the climate system and because the human emissions impact is a perturbation on a geological scale in a geological moment. The political ramifications of this realization are both difficult to assess but could be damaging to the long term health of the collective project.

I did not mention other things, such as the intrinsic greenhouse gas emissions from agriculture, even if planting, harvesting, fertilization, transport, and processing are all decarbonized. Cement production is a big piece of emissions, too. The troubling thing is that GND doesn’t mention these: It focuses almost exclusively upon energy.

## From the YEARS Project: How Climate Impacts Mental Health (#climatefacts)

Also the magnificent “We should never have called it Earth“, also from Dr Marvel.

In “Hope, despair and transformation: Climate change and the promotion of mental health and wellbeing”, Fritze, Blashki, Burke, and Wiseman [International Journal of Mental Health Systems, 2008, 2(13)] note in a section titled “Emotional distress arising from awareness of climate change as global environmental threat”:

The question that McKibben raises is how psychologically, emotionally and politically should we as human beings respond to this fundamental change in the relationship between the human species and the world we inhabit?
.
.
.
For many people, the resulting emotions are commonly distress and anxiety. People may feel scared, sad, depressed, numb, helpless and hopeless, frustrated or angry. Sometimes, if the information is too unsettling, and the solutions seem too difficult, people can cope by minimising or denying that there is a problem, or avoiding thinking about the problems. They may become desensitised, resigned, cynical, skeptical or fed up with the topic. The caution expressed by climate change skeptics could be a form of denial, where it involves minimising the weight of scientific evidence/consensus on the subject. Alternatively, it could indicate that they perceive the risks of change to be greater than the risks of not changing, for themselves or their interests …
.
.
.
Notwithstanding the enormity of the climate change challenge, we know what many of the solutions are, and there are many actions that citizens can take individually and collectively to make a difference at household, local, national and global level. When people have something to do to solve a problem, they are better able to move from despair and hopelessness to a sense of empowerment.

Blashki, et al include a table from the Australian Psychological Society about how individuals can respond to the stress of being aware of climate change and its impacts:

Finally, there is the tongue-in-cheek yet serious work by Nye and Schwarzenegger:

## Status of Solar PV in Massachusetts

At Solar Power Northeast, the DOER of Massachusetts noted that with the mandated 400 MW of qualified projects program review upcoming, and heavy volume deployed in National Grid territory, there is strong consideration to expand and evolve the SMART program.

## “Applications of Deep Learning to ocean data inference and subgrid parameterization”

This is another nail in the coffin of the claim I heard at last year’s Lorenz-Charney Symposium at MIT that machine learning methods would not make a serious contribution to advancements in the geophysical sciences.

Claire and I do.