## 20 July 1969

I worked at IBM Federal Systems Division in Owego, NY, from May 1976 to March 1994. I did a lot of work with the IBM System/4 Pi and its operating and support software.

## Shifting to a Sustainable Future (Professor Steven Chu)

A lecture at MIT in 2018, given as the Hoyt C. Hottel Lecture in Chemical Engineering.

Notable quote: “The half-life of CO2 in atmosphere is 10,000 years.” (Professor Steven Chu)

## Unpacking and Packing (WHOI)

### “What does it take to unpack and repack R/V Neil Armstrong?”

That’s the R/V Neil Armstrong operated by Woods Hole Oceanographic Institution out of Woods Hole, MA.

A bit appropriate as the 50th anniversary of Moon Day approaches.

## In case you wondered if Carbon Dioxide increases caused climate change, here’s the latest news

In case you wondered if Carbon Dioxide (also called, carbonic acid, CO2) increases caused climate change, here’s the latest news … from 1856-1896:

## Solar plus storage is now cheaper than any non-solar electrical power

And, from that Lefty Socialist rag, Forbes.

## Natural Gas Companies are Doing Themselves In

As I wrote before, there will be no Golden Age of Natural Gas. The climate periodical DeSmog Blog now reports on its outright dissolution.

But the surprise is that natural gas miners, particularly shale gas miners, are apparently destroying themselves. This comes from an article at DeSmog Blog recounting a talk by Steve Schlotterbeck, a former shale gas CEO. The following quotes the DeSmog Blog article heavily, but I have added material found in the public domain from Schlotterbeck himself and from gas and oil industry trades.

Steve Schlotterbeck, who led drilling company EQT as it expanded to become the nation’s largest producer of natural gas in 2017, arrived at a petrochemical industry conference in Pittsburgh Friday morning with a blunt message about shale gas drilling and fracking.

Actually, this was at the Northeast Petrochemical Exhibition and Conference held on 20th June 2019. Mr Schlotterbeck’s full slide deck is available.

“The shale gas revolution has frankly been an unmitigated disaster for any buy-and-hold investor in the shale gas industry with very few limited exceptions,” Schlotterbeck, who left the helm of EQT last year, continued. “In fact, I’m not aware of another case of a disruptive technological change that has done so much harm to the industry that created the change.”

“While hundreds of billions of dollars of benefits have accrued to hundreds of millions of people, the amount of shareholder value destruction registers in the hundreds of billions of dollars,” he said. “The industry is self-destructive.”

Schlotterbeck is not the first industry insider to ring alarm bells about the shale industry’s record of producing vast amounts of gas while burning through far more cash than it can earn by selling that gas. And drillers’ own numbers speak for themselves. Reported spending outweighed income for a group of 29 large public shale gas companies by $6.7 billion in 2018, bringing the group’s 2010 to 2018 cash flow to a total of negative $181 billion, according to a March 2019 report by the Institute for Energy Economics and Financial Analysis.

But Schlotterbeck’s remarks, delivered to petrochemical and gas industry executives at the David L. Lawrence Convention Center in Pittsburgh, come from an individual uniquely positioned to understand how major Marcellus drillers make financial decisions — because he so recently ran a major shale gas drilling firm. Schlotterbeck now serves as a member of the board of directors at the Energy Innovation Center Institute, a nonprofit that offers energy industry training programs.

His warnings on Friday were also offered in unusually stark terms.

“The technological advancements developed by the industry have been the weapon of its own suicide,” Schlotterbeck added, referring to the financial impacts of shale gas drilling on shale gas drillers. “And unfortunately, the industry still has not fully realized how it’s killing itself. Since 2015, there’s been 172 E&P company bankruptcies involving nearly a hundred billion dollars of debt.”

“In a little more than a decade, most of these companies just destroyed a very large percentage of their companies’ value that they had at the beginning of the shale revolution,” he said. “It’s frankly hard to imagine the scope of the value destruction that has occurred. And it continues.”

At the Friday conference, he displayed a slide showing the stock prices of eight major Marcellus shale gas drillers: Antero, Range Resources, Cabot Oil and Gas, Southwestern Energy, CNX Gas, Gulfport, Chesapeake Energy, and EQT, the company that Schlotterbeck ran until he resigned in March 2018. Seven of the eight companies saw their stock prices fall between 40 percent and 95 percent since 2008, the slide showed.

“Excluding capital, the big eight basin producers have destroyed on average 80 percent of the value of their companies since the beginning of the shale revolution,” Schlotterbeck said. “This is not the fall from the peak price during the shale decade, this is the drop in their share price from before the shale revolution began.”

Mr. Schlotterbeck credited the shale rush with lowering power and natural gas bills nationwide and offering significant economic benefits since 2008, when he said the shale revolution began.

“Nearly every American has benefited from shale gas, with one big exception,” he said, “the shale gas investors.”

Residents of communities where shale gas drilling and fracking have caused disruptions and health issues might take exception to Mr. Schlotterbeck’s categorical description of the beneficiaries of shale gas, as might climate scientists who have warned that the shale industry’s greenhouse gas emissions are so severe that burning gas for power may be worse for the global climate than burning coal.

Only Cabot Oil and Gas, which owns the rights to drill gas from roughly 174,000 acres, mostly in one county in the northeastern corner of Pennsylvania, saw its stock price rise since 2008, according to Schlotterbeck’s presentation.

Cabot remains at the center of disputes tied to water contamination, a gas well blow-out, and other problems in Dimock, PA. One major lawsuit in that dispute was filed against Cabot back in November 2009 and legal battles have continued since. The company has denied liability and settled on undisclosed terms with landowners along Carter Road in Dimock.

Schlotterbeck made no mention of Dimock, focusing his remarks on the economic decisions made by the shale gas industry’s corporate management and boards of directors — not just in the past, but also in the present.

“The fact is that every time they put the drill bit to the ground, they erode the value of the billions of dollars of previous investments they have made,” he said. “It’s frankly no wonder that their equity valuations continue to fall dramatically.”

Can’t happen soon enough in my book, even if they are hoping for an upside. I can’t see how that will happen, with increasing prices. Of course, they could (try to) get a subsidy from the present federal government, but it would have to be via another so-called emergency declaration from 45.

There was also coverage by oil and gas industry periodicals. One or two are claiming the overall talk was positive, emphasizing Cabot’s performance. For example,

There’s another parse: The shale gas drillers understand there may be a day soon when they cannot afford to extract gas, even with the economies of fracking, economies achieved in part by imposing environmental costs upon local communities’ water supplies and by structural damage to homes and buildings. So they are in a panic, spending investors’ capital in the hope of recovering as much of the explosive methane as possible while they can still do something with it.

On the other hand, Schlotterbeck has continued to acknowledge the pipeline situation, particularly in the eastern United States, is a big headache. That’s terrific.

## “… [A] new scientific paper overstates forests’ potential” (Reynolds)

##### (On 2019-07-06, repaired a typo, and on 2019-07-16 linked in a post by Professor Stefan Rahmstorf at RealClimate.)

Jesse Reynolds at Legal Planet is on this.

But, as I noted at LinkedIn, even if I accept the well-meaning paper by Bastin, et al in its entirety, it admits that planting 500 billion trees, as they propose, will only solve 25% of the atmospheric CO2 problem. Actually, I believe they miscalculated that, but we’ll get to seeing how in a moment.

Let’s say the half trillion trees are planted, emissions of CO2 and of its precursors, like CH4, from human sources are completely stopped (setting aside the challenge of how to get agriculture to stop emitting, too), and deforestation is halted. Atmospheric CO2 is now about 414 ppm. The preindustrial baseline was 288 ppm. That means 126 ppm more CO2 is in atmosphere than pre-industrial. We probably don’t need to get back to 288 ppm; 350 ppm will do. So that means we’re 64 ppm over, or so it seems. Atmospheric CO2 is increasing by about 2 ppm per year.

Bastin, et al estimate the half trillion trees will take out 200 GtC at maturity. 1 GtCO2 is 0.127 ppm. So 200 GtC is a bit more than 25 ppm. That’s 39% of 64 ppm or 20% of 126 ppm.

Setting aside that this won’t happen overnight, or what the associated emissions of planting 500 billion trees are, this has another problem, alluded to above. Atmosphere only retains about 40% of total human emissions. That means 60% of human emissions already go either into soils or into the oceans. (In the long run, CO2 in oceans will turn into carbonates, but this is a very slow process.)

###### (Graphic courtesy of NASA. Click on it if you want to see a larger version in a separate window.)

Most importantly, oceans and soils are in equilibrium with atmosphere. This means that if a ppm of CO2 is drawn from atmosphere, the partial pressure of CO2 in atmosphere will be lowered, and the entire climate system will come to a new equilibrium, drawing CO2 back out of soils and oceans. In the end, the total amount of CO2 to extract isn’t 126 ppm or 64 ppm, but 126/0.4 ppm or 64/0.4 ppm. These are 315 ppm and 160 ppm, respectively. Also, as Dr Steven Chu has pointed out, we’re not really at 414 ppm CO2 but, after considering the methane (CH4) in atmosphere, other hydrocarbon greenhouse gases which break down into CO2, and other greenhouse gases like N2O, at 500 ppm CO2e. He also indicates we’re probably going to get to at least 600 ppm CO2e.

What does that mean? It means, even accepting Bastin et al uncritically, that their 500 billion trees will do 7% of the job of going from 500 ppm to 350 ppm CO2e (and that’s generous, because they do little about, say, N2O), and 4% of the job of going from 600 ppm to 350 ppm CO2e.
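That arithmetic can be checked in a few lines, using the post’s own figures (a 25 ppm drawdown from the trees, a 40% airborne fraction, and a 350 ppm target); this is a sanity check of the quoted numbers, not an independent estimate:

```python
# Back-of-envelope check of the tree-planting arithmetic, using the
# figures quoted in the text (all values approximate).
AIRBORNE_FRACTION = 0.4   # share of emissions that stays in atmosphere
TARGET_PPM = 350          # aspirational target concentration
TREES_PPM = 25            # CO2 drawn down by 500 billion trees, per the post

def effective_drawdown_needed(current_ppm):
    """Total drawdown needed, inflated by ocean/soil re-equilibration."""
    return (current_ppm - TARGET_PPM) / AIRBORNE_FRACTION

for current in (500, 600):  # CO2e levels cited from Dr Chu
    needed = effective_drawdown_needed(current)
    share = TREES_PPM / needed
    print(f"{current} ppm CO2e -> need {needed:.0f} ppm; trees cover {share:.0%}")
```

Running it reproduces the 7% and 4% figures above.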

This is why Reynolds and I and others say the article is misleading. I also claim Project Drawdown is highly misleading. These are at least wishful environmentalism if not greenwashing. If a political movement hangs its hat on the proposal, it is greenwashing.

The limitations of planting forests for this purpose are well known. For instance,

L. Nave, G. M. Domke, K. L. Hofmeister, U. Mishra, C. H. Perry, B. F. Walters, C. W. Swanston, “Reforestation can sequester two petagrams of carbon in US topsoils in a century”, PNAS, 2018.

A petagram of carbon is a single GtC. (That’s because 1 Gt = 10^9 tonnes = 10^9 × 1000 kg = 10^15 grams = 1 petagram.) So, it’s not fast, either.
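A quick check of the unit chain, and of the pace implied by Nave, et al (their 2 PgC figure is for US topsoils only, so the pace comparison is purely illustrative):

```python
# Unit check: 1 Gt = 1e9 tonnes, 1 tonne = 1e6 g, so 1 Gt = 1e15 g = 1 Pg.
grams_per_gt = 1_000_000_000 * 1_000_000
assert grams_per_gt == 10**15  # i.e., 1 Gt is exactly one petagram

# Pace check: Nave et al estimate about 2 PgC sequestered in US topsoils
# per century. At that rate alone, the 200 GtC in Bastin et al would take
# 200 GtC / (2 GtC per century) = 100 centuries.
print(200 / 2, "centuries")
```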

I’ve written about what it would take to really reduce CO2 in atmosphere at two blog posts:

and I have written about the problem of convincing greenhouse gases to remain in soils even if they are put there through afforestation.

In short, (1) it’s far better and cheaper not to put the emissions up there in the first place, and (2) if we do, we’d better be prepared to live with the consequences, because CO2 in atmosphere is very long-lived. I think, too, in our personal lives, we need to look at what really contributes to global emissions: Consumption and its upstream emissions account for a lot! So do McMansions and the expansion of suburbs.

## “‘Why hasn’t anyone told us of this before?‘”

### 2019-07-16, update

Professor Stefan Rahmstorf has posted a piece on this matter at RealClimate.

## Letter to the MIT community: Immigration is a kind of oxygen

(The following email was sent today to the MIT community by President L. Rafael Reif.)

To the members of the MIT community,

MIT has flourished, like the United States itself, because it has been a magnet for the world’s finest talent, a global laboratory where people from every culture and background inspire each other and invent the future, together.

Today, I feel compelled to share my dismay about some circumstances painfully relevant to our fellow MIT community members of Chinese descent. And I believe that because we treasure them as friends and colleagues, their situation and its larger national context should concern us all.

The situation

As the US and China have struggled with rising tensions, the US government has raised serious concerns about incidents of alleged academic espionage conducted by individuals through what is widely understood as a systematic effort of the Chinese government to acquire high-tech IP.

As head of an institute that includes MIT Lincoln Laboratory, I could not take national security more seriously. I am well aware of the risks of academic espionage, and MIT has established prudent policies to protect against such breaches.

But in managing these risks, we must take great care not to create a toxic atmosphere of unfounded suspicion and fear. Looking at cases across the nation, small numbers of researchers of Chinese background may indeed have acted in bad faith, but they are the exception and very far from the rule. Yet faculty members, post-docs, research staff and students tell me that, in their dealings with government agencies, they now feel unfairly scrutinized, stigmatized and on edge – because of their Chinese ethnicity alone.

Nothing could be further from – or more corrosive to – our community’s collaborative strength and open-hearted ideals. To hear such reports from Chinese and Chinese-American colleagues is heartbreaking. As scholars, teachers, mentors, inventors and entrepreneurs, they have been not only exemplary members of our community but exceptional contributors to American society. I am deeply troubled that they feel themselves repaid with generalized mistrust and disrespect.

The signal to the world

For those of us who know firsthand the immense value of MIT’s global community and of the free flow of scientific ideas, it is important to understand the distress of these colleagues as part of an increasingly loud signal the US is sending to the world.

Protracted visa delays. Harsh rhetoric against most immigrants and a range of other groups, because of religion, race, ethnicity or national origin. Together, such actions and policies have turned the volume all the way up on the message that the US is closing the door – that we no longer seek to be a magnet for the world’s most driven and creative individuals. I believe this message is not consistent with how America has succeeded. I am certain it is not how the Institute has succeeded. And we should expect it to have serious long-term costs for the nation and for MIT.

For the record, let me say with warmth and enthusiasm to every member of MIT’s intensely global community: We are glad, proud and fortunate to have you with us! To our alumni around the world: We remain one community, united by our shared values and ideals! And to all the rising talent out there: If you are passionate about making a better world, and if you dream of joining our community, we welcome your creativity, we welcome your unstoppable energy and aspiration – and we hope you can find a way to join us.

* * *

In May, the world lost a brilliant creative force: architect I.M. Pei, MIT Class of 1940. Raised in Shanghai and Hong Kong, he came to the United States at 17 to seek an education. He left a legacy of iconic buildings from Boston to Paris and China to Washington, DC, as well as on our own campus. By his own account, he consciously stayed alive to his Chinese roots all his life. Yet, when he died at the age of 102, the Boston Globe described him as “the most prominent American architect of his generation.”

Thanks to the inspired American system that also made room for me as an immigrant, all of those facts can be true at the same time.

As I have discovered through 40 years in academia, the hidden strength of a university is that every fall, it is refreshed by a new tide of students. I am equally convinced that part of the genius of America is that it is continually refreshed by immigration – by the passionate energy, audacity, ingenuity and drive of people hungry for a better life.

There is certainly room for a wide range of serious positions on the actions necessary to ensure our national security and to manage and improve our nation’s immigration system. But above the noise of the current moment, the signal I believe we should be sending, loud and clear, is that the story of American immigration is essential to understanding how the US became, and remains, optimistic, open-minded, innovative and prosperous – a story of never-ending renewal.

In a nation like ours, immigration is a kind of oxygen, each fresh wave re-energizing the body as a whole. As a society, when we offer immigrants the gift of opportunity, we receive in return vital fuel for our shared future. I trust that this wisdom will always guide us in the life and work of MIT. And I hope it can continue to guide our nation.

Sincerely,

L. Rafael Reif


## “Strong First Quarter Growth and Fracked Gas Takes a Hit”

###### (To see a larger figure, click on the image to open it in a new browser window. This is from ILSR’s “Of New Power Generation, How Much is on the Roof? Quarterly Update — 2019 Q1”.)

It was once warned, and now it’s coming true: The unsubsidized cost of utility scale solar is now cheaper than that of any fossil fuel, including fracked gas. What these cost curves do not show is the closing gap between residential PV generation and the cost of electrical transmission from a utility. Residential PV is still about twice as expensive, but eventually residential PV, especially if incentivized, will match or beat the cost of transmission. That is expected to occur by 2025. When it does, there is no reason at all to purchase that portion of electricity from a utility. Whether storage is used or not, whether net metering is available or not, whatever portion of a home’s electrical consumption comes from solar PV is a clear win.
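The crossover logic can be illustrated with a toy calculation; the starting costs and annual decline rate below are hypothetical placeholders chosen for illustration, not ILSR figures:

```python
# Toy crossover estimate: residential PV cost declining each year versus a
# flat cost of transmission. All numbers are illustrative placeholders.
pv_cost = 0.20          # $/kWh, hypothetical residential PV cost in 2019
transmission = 0.10     # $/kWh, hypothetical utility transmission cost
decline = 0.12          # hypothetical annual fractional decline in PV cost

year = 2019
while pv_cost > transmission:
    pv_cost *= (1 - decline)
    year += 1
print(year)  # first year PV beats transmission under these assumptions
```

Under these particular placeholders the crossover lands in 2025; the point is only that a steady cost decline against a flat transmission cost produces a crossover within a few years.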

Utility scale solar is cheaper than any fossil fuel, including fracked gas.

## “Ten Fatal Flaws in Data Analysis” (Charles Kufs)

Professor Kufs has a fun book, Stats with Cats, and a blog. He also has a blog post titled “Ten Fatal Flaws in Data Analysis” which, in general, I like. But the presentation has some shortcomings, too, which I note below.

1. **Where’s the Beef?** Essentially, there’s no analysis. There’s a data summary.
2. **Phantom Populations** Samples need to represent the population of interest. There has to be a population of interest. Its members need to have something specific in common which, if absent, has a good chance of affecting an outcome.
3. **Wow, Sham Samples** The population is real, but the samples don’t represent it well, or at all. But be careful here! I don’t think Kufs emphasizes this enough: A sample need not contain observations of population groups in the same proportions with which they occur in the population. Professor Yves Tillé makes this point strongly in his book, Sampling Algorithms.
4. **Enough is Enough** No confidence and no statistical power with too few observations, and no meaning with too many. I’d add, underscoring something Kufs says: be sure each category has a fair number of observations as well.
5. **Indulging Variance** Unless variance is assessed and reported, an analysis is neither statistical nor scientific. Means mean muck without accompanying reports of variability. There’s a lot to appreciate about variability. Properly assessed, variability can be an important tool, as it is capable of separating out subpopulations which otherwise have common means. Ignoring heteroscedasticity can break many a standard analytical tool. The place to start dealing with variance when studying a population is the sampling plan. There’s often a tradeoff between low variance and low bias. Merely calculating and reporting a standard deviation at the end of an analysis is almost always insufficient treatment.
6. **Madness to the Methods** Data inspection, cleaning, correcting, and testing for the assumptions of modeling are the unglamorous, tedious, and time-consuming parts of any statistical analysis or data science project. They are also a portion which is hard to defend since, in industry, management is sometimes impatient to see results coming from an allocation of expensive people and resources. But these steps are totally necessary. Without the last step, testing, you cannot know if the data are sufficiently clean or representative.
7. **Torrents of Tests** Kufs’ treatment of the multiple testing problem is old (Bonferroni), whether addressed from a quasi-Frequentist perspective or the more modern Bayesian one. There are now techniques better suited to large numbers of tests, such as controlling the false discovery rate rather than the family-wise error rate. Bayesian methods don’t have a problem with multiple comparisons. That’s one reason why I use them (a lot).
8. **Significant Insignificance and Insignificant Significance** Ah, significance tests! I could go on and on about these, and often have. The kindest thing to say here is a quote from Jerome Cornfield (1976): “The most general and concise way of saying all this is that $p$-values depend on both the $x$ observed and on the other possible values of $x$ that might have been observed but [were not], i.e., the sample space, while the likelihood ratio depends only on the observed $x$.”
9. **Extrapolation Intoxication** This is a caution against making the same mistake NASA and its subcontractors did when making the fateful decision to launch the Space Shuttle Challenger after it was exposed to freezing temperatures.
10. **Misdirected Models** Models are critical for understanding data, whether they are derived from domain knowledge, like physical theory, or not. There are many ways the theories or hypotheses upon which models are based can themselves be wrong. The most important criterion a theory or hypothesis must satisfy is falsifiability. Extending that, every model must have diagnostics which tell when the model is broken. This can be inherent to the application of the model, or it can be done by comparing its performance with a straw-man model which is completely nonsensical but is fit to the same data. That said, in this day and age there are such things as non-mechanistic empirical dynamic modeling, which has shown success in the absence of underlying theory.

In practice, I witness projects running aground on these sandbars way too often. Kufs’ cautions are good. But Kufs’ advice could use an update.
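Kufs stops at Bonferroni; as one concrete example of a more modern alternative (my illustration, not from Kufs’ post), the Benjamini-Hochberg step-up procedure controls the false discovery rate rather than the family-wise error rate. A minimal sketch:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return indices of hypotheses rejected at false discovery rate alpha."""
    m = len(pvalues)
    # Sort p-values ascending, remembering original positions.
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha ...
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            k_max = rank
    # ... and reject the k_max smallest p-values.
    return sorted(order[:k_max])

# Bonferroni at 0.05/6 ~ 0.0083 would reject only the first hypothesis;
# BH also rejects the two moderately small p-values.
ps = [0.001, 0.012, 0.014, 0.19, 0.5, 0.9]
print(benjamini_hochberg(ps))  # -> [0, 1, 2]
```

The gain over Bonferroni grows with the number of tests, which is exactly the “torrents of tests” setting.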

## “Climate Change: Information on potential economic effects could help guide Federal efforts to reduce fiscal exposure” (GAO, September 2017)

In September 2017, the U.S. Government Accountability Office completed a report, Climate Change: Information on Potential Economic Effects Could Help Guide Federal Efforts to Reduce Fiscal Exposure. A copy is at that link.

Foremost, in case anyone doubts it, there is an appreciable fiscal risk to the federal government. Quoting:

Over the last decade, extreme weather and fire events have cost the [F]ederal government over $350 billion, according to the Office of Management and Budget. These costs will likely rise as the climate changes, according to the U.S. Global Change Research Program. In February 2013, GAO included Limiting the Federal Government’s Fiscal Exposure by Better Managing Climate Change Risks on its High-Risk List. GAO was asked to review the potential economic effects of climate change and risks to the federal government. This report examines (1) methods used to estimate the potential economic effects of climate change in the United States, (2) what is known about these effects, and (3) the extent to which information about these effects could inform efforts to manage climate risks across the federal government. GAO reviewed the 2 national-scale studies available and 28 other studies; interviewed 26 experts knowledgeable about the strengths and limitations of the studies; compared federal efforts to manage climate risks with leading practices for risk management and economic analysis; and obtained expert views.

This fiscal risk does not reflect risk to the private sector. It also is not a cost-benefit analysis of taking action versus not. The report summarizes results of more detailed studies, such as the American Climate Prospectus: Economic Risks in the United States, commissioned by the Risky Business Project(*) and performed by Rhodium Group, LLC, in 2014. While GAO identified substantial uncertainties in projections of economic damage from climate change (**), the reviews in these reports were sufficient to provide substantial guidance.
Some key results (click images to see larger figures in separate browser windows):

This report is useful, beyond its summary of studies of climate change economic harm and of results and reports by others, because it provides citations and links to several other reports the U.S. GAO has written regarding climate change risk to both the federal government and to states, along with recommendations. One important recommendation was to provide a central clearinghouse of climate change related information to help states and localities plan adaptation. This is recommended in Climate Information: A National System Could Help Federal, State, Local, and Private Sector Decision Makers Use Climate Information, GAO-16-37 (Washington, D.C.: Nov. 23, 2015).

Of course, as the report indicates in a footnote:

Specifically, Executive Order 13783 rescinded the Climate Action Plan and revoked the executive order establishing the Council on Climate Preparedness and Resilience. Although Executive Order 13693 has not been revoked, it is uncertain whether the agency adaptation plans and other strategic planning efforts it calls for will continue.

This suggests the present posture of the federal government is to accept and incur the estimated costs inflicted upon its budget indefinitely.

##### (*) Risky Business is hardly a “socialist plot”, as seems to be the characterization of climate change adaptation and mitigation measures by science-denying Republicans and others these days. In the organization’s About Us, there’s:

In October 2013, NYC Mayor Michael Bloomberg, former U.S. Secretary of the Treasury Hank Paulson, and business leader and philanthropist Tom Steyer founded a new initiative to assess and publicize the economic risks to the U.S. associated with climate change. The project grew out of concerns by the Co-Chairs that the U.S. was not developing sound risk assessments to respond to the impacts of a changing climate.
In their development of this initiative, the three founders recruited additional members to forge the Project’s Risk Committee, a group of dedicated individuals concerned about the economic future of America under the threat of global climate change.

It is big business- and investment-friendly, and seeks assessments of the exposures of companies and their balance sheets to climate, similar to those advocated by BoE Governor Mark Carney. Moreover, it argues for large scale investments in climate mitigation as well as highlighting investment opportunities.

##### (**) Uncertainties in these projections are due to uncertainties in climate projections, as well as to incomplete and imperfect characterizations of economic activity which have exposure to climate change effects. Uncertainties in climate projections are due less to imperfections in the science (uncertainty in climate sensitivity is, in my opinion, particularly overplayed as an issue) and more to uncertainty in the world’s continuing profile of greenhouse gas emissions. Imperfections in characterizations of economic activity are due in large measure to companies and states failing as yet to provide comprehensive assessments of their exposure to climate change effects. Moving them to perform and publish such assessments is the primary goal of Risky Business and of BoE Governor Mark Carney. Uncertainties in projections include substantial upside uncertainties as well as downside. For example, few if any of the studies have assessed effects of depreciation of property values due to risk and harm, or due to uninsurability. None of the studies assess risks of loss of ecological services.

## It’s hot (in Spain, France, Germany, Italy)

##### Update, 2019-06-30

Wonderful graphics and discussion in this blog post by Ian Livingston and Jason Samenow of The Washington Post.

### For Thursday, 27th June 2019
- France (Marseille): 97F
- Germany (Fushstal): 88F
- Italy (Caserta): 94F
- Spain (Sabadell): 87F

From AFP, “Mercury tops 45C in France as deadly heatwave roasts Europe” (28th June 2019).

And wildfires in Catalonia:

## Climate Adam’s take:

## A response to a post on RealClimate

##### (Updated 2342 EDT, 28 June 2019.)

This is a response to a post on RealClimate which primarily concerned economist Ross McKitrick’s op-ed in the Financial Post condemning the geophysical community for disregarding Roger Pielke, Jr’s arguments. Pielke, in that link, is recounting his few-person crusade (along with a couple of others) claiming, in terms of a climate emergency and paraphrasing, “There’s nothing to see here, folks. Move along home.”

The post was by Michael Tobis, a retired climate scientist, working as a software developer and science writer living in Ottawa, Ontario. Unfortunately, in my opinion, along the way, Dr Tobis threw Statistics under a small bus. At least he mischaracterized it.

I went on to agree with the criticism of McKitrick. But I felt that modern Statistics needed explaining, and I cautioned that a lot of the problem is that the statistics Pielke did was shoddy, using antiquated methods. I also pointed out that similar imperfections could be found in, for instance, The Journal of Climate. Separately, I’ve documented a particularly egregious case, one which contributed to the claim there was a “hiatus” in increasing global warming, something which was not true then, “if thou reckon right”, and turned out not to be true afterwards.

Anyway, apparently, RC isn’t going to post my Comment, so I’m placing it here, below, in its entirety. It’s up to them. The Comment may have been too far off the beaten path. I have also answered in the above a couple of my own questions about the original articles which pertain.

### As of today, I note that my comment was indeed posted. The delay was probably due to simple moderation delay.
This slightly revised version augments that. I’m not sure which Pielke-Jr article McKitrick is referring to, and, whichever it is, I mean one with technical details (link?), but, nevertheless, I wanted to gently disagree with part of Dr Tobis’ post above. Consider:

Statistics is a vital tool of science, but it is not the only one. It is most effective when dealing with large quantities of data. Using statistical methods to detect the effect of one factor among several amounts to proving that the other factors did not align as a matter of happenstance. The more abundant the data, the less likely such a coincidence.

To the extent that Pielke-Jr is running some kind of hypothesis test for determining attribution, the problem isn’t Statistics, it’s wrong-headed and out-of-date statistical technique. It is not true that inference and estimation rely upon “large quantities of data”. Sometimes, in fact, relatively small amounts of well chosen data are far more powerful: Think of an experimental design with controls and balance.

That’s important to remember. In many engagements I have with Big Data people, I need to emphasize that the size of your dataset isn’t the raw size, it’s the number of replicas of unique combinations of explanatory variables you have. If most combinations in a big dataset have but one observation, that’s not a big dataset at all. If there is a wide range in the numbers of replicas, that’s a problem with balance, and it can even harm seemingly non-parametric techniques like cross-validation.

In fact, there is now a rich set of methods for embodying domain knowledge in statistical models, namely, the Bayesian hierarchical modeling approach, whether that knowledge derives from meteorology or climate science or baseball statistics. Judging from literature in, say, Journal of Climate, few papers appear aware of these techniques, preferring to argue piecemeal from grounding in the domain.
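The replica-counting point can be sketched concretely; the tiny dataset below is hypothetical, chosen only to show the bookkeeping:

```python
from collections import Counter

# Hypothetical dataset: each row is a tuple of explanatory-variable values.
rows = [
    ("coastal", "summer"), ("coastal", "summer"), ("coastal", "summer"),
    ("coastal", "winter"), ("inland", "summer"),
    ("inland", "winter"), ("inland", "winter"),
]

replicas = Counter(rows)  # observations per unique combination
singletons = sum(1 for n in replicas.values() if n == 1)
print(len(rows), "rows, but only", len(replicas), "unique combinations;",
      singletons, "combination(s) observed just once")
```

Seven rows, but only four distinct combinations, two of them observed once: the “effective” size of the dataset is far smaller than the row count, and the imbalance (three replicas of one combination, one of another) is exactly the kind of thing that trips up cross-validation.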
That’s okay, but large comprehensive studies are hard, and it’s unfortunate that these modern methods aren’t better understood and used.

Deniers and doubters, in my experience, are the least likely to be aware of such methods, and I find many instances, whether in denial-oriented assessments of temperature or sea level changes, where Statistics is practiced as if by rote, and with a lot of confusion. Time series seem to be a particular tripping point. Others have noted these kinds of blemishes, but often such criticisms don’t go on to indicate what should be done.

There are many sound statistical analyses which demonstrate any of these claims, from the statistical significance of temperature rises, to increased damage from storms, to droughts and heatwaves, to phenological aberrations in species migrations. There is no doubt these are true. There are many wonderful Bayesian analyses which can serve as examples. Schmittner, et al from Science in 2011 is wonderful (“Climate Sensitivity Estimated from Temperature Reconstructions of the Last Glacial Maximum”). There are several papers where Professor Mark Berliner is a co-author, like his “Uncertainty and Climate Change“, or, with Professor Levine, “Statistical Principles for Climate Change Studies“. The latter paper addresses the nuances of hypothesis testing, showing that while it seems simple, it is clouded with delicate assumptions. Smashing all that aside and oversimplifying, the point is that p-values are themselves random variables.

Statistics as a field hasn’t left these questions alone. The American Statistical Association has, like many scientific organizations, a formal statement on climate change. But, too, the status of Statistics in climate change science has been examined. There have been special issues of journals. I also urge readers to be open-minded about applying techniques from machine learning and related areas to geophysical problems.
They certainly have their faults and limitations, opacity in explanation being one formidable issue. But as O’Gorman and Dwyer showed in 2018, ML techniques are beginning to make their presence felt in geophysics. See also Rasp, Pritchard, and Gentine in a 2018 PNAS.

##### Figure above is from the paper: Christopher K Wikle, Ralph F Milliff, Doug Nychka & L Mark Berliner (2001), “Spatiotemporal hierarchical Bayesian modeling of tropical ocean surface winds”, Journal of the American Statistical Association, 96:454, 382-397, DOI: 10.1198/016214501753168109

## “Build way more wind and solar ‘than needed'”

Many people familiar with traditional energy networks, including the electrical grids of utilities, bring strong preconceptions to considering zero Carbon energy sources. This is particularly true for experts in traditional energy, including engineers. They focus upon the intermittency of such renewable sources. Translated, they don’t really mean intermittency, since all power sources can go offline; they mean they can’t flip a switch and deliver energy, or back up a faulted energy source upon demand. It is a mindset. Some traditional energy sources, particularly large nuclear power plants, can go offline on short notice, and, because of their size, the rest of an electrical power grid needs to jump through hoops to respond, maintaining load.

The preconceptions extend to having what’s called, in the consumer electronics business, a closed ecosystem (e.g., Apple products), where hardware and software options meet standards set by a central point. With the advent of distributed energy, notably solar PV on home and commercial rooftops, and demand response options, end users are becoming generators of electricity and, by withholding demand, can influence and help grids respond to loads.
This changes the business relationship between “the edge” of electrical networks and their operators, something which utilities and RTOs/ISOs have only come to appreciate over several years, some more slowly than others. Part of the problem of these attitudes is that they leak out into the mostly uneducated public and even among policymakers. The former head of EOER in Massachusetts, Matthew Beaton, refused to address issues of why Massachusetts couldn’t be more like Texas when directly questioned, appealing to the silly aphorism that renewables couldn’t be relied upon when “the sun don’t shine and the wind don’t blow”. But, in fact, that is a gross simplification, grounded in expectations set by years of operating conventional fossil fuel energy in the late 20th century. The fact of renewables is that they have zero marginal cost. This means that, once constructed, they deliver energy for 2-3 decades with little additional financial inputs, governed solely by the circumstances of sun and wind. In comparison to traditional fossil fuel plants, they are also much less expensive to build. They do require more space, and they work best if scattered over a wide area, although exploiting rich, high wind areas like the offshore New England coast offers competing advantages. There is an idea that a region or a state or a town or a home or business has a certain amount of energy “it needs”. That’s true, but to the degree demand can be adjusted or shaped in time and by choice means these aren’t inflexible. It is also true that not all consumers of electrical energy need the same quality of electrical power, although as far as I know, there are presently no plans to relax that standard, possibly because it trades an economy for a greater headache in managing different mixes of electrical energy. One way to deal with having things like capacity factors, where a resource isn’t always available, is to massively overbuild the resource. 
This means building 3X to 8X the capacity typically expected based upon such “needs”. This is possible only because renewables are inexpensive to build, and they can be built rapidly. These costs, as I’ve recently observed, and as others like Professor Tony Seba and Haegel, Atwater, and colleagues have observed, are dramatically decreasing. But Professors Richard Perez and Karl Rabago have underscored the idea of overbuild as a new principle of operation. The idea is not new, because it’s engineering commonsense. Indeed, the idea is part of the mix which Professor Mark Jacobson and colleagues proposed. Indeed, overprovisioning has long been a tactic used in the design of supercomputers.

In fact, when I worked with installers to buy a PV system for our home, I ran into this kind of counterproductive thinking. Installers wanted to maximize utilization of the PV panels across a year, and, when picking the number of panels, chose the number which maximized that measure. I, on the other hand, wanted to generate enough Watt-hours across the year to offset our entire electricity demand. We have a shaded situation, partly from trees, so panels were not going to be uniformly illuminated. I finally found a terrific installer, RevoluSun, and they understood and wanted to do exactly what I wanted.

I ran into the same kind of thinking when I worked at IBM Federal Systems in upstate New York: Managers had a hard time understanding that if the objective was to get a calculation done as quickly as possible, there were times when only a portion of the computers in a multicomputer could be used, even though all of them were needed for other parts of the calculation. They wanted to pick a number which maximized utilization at the price of slower computation. These were otherwise sharp people, but they were trapped in a certain way of thinking.
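The overbuild multiples follow directly from capacity-factor arithmetic: to supply a given average demand, nameplate capacity must be demand divided by capacity factor. Here is a back-of-envelope sketch in Python (the round demand figure, the source names, and the capacity factors are hypothetical illustrations, not data from any study cited here):

```python
# Back-of-envelope: how much nameplate capacity a resource with a given
# capacity factor needs in order to supply a given average demand.
# All numbers are hypothetical round figures, for illustration only.
average_demand_gw = 10.0
capacity_factors = {"solar_pv": 0.18, "onshore_wind": 0.35, "offshore_wind": 0.45}

overbuild = {}
for source, cf in capacity_factors.items():
    nameplate_gw = average_demand_gw / cf       # energy balance over a year
    overbuild[source] = nameplate_gw / average_demand_gw
    print(f"{source}: {nameplate_gw:.1f} GW nameplate "
          f"({overbuild[source]:.1f}x overbuild)")
```

With these illustrative capacity factors the multiples land at roughly 2X to 6X before accounting for storage losses or correlated weather, which push the required build further toward the high end of the range discussed above.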
## 50 Terawatts of Solar Photovoltaics (“PV”) is now feasible by 2050

This is from an article in Science by Haegel, et al which was just released today. It means, documented in detail, that the projections of Professor Tony Seba are not only right on, but that Professor Seba may have underestimated the impact.

To put this in perspective, in 2013, for all purposes from all sources, the entire world consumed energy at an average rate of about 18 Terawatts. Accordingly, 50 Terawatts of PV generation would be nearly three times the world’s entire 2013 rate of energy consumption. Naturally, there are questions of storage, of converting electrical energy to usable forms, etc. But, on the other hand, if there are 50 Terawatts of generation available, no matter what the form, there is also significant economic incentive to learn how to do these things.

The actual paper from Science is:

N. M. Haegel, H. Atwater, T. Barnes, C. Breyer, A. Burrell, Y.-M. Chiang, S. De Wolf, B. Dimmler, D. Feldman, S. Glunz, et al., “Terawatt-scale photovoltaics: Transform global energy”, 2019, Science, 364, 836–838.

Note this is a much more optimistic update of:

N. M. Haegel, R. Margolis, T. Buonassisi, D. Feldman, A. Froitzheim, R. Garabedian, et al., “Terawatt-scale photovoltaics: Trajectories and challenges”, 2017, Science, 356, 141–143.

from barely two years before. The authors say this themselves in the Abstract of the 2019 paper:

Solar energy has the potential to play a central role in the future global energy system because of the scale of the solar resource, its predictability, and its ubiquitous nature. Global installed solar photovoltaic (PV) capacity exceeded 500 GW at the end of 2018, and an estimated additional 500 GW of PV capacity is projected to be installed by 2022–2023, bringing us into the era of TW-scale PV.
Given the speed of change in the PV industry, both in terms of continued dramatic cost decreases and manufacturing-scale increases, the growth toward TW-scale PV has caught many observers, including many of us (1), by surprise. Two years ago, we focused on the challenges of achieving 3 to 10 TW of PV by 2030. Here, we envision a future with ∼10 TW of PV by 2030 and 30 to 70 TW by 2050, providing a majority of global energy. PV would be not just a key contributor to electricity generation but also a central contributor to all segments of the global energy system. We discuss ramifications and challenges for complementary technologies (e.g., energy storage, power to gas/liquid fuels/chemicals, grid integration, and multiple sector electrification) and summarize what is needed in research in PV performance, reliability, manufacturing, and recycling.

###### (Emphasis added.)

To skeptics who thought these would be limited by available materials, essentially taking a narrow-minded view of how to do PV, there appear to be many alternatives, some better than the originals.

## Cumulants and the Cornish-Fisher Expansion

“Consider the following.” (Bill Nye the Science Guy)

There are $N$ random variables drawn from the same kind of probability distribution, but with different parameters for each. In this example, I’ll consider $N$ random variables $x_{j} \sim \mathcal{B}(p_{j})$, that is, each drawn from a Bernoulli distribution, but having different probabilities $p_{j}$. Of interest is a particular sum:

$S = \sum_{j=0}^{N-1} w_{j} x_{j},$

where the $w_{j}$ are fixed weights, not all of which are zero. Given the $N$ $p_{j}$ values, and the $N$ weights $w_{j}$, the problem is to calculate the value of $S$, say, $S_{q}$, at a given quantile $q$ of the cumulative distribution of $S$. Indeed, suppose many such values $S_{q}$ are sought, for various $q$. How do you do it?
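One obvious first attack is brute force. A minimal Monte Carlo sketch in Python (the code later in this post is in R), with hypothetical $p_{j}$ drawn from a Beta distribution and hypothetical non-negative weights $w_{j}$ drawn from a Gamma distribution:

```python
import numpy as np

rng = np.random.default_rng(7)

N = 10          # number of weighted Bernoulli components
M = 100_000     # number of simulated realizations of S

p = rng.beta(2.0, 2.0, size=N)     # hypothetical p_j
w = rng.gamma(2.0, 1.0, size=N)    # hypothetical non-negative weights w_j

# Each row of x is one realization of (x_0, ..., x_{N-1}); S = sum_j w_j x_j.
x = rng.binomial(1, p, size=(M, N))
S = x @ w

# Empirical quantile at q = 0.10. Its error shrinks only at the usual
# O(1/sqrt(M)) Monte Carlo rate, which is exactly the weakness at issue.
q = 0.10
S_q = np.quantile(S, q)
```

This answers the question for one $(N, \mathbf{p}, \mathbf{w}, q)$, but says nothing in itself about how the error behaves as those choices vary, hence the appeal of an analytical route.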
One technique which most data scientists proffer is to simulate many such Bernoulli draws and calculate quantiles empirically from the resulting simulations. This is possible, but the question is how much error there is in such simulations given arbitrary choices of $N$, the $p_{j}$, the $w_{j}$, and $q$? Also, how does the error behave as the number of simulated draws increases? Does it decrease monotonically?

There is another approach: using the Cornish-Fisher expansion, a technique dating from 1937. But before meeting the expansion, cumulants need to be introduced.

### Cumulants

Technically, cumulants are the coefficients obtained from a series expansion of the log of the characteristic function of a probability density function. That’s dense. Let’s break it down.

The characteristic function is the expected value of a particular non-linear function of a random variable $X$, where $X \sim P(x)$, and $P(x)$ is the probability density function for $X$. That non-linear function is

$\varphi_{X}(t) = E\llbracket \exp{(\mathbf{i} t X)} \rrbracket,$

where $\mathbf{i} = \sqrt{-1}$, the complex basis. Using the definition of expected value, and assuming the domain for $X$ is continuous,

$\varphi_{X}(t) = E\llbracket \exp{(\mathbf{i} t X)} \rrbracket = \int_{-\infty}^{\infty} \exp{(\mathbf{i} t x)} P(x)\,dx.$

If $X$ were discrete, then

$\varphi_{X}(t) = E\llbracket \exp{(\mathbf{i} t X)} \rrbracket = \frac{\sum_{x \in \mathcal{D}(X)} \exp{(\mathbf{i} t x)} P(x)}{\sum_{x \in \mathcal{D}(X)} P(x)},$

where $\mathcal{D}(X)$ denotes the (finite) domain of the random variable $X$. Inspection shows these expressions are the value of a general Fourier transform of the probability density function, $P(x)$, one having constants $a=1, b=1$. Taking the log, then

$\log{\varphi_{X}(t)} = \sum_{n=1}^{\infty} \kappa_{n} \frac{(\mathbf{i} t)^{n}}{n!},$

where (finally!) the $\kappa_{1}, \kappa_{2}, \dots$ are the cumulants.
There is a cumulant generating function definable,

$\mathbf{K}(h) = \log{\mathbf{M}(h)} = \sum_{n=1}^{\infty} \kappa_{n} \frac{h^{n}}{n!},$

where $\mathbf{M}$ is the moment generating function,

$\mathbf{M}_{X}(t) = E\llbracket \exp{(t X)} \rrbracket = \sum_{n=0}^{\infty} \frac{t^{n} E\llbracket X^{n} \rrbracket}{n!} = \sum_{n=0}^{\infty} \frac{t^{n} \mu_{n}}{n!},$

where $\mu_{n}$ denotes the $n$th moment of $X$, and $\mu_{0} = 1$. In fact, moments and cumulants are interrelated and can be obtained from one another. This can be useful.

Another fact, particularly useful for the problem at hand, is that if $\kappa_{r}(Y)$ denotes the $r$th cumulant of some random variable $Y$ and that random variable is itself the sum of $m$ independent random variables, per

$Y = \sum_{k=1}^{m} X_{k},$

then

$\kappa_{r}(Y) = \kappa_{r}\left(\sum_{k=1}^{m} X_{k}\right) = \sum_{k=1}^{m} \kappa_{r}(X_{k}),$

assuming cumulants exist for all $X_{k}$. What this means is that, to the degree to which the probability distribution of a sum of random variables can be characterized by or even recovered from knowledge of its cumulants, we can readily find the cumulants of such a sum from the cumulants of each component of the sum. If moment generating functions are used, or simply probability densities are used, one can calculate the same, but it entails convolutions of densities or products of characteristic functions. These interchanges are really just consequences of the product-convolution identity for Fourier transforms.

This just sketches the beginning of things that can be done with cumulants. See k-statistics and polykays for others, even empirical uses.

### The Cornish-Fisher expansion

Given knowledge of cumulants, the Cornish-Fisher expansion gives an estimate for $S_{q}$ directly from the cumulants of $S$, assuming these exist. In the case of the weighted Bernoulli sum here, they certainly do.
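As a concrete check on the cumulant generating function, the cumulants of a single Bernoulli variable can be read off by series-expanding $\mathbf{K}(h) = \log{\mathbf{M}(h)}$ symbolically. A sketch using Python's sympy, with a hypothetical numeric $p = 1/3$:

```python
import sympy as sp

h = sp.symbols("h")
p = sp.Rational(1, 3)              # hypothetical Bernoulli probability

# MGF of a Bernoulli(p) is M(h) = (1 - p) + p*exp(h), so K(h) = log M(h).
K = sp.log(1 - p + p * sp.exp(h))

# kappa_n = n! * (coefficient of h^n in the series expansion of K about 0)
ser = sp.series(K, h, 0, 4).removeO().expand()
kappa = [sp.factorial(n) * ser.coeff(h, n) for n in (1, 2, 3)]

# Closed forms for Bernoulli cumulants: p, p(1-p), p(1-p)(1-2p)
closed = [p, p * (1 - p), p * (1 - p) * (1 - 2 * p)]
checks = [sp.simplify(a - b) == 0 for a, b in zip(kappa, closed)]
```

The same expansion with symbolic $p$ and a weight $w$ (replace $h$ by $w h$) recovers the per-component formulas used below.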
### Cumulants for $S$

The first five cumulants for each part, $w_{j} x_{j}$, of the sum $S$ are:

$\kappa_{1,j} = p_{j} w_{j}.$

$\kappa_{2,j} = p_{j} (1 - p_{j}) w_{j}^{2}.$

$\kappa_{3,j} = p_{j} (p_{j} - 1) (2 p_{j} - 1) w_{j}^{3}.$

$\kappa_{4,j} = p_{j} (1 - p_{j}) (1 + 6 p_{j} (p_{j} - 1)) w_{j}^{4}.$

$\kappa_{5,j} = p_{j} (p_{j} - 1) (2 p_{j} - 1) (1 + 12 p_{j} (p_{j} - 1)) w_{j}^{5}.$

These are elemental cumulants and, so, to obtain the corresponding cumulants for $N$ pairs of $w_{j}$ and $p_{j}$ where, recall, $x_{j} \sim \mathcal{B}(p_{j})$,

$\kappa_{k} = \sum_{j=0}^{N-1} \kappa_{k,j}.$

Note that

• $\kappa_{1}$ corresponds to the mean.
• $\kappa_{2}$ corresponds to the variance.
• $\kappa_{3}$ corresponds to skewness, hereinafter denoted $\mathcal{S}$.
• $\kappa_{4}$ corresponds to kurtosis, hereinafter denoted $\mathcal{K}$.
• $\kappa_{5}$ does not correspond to any central moment or simple combination of them.

Details are available here.

### A Cornish-Fisher expansion for $S$

```r
library(mpoly)

theoreticalMean     <- function(W, P) sum(W*P)
theoreticalVariance <- function(W, P) sum(W^2*P*(1-P))
theoreticalSkew     <- function(W, P) sum(P*(P-1)*(2*P-1)*W^3)
# (theoreticalKurtosis below computes the cumulant kappa_4, not the raw
#  fourth moment; gamma2 = kappa_4/sigma^4 is excess kurtosis, i.e.,
#  kurtosis relative to a Gaussian.)
```
```r
theoreticalKurtosis <- function(W, P) sum(P*(1-P)*(1 + 6*P*(P-1))*W^4)
theoreticalKappa5   <- function(W, P) sum(P*(P-1)*(2*P-1)*(1 + 12*P*(P-1))*W^5)

# Standardized cumulants: gamma1 is skewness, gamma2 is excess kurtosis.
gamma1 <- function(W, P) theoreticalSkew(W,P)/sqrt(theoreticalVariance(W,P)^3)
gamma2 <- function(W, P) theoreticalKurtosis(W,P)/sqrt(theoreticalVariance(W,P)^4)
gamma3 <- function(W, P) theoreticalKappa5(W,P)/sqrt(theoreticalVariance(W,P)^5)

He.polynomials <- hermite(degree=1:4, kind="he", normalized=TRUE)

ThePoint    <- 0.10
TheQuantile <- qnorm(ThePoint)

yAtThePoint <- function(xPoint=0.1, W, P)
{
  xQuantile <- qnorm(xPoint)
  # Evaluate the Hermite polynomials He_1..He_4 at the Gaussian quantile.
  He <- unlist(sapply(X=He.polynomials,
                      FUN=function(He.k) as.function(He.k, silent=TRUE)(xQuantile)))
  # Cornish-Fisher adjustment coefficients.
  h1   <- He[2]/6
  h2   <- He[3]/24
  h11  <- - (2*He[3]+He[1])/36
  h3   <- He[4]/120
  h12  <- -(He[4] + He[2])/24
  h111 <- (12*He[4] + 19*He[2])/324
  #
  g1 <- gamma1(W,P)
  g2 <- gamma2(W,P)
  g3 <- gamma3(W,P)
  #
  mu    <- theoreticalMean(W,P)
  sigma <- sqrt(theoreticalVariance(W,P))
  # Third-order Cornish-Fisher expansion about the Gaussian quantile.
  w <- xQuantile + (g1*h1) + (g2*h2 + g1^2*h11) + (g3*h3 + g1*g2*h12 + g1^3*h111)
  #
  yp <- mu + sigma*w
  #
  return(yp)
}
```

(Note: the factor in theoreticalKappa5 is written (1 + 12*P*(P-1)), matching the formula for $\kappa_{5,j}$ above.)

### Simulating $S$ using many Bernoulli draws

A simulation for $S$ was developed wherein a vector, $\mathbf{p}$, of probabilities was drawn from a Beta distribution, a vector of non-negative weights, $\mathbf{w}$, was drawn from a Gamma distribution, and then, governed by $\mathbf{p}$, $M = 10000$ vectors $x_{k}, k \in \{1, \dots, M\}$, were drawn from the Bernoulli distribution. Then, $S_{k} = \mathbf{w} \cdot x_{k}$ was calculated for each and saved. $N$, the length of these vectors, was chosen to be 10, typical for the application motivating this problem. It also suffices for illustration. An estimated probability density was developed from the $M$-sized collection of saved sums. Thirty of these are shown below.

### Comparing the two approaches

$S_{q=0.1}$ was the quantile chosen of interest, and used for comparison.
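Independent of the R code above, the additivity of cumulants can be sanity-checked numerically: sum the theoretical per-component cumulants and compare against k-statistics (unbiased cumulant estimators) computed from simulated draws of $S$. A sketch in Python with scipy, using hypothetical Beta- and Gamma-generated parameters:

```python
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(11)

N, M = 10, 200_000
p = rng.beta(2.0, 2.0, size=N)    # hypothetical p_j
w = rng.gamma(2.0, 1.0, size=N)   # hypothetical non-negative weights w_j

# Theoretical cumulants of S: sum the per-component Bernoulli cumulants.
kappa1 = np.sum(w * p)
kappa2 = np.sum(w**2 * p * (1 - p))
kappa3 = np.sum(w**3 * p * (1 - p) * (1 - 2 * p))

# Simulate S and estimate the same cumulants via k-statistics.
S = rng.binomial(1, p, size=(M, N)) @ w
k1, k2, k3 = (kstat(S, n) for n in (1, 2, 3))
```

The sample k-statistics should land close to the summed theoretical cumulants, with the agreement tightening as $M$ grows.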
In addition to comparison with the numerical simulation, the PDQutils package of R was used to independently calculate the value of these quantiles as a check on derivations. Empirical quantiles were obtained from the simulations using the hdquantile function from the Hmisc package of R. The results are given below.

### Comment on Cumulants and the Cornish-Fisher expansion in statistical education

Cumulants and the Cornish-Fisher expansion aren’t regularly taught any longer in courses on theoretical statistics, at least judging by material in statistics textbooks. For example,

• J. S. Bendat, A. G. Piersol, Random Data: Analysis and Measurement Procedures, $4^{th}$ edition, Wiley, 2010
• A. W. Drake, Fundamentals of Applied Probability, McGraw-Hill Book Company, 1967
• J. A. Rice, Mathematical Statistics and Data Analysis, $3^{rd}$ edition, Duxbury, Thomson/Brooks/Cole, 2007
• M. H. DeGroot, M. J. Schervish, Probability and Statistics, $3^{rd}$ edition, Addison-Wesley, 2002
• K. V. Mardia, J. T. Kent, J. M. Bibby, Multivariate Analysis, Academic Press, 1979
• R. Durrett, Probability: Theory and Examples, $4^{th}$ edition, Cambridge University Press, 2010

lack any mention of either cumulants or the expansion, although moment-generating functions are inevitably mentioned. In contrast,

• R. V. Hogg, J. W. McKean, A. T. Craig, Introduction to Mathematical Statistics, $6^{th}$ edition, Pearson/Prentice-Hall, 2005
• A. N. Shiryaev, R. P. Boas (translator), Probability, $2^{nd}$ edition, Springer, 1996
• E. Parzen, Modern Probability Theory and Its Applications, John Wiley, 1960
• D. E. Knuth, The Art of Computer Programming, Volume 1: Fundamental Algorithms, $3^{rd}$ edition, Addison-Wesley, 1997

do mention cumulants. Parzen (1960) has quite a bit about them, and relationships to sums of random variables. The erudite Professor Knuth has three pages mentioning them.
That suggests, however, that most specialists in computer science ought to know of them, but I’d bet most don’t. Cumulants are often mentioned during courses on random walks and diffusion, because a random walk can be considered a kind of a sum. ### Postscript Should readers be interested in the code, I can prepare a documented version of it to use. Note that this would be in the interest of reproducibility. As mentioned above, the PDQutils package of R offers a means of calculating quantiles given cumulants directly, via its function qapx_cf. Of course, theoretical expressions for quantiles demand symbolic calculation, such as offered by the capabilities of Wolfram Mathematica. ## Pesticide Perspective ##### (This is in the main a reblog of an opinion piece by Andrew Gottlieb, APCC) #### May 7, 2019 ### Pesticide Perspective by Andrew Gottlieb, Executive Director, Association to Preserve Cape Cod Fresh off the taping of a Lower Cape TV segment on the merits of continuation of Eversource’s use of herbicides, I am reminded of the importance of individual behavior. While not in any way making Eversource’s herbicide use ok, the persistent and excessive household use of herbicides and pesticides needs to change. Eversource points the finger at household use as a justification for its practices, but the fact is that true resource protection and restoration relies on big changes in personal behavior. Just go to any garden center or box store and you will be confronted with gallons and gallons of pesticides and herbicides, and pounds and pounds of fertilizers. The marketing messages are clear: You need this and more is better. Both are wrong. Spend your money and time on native plantings and minimizing your lawn. Your water use and chemical bills will go down and you will help restore ecological balance and habitat one yard at a time. You will become part of the solution to water quality problems instead of part of the cause. 
So resist the siren song and turn your back on the chemicals. And while you are at it, we can help build pressure on big users like Eversource to be leaders by example.

I have written about sustainable landscaping before, although not with an emphasis upon herbicide use. I have mentioned that elsewhere, though.

While Mr Gottlieb, whom I admire, might not agree with the following, he implicitly brings up an argument which is made not only by the purveyors of pesticides and herbicides, but also by (some) fossil fuel companies. That argument is that they merely produce a product for the shelves of stores, and it is not their responsibility how their product is used or what consequences it has. That’s a bit starker and, to my mind, more honest than what they actually argue, but there it is.

Mr Gottlieb argues for personal choice and convincing people. I’m not sure how that will work out. It surely hasn’t with fossil fuel use. And there are home yard maintenance businesses with low barriers to entry which rely upon application of pesticides and herbicides to produce enough earnings per unit time to remain afloat.

Accordingly, I would prefer to see stronger extended producer responsibility here. That is, the manufacturers of products need to take cradle-to-grave responsibility for their products and their effects: environmental, health, and otherwise. I would prefer this not need to be enshrined in legislation, but I am skeptical such manufacturers or their trade organizations will jump to volunteer. Accordingly, legislation is required.

There surely is a case for such responsibility with respect to solid waste, notably in packaging. There are organizations advocating such responsibility for herbicides and pesticides, much of it in connection with household hazardous waste.

People pay for household hazardous waste at least twice. Once is off the shelf. A second time is in taxes paid to towns for collecting and disposing of excess.
But there are third and fourth costs, including human health costs, and environmental costs, at least in ecological services such as honeybees and ramifications of mortality in insectivore birds. ## #AllEyesOnJuliana👀 ### Juliana v. US June 4 Hearing at Ninth Circuit The constitutional youth climate lawsuit, Juliana v. United States, will be heard before the Ninth Circuit Court of Appeals in Portland, Oregon. Let’s make history and have this be the most watched Ninth Circuit oral argument ever. 🔊 We need to protect the constitutional rights of young people. 🔊 We need the U.S. government to know that we are watching as it tries to deny the rights of young people in court. 🔊 We need #AllEyesOnJuliana 👀 #### Background: The constitutional youth climate lawsuit, Juliana v. United States, will be heard before the Ninth Circuit Court of Appeals in Portland, Oregon on June 4, 2019. Counsel for youth plaintiffs, Julia Olson, will argue on their behalf and an attorney from the Department of Justice will argue on behalf of the federal government. The oral arguments will be livestreamed by the Court and the number of viewers is tracked. We want to make history by having this be the most watched Ninth Circuit argument ever. We need to protect the constitutional rights of youth. We need the U.S. government to know that the American people are watching as it tries to deny the rights of young people in court. We need #AllEyesOnJuliana👀 Tune in to the livestream of the oral argument at 2:00 p.m. on June 4, 2019, on the Ninth Circuit Court of Appeals YouTube channel here. Watch at work, organize a teach-in at school, or host a watch party. Whatever you do, post a pic! #### General Information: The Juliana v. US lawsuit established that young people have a constitutional right to “a climate system capable of sustaining human life.” That right is being violated. 
On June 4, tune in to the livestream of the plaintiffs’ hearing and post with #AllEyesOnJuliana👀 so that the government can see how many are watching as it tries to deny the rights of young people in court. Sign up to watch: www.youthvgov.org/alleyesonjuliana By promoting a fossil fuel energy system, our government violates the constitutional right of youth to a livable future. On June 4, tune in to the livestream of the plaintiffs’ hearing and post with #AllEyesOnJuliana👀 so that the government can see how many people are watching as it tries to deny the rights of young people in court. Sign up to watch: www.youthvgov.org/alleyesonjuliana The constitutional rights of young people are being violated and the Juliana plaintiffs have had enough. They’re suing the federal government for contributing to the climate crisis. On June 4, tune in to the livestream of the plaintiffs’ hearing and post with #AllEyesOnJuliana👀 so that the government can see how many are watching as it tries to deny the rights of young people in court. Sign up to watch: www.youthvgov.org/alleyesonjuliana ### “We are now faced with the fact that tomorrow is today. We are confronted with the fierce urgency of now.” – Dr. Martin Luther King Jr., 1963. On June 4, the Juliana plaintiffs have a hearing to demand their constitutional right to a livable future be protected before it’s too late. Tune in to the livestream of the plaintiffs’ hearing and post with #AllEyesOnJuliana👀 so that the government can see how many are watching as it tries to deny the rights of young people in court. Sign up to watch: www.youthvgov.org/alleyesonjuliana #### Important Web Links: Main hashtags: #AllEyesOnJuliana 👀 #youthvgov #### Social Media Content to Share: Graphics sized for Facebook & Instagram here. Graphics sized for Twitter here. #### Dear Friend, Let us start off by saying, thank you! As you know, on June 4, the Juliana v. 
United States plaintiffs and their attorneys will be in Portland, Oregon for oral argument before the U.S. Court of Appeals for the Ninth Circuit. With the support of so many of you, we are only $3,500 away from securing all $70,000 of the matching funds available to us to help bring the 21 Juliana plaintiffs to the Court of Appeals for this critical argument! Your support is fundamental to our preparations for this monumental moment in this constitutional youth climate lawsuit. If you haven’t already, and you are able to make a donation today, please help us secure the final $3,500 necessary to meet the entire $70,000 matching challenge. If you are one of the many who have already made a contribution, we thank you for your important support.

We hope all of you will join us on June 4, either in person in Portland or by tuning in to the livestream of the court proceedings at 2:00 pm (PST).

Find livestream information and sign up for a reminder on our website here. We need #AllEyesOnJuliana!

Thank you for your crucial support at this critical time.

The Team at Our Children’s Trust

## Marine microbes are eating plastics

The news item was reported in Science. I wrote about the possibility earlier, but, there, WHOI scientists had not confirmed that microbes were actually consuming plastics. This has been suspected since 2011, due to the work of WHOI scientist Dr Tracy John Mincer studying biofilms on plastics in oceans.

This has been suspected and, until recently, unconfirmed, because the mass balance of plastics entering the oceans greatly exceeds estimates of plastic mass in oceans based upon sampling. There is going to be a multi-day scientific workshop at WHOI addressing these questions. The process by which microbes degrade plastics — essentially using them for food — is still being studied.

## “Industrial-era decline in subarctic Atlantic productivity”, by Osman, Das, et al

### Marine phytoplankton have a crucial role in the modulation of marine-based food webs [1], fishery yields [2] and the global drawdown of atmospheric carbon dioxide [3]. However, owing to sparse measurements before satellite monitoring in the twenty-first century, the long-term response of planktonic stocks to climate forcing is unknown. Here, using a continuous, multi-century record of subarctic Atlantic marine productivity, we show that a marked 10 ± 7% decline in net primary productivity has occurred across this highly productive ocean basin over the past two centuries. We support this conclusion by the application of a marine-productivity proxy, established using the signal of the planktonic-derived aerosol methanesulfonic acid, which is commonly identified across an array of Greenlandic ice cores. Using contemporaneous satellite-era observations, we demonstrate the use of this signal as a robust and high-resolution proxy for past variations in spatially integrated marine productivity. We show that the initiation of declining subarctic Atlantic productivity broadly coincides with the onset of Arctic surface warming [4], and that productivity strongly covaries with regional sea-surface temperatures and basin-wide gyre circulation strength over recent decades. Taken together, our results suggest that the decline in industrial-era productivity may be evidence of the predicted [5] collapse of northern Atlantic planktonic stocks in response to a weakened Atlantic Meridional Overturning Circulation [6–8]. Continued weakening of this Atlantic Meridional Overturning Circulation, as projected for the twenty-first century [9,10], may therefore result in further productivity declines across this globally relevant region.

The paper uses a technique devised by Chaudhuri and Marron called SiZer.

There is an R package for it.

Hannig and Marron published a more recent article, and Marron has studied its applications extensively.
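The core SiZer idea, classifying the slope of a kernel smooth as significantly increasing, significantly decreasing, or indeterminate at each combination of location and bandwidth, can be illustrated crudely. The following is a sketch of the idea in Python on synthetic data, not the Chaudhuri and Marron implementation in the SiZer R package, and the local-linear variance estimate here is a simplification:

```python
import numpy as np

def sizer_map(x, y, bandwidths, grid, z=1.96):
    """Crude SiZer-style map: at each (bandwidth, location) pair, fit a local
    linear regression with Gaussian kernel weights and classify the slope as
    significantly increasing (+1), decreasing (-1), or indeterminate (0)."""
    out = np.zeros((len(bandwidths), len(grid)), dtype=int)
    for i, h in enumerate(bandwidths):
        for j, g in enumerate(grid):
            wgt = np.exp(-0.5 * ((x - g) / h) ** 2)     # kernel weights
            X = np.column_stack([np.ones_like(x), x - g])
            A = (X.T * wgt) @ X                         # X' W X
            beta = np.linalg.solve(A, (X.T * wgt) @ y)  # local-linear fit
            resid = y - X @ beta
            sigma2 = np.sum(wgt * resid**2) / np.sum(wgt)
            Ainv = np.linalg.inv(A)
            # Sandwich variance of the local slope estimate.
            cov = sigma2 * Ainv @ ((X.T * wgt**2) @ X) @ Ainv
            slope, se = beta[1], np.sqrt(cov[1, 1])
            if abs(slope) > z * se:
                out[i, j] = 1 if slope > 0 else -1
    return out

# Synthetic monotone trend: every cell should be flagged "increasing".
rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 300)
y = 2.0 * x + rng.normal(0.0, 1.0, size=x.shape)
smap = sizer_map(x, y, bandwidths=[0.5, 1.0, 2.0],
                 grid=np.linspace(1.0, 9.0, 9))
```

In a real SiZer map, features such as the industrial-era decline appear as runs of one sign that persist across a range of bandwidths, distinguishing structure from smoothing artifacts.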

The figure below is from

N. J. Abram, et al, “Early onset of industrial-era warming across the oceans and continents”, Nature, 2016, 536, with corrigendum in 2017.

where SiZer is used in a geophysical application cited by Osman, Das, et al.


## I am joining up to support the local Green New Deal teams

As anyone who has read my posts here knows, I have reservations regarding the Green New Deal: its lack of specifics, its overly ambitious scope, and its setting of expectations for preventing climate harm which are misleading, because of damage already in the climate pipeline.

That said, like Bill Nye in the previous post, I cannot remain on the sidelines and do nothing.

So I’m jumping in, beginning with support of a Green New Deal Town Hall at the Unitarian Universalist Church in Dedham, Massachusetts, on 12th June, and I will help organize it.

That, my commitment as a solar revolutionary, and the upcoming course on “Climate Science for Climate Activists” I’m teaching are simply what I have to do.

## Bill Nye, properly losing patience

Tamino has been here already. But this is a different view:

Update

ClimateAdam, who I respect a lot, is critical of Bill Nye’s rant. My views on this are in a comment at his YouTube page.

## “Climate Science for Climate Activists”

I am planning to teach a course by this title online using the Zoom platform. I have a half dozen or so expressions of interest, but I wanted to put the outline up and in a place that can be accessed easily so people could have a look at it and see if they are interested. If you want to tune it, either:

Once I have this material, I’ll probably do it again. I don’t have any notion of a minimum class size, but I cannot accommodate more than 99 with my Zoom account. (Don’t think there’s much danger of that.) This is, of course, not for any kind of course credit.

The course is based upon Professor David Archer’s book and online course, Global Warming: Understanding the Forecast. Professor Archer still has a Coursera version available; that course has enrollments, deadlines, etc.

The idea of the course is to introduce sufficient amounts of climate science into a UU and activist context so participants might be able to (1) discriminate among policy choices intelligently, (2) converse intelligently about the hows and whys of climate change, including being able to parry denier or warmist rhetoric, and (3) appreciate the marvel and beauty of the Earth system, with joy and awe.

Here’s an outline, one which I am continuing to develop and tweak. I have begun developing slides according to this. I still need to check those interested in the first round, but, tentatively, I’m thinking of a kickoff some time in early June 2019.

1. Overview and Why.
1.1 "Fun and Awe"
1.2 Continental shelf image, off Cape Cod. Our neighborhood, and a bit of history of science.
1.2.1 Seamounts.
1.2.2 Hotspots.
1.2.3 Plate motion.
1.2.4 A need for a sense of temporal scale.
1.3 The Sverdrup as a unit of flow.
1.3.1 A need to be QUANTITATIVE
1.4 The "Gulf Stream", a part of the AMOC and its flow.
1.5 Why _this_ course.
1.5.1 Now, and especially now, y'can't be an advocate for
climate action without understanding the science and
the engineering of climate change
1.5.2 This course deals with the science. The engineering will
need to be left to another day.
1.5.3 It's my judgment that many climate and environmental
activists get the idea, however they've gotten there,
but they do not have the details. This handicaps them,
both in being able to deal with the emotional
implications, and in being able to respond with judgments.
1.5.4 After all, the idea of a representative democracy is
in part that the electorate gains some understanding
of the problem at hand and expresses their take on
policy choices based upon that understanding.
1.5.5 Climate and its science are too important to leave
to others to understand, taking it on authority.
1.5.6 I emphasize that, because if you want to go into
Professor Archer's course, online, or delve deeper
into Professor Pierrehumbert's course, I heartily
encourage you to do so. You can verify these things
for yourself, using your own calculations. That's how
Science works, or should work.
1.6 The sources and origin
1.6.1 Prof David Archer, GLOBAL WARMING: UNDERSTANDING
THE FORECAST
1.6.2 Prof Ray Pierrehumbert, PRINCIPLES OF PLANETARY CLIMATE
(https://geosci.uchicago.edu/~rtp1/PrinciplesPlanetaryClimate/)
1.6.3 B.S., Physics, 1974
1.6.4 Courses in Geology and Geophysics, 1992-1994.
1.6.5 Personal study since, lectures, online coursework, etc.
1.7 Given talks before: https://bit.ly/2VIdGEE
1.8 This course will be revised and will be repeated.
1.9 I *like* the online format: Bigger reach, encouraging online
community, fewer greenhouse gas emissions for travel.
1.10 Format and style is to circle around a few key ideas, diving
deeper on each revisit.
1.10.1 Intended to reduce overload effect.
1.11 HOMEWORK FOR THIS SECTION: Why are you taking the course?
What do you want to get from it?
1.12 There'll typically be some kind of Homework or Problem Set
given at end of each section, due by the start of the
next. The due date system is to give me a chance to look
them over and comment. This communication and submission
will be written and by email or attaching files. If you'd
prefer some other submission mechanism in addition, let
me know.
1.13 Technical difficulties with ZOOM: I can help a bit with any
connection or setup problems, but Zoom has excellent help
resources and an online chat. They also can, I believe, "look
over your shoulder" on the ZOOM platform to see what might
be the problem. I cannot.
2. Heat, Light, Energy, Blackbody Radiation, and Atmospheric Transfers.
2.1 Let's begin at the beginning: Energy transfer through a vacuum.
How does it happen?
2.1.1 Stars, starshine, sunshine.
2.1.2 Matter as ensembles of musical instruments.
2.2 What happens when radiation interacts with matter?
2.2.1 Rocks in Space.
2.2.2 White rocks in Space.
2.2.3 Black rocks in Space.
2.2.4 Resonance and coupling
2.3 Looking closely at molecules.
2.3.1 Radiation energy interacting with ("hitting") a molecule.
2.3.2 Different kinds of molecules: O2, N2, CO2, CH4.
2.3.3 Molecules as musical strings.
2.3.4 Coupled vibrations.
2.3.5 What happens to sound if you open the front doors in a
big music hall?
2.4 Other kinds of energy transfer
2.4.1 Conduction: For instance, solid Earth
2.4.2 Convection: For instance, ocean currents
2.5 Why don't rocks in space melt?
2.6 An application of the Law of Conservation of Energy: Balance of energy flows
2.7 Observations of greenhouse gases and Earthlight.
2.8 HOMEWORK:
2.8.1 What do you think the average temperature of Earth's
surface would be if the atmosphere were all Oxygen and Nitrogen
without trace gases or water?
2.8.2 What would happen if water were present? Physical
implications?
2.8.3 Suppose there were no oceans and, somehow, water was
tied up in the ground and did not flow. Apart from
desertification, what do you think the climate of Earth
would be like? I don't expect a definitive answer to this:
Just think on what you've learned in this section, and
reason through what might be the effects. That said,
by end of course you should be able to give a definitive
answer.
2.8.4 It's been proposed that JATROPHA CURCAS (see Wikipedia)
be planted in arid regions because it does well there,
does not require care, and produces an oil which might be
usable as biofuel. Given what you've learned in this
segment, what might happen to climate if the deserts of
the U.S. Southwest and Saudi Arabia were completely
planted with JATROPHA?
3. A Simple Climate Model.
3.1 Why models?
3.2 We've already seen a simple climate model: The bare rock.
3.2.1 Models as analogies.
3.2.2 Models as verifiable stories.
3.3 Layer models of atmosphere.
3.3.1 Single layer, and energy balances.
3.3.2 Suppose there are two layers?
3.3.3 How about 4 layers? 8 layers?
3.3.4 Towards continuity
3.3.4.1 We'll eventually see why the atmosphere
doesn't fall out of the sky two sections
from now.
3.3.5 Cross-sections, mean free paths, and how far a
photon can travel.
3.4 Preparatory aside: On the variety of greenhouse gases.
3.5 HOMEWORK:
3.5.1 Maps as models. Is a map the real world? Can maps
be useful? What might make a particular map more or
less useful for a particular application or problem
than others?
3.5.2 Planets as models. Mars' atmosphere is thin, even
though the proportion of Carbon Dioxide it has by volume
is higher. Venus' atmosphere is thick and the atmosphere
is almost entirely Carbon Dioxide. If they
were placed at the Earth's distance from the Sun, if
their atmospheres were transparent (*), and they were
initialized at the same temperature, how would their
temperatures change over time?
[(*) Verbal clarification during lecture.]
3.5.3 How do you think the physical laws of Blackbody
Radiation were discovered? They were codified by the
same Gustav Kirchhoff who gave us the laws of electric
circuits which are named for him.
Check out an original:
https://archive.org/details/elementarytreati00stewuoft/page/230
4. The Steady Atmosphere and the Historical Role of Natural
Greenhouse Gases.
4.1 Where CO2 comes from and where it goes: A dead simple
Carbon Cycle.
4.1.1 Respiration, in the most general sense, including
decay and plants.
4.1.1.1 CH4 + 2O2 --> CO2 + 2H2O
4.1.2 Volcanos and seeps.
4.1.3 Important to understand these reservoirs and time scales
because otherwise accounting gets done wrong.
4.2 Carbon Cycle balances and equilibration.
4.2.1 Sources of CO2.
4.2.2 Temporary sinks of CO2.
4.2.2.1 Water at surface, oceans to 1000 meters
4.2.2.2 Forests
4.2.2.3 People and their stuff
4.2.3 Long term sinks of CO2.
4.2.3.1 Forests (maybe)
4.2.3.2 Deep oceans
4.2.3.3 Calcium Carbonate in shells
4.2.3.4 Subducted tectonic plates
4.3 Time scales
4.4 Before people.
4.4.1 Ice ages.
4.4.2 Causation doesn't work well as an explanatory device
for many coupled systems. It's not a sufficiently
POWERFUL IDEA.
4.5 The occasional cosmic accident.
4.6 The occasional geologic disruption.
4.7 But weathering of rocks by tectonics is a big driver. As
is the occasional redirection of major ocean flows.
4.7.1 [Aside]
4.8 There's a lot we don't know: How did life develop in a
world lit by a dim young Sun?
4.9 HOMEWORK: ... (to be provided) ...
5. Perturbations of a Steady Atmosphere.
5.1 Earth's temperature rises in proportion to the number
of CO2 doublings.  In other words, temperature is
proportional to log(CO2 concentration).
5.1.1 Band saturation, pressure broadening, and pro-rata
effects of warming.
5.1.2 Why CH4 is more potent a GHG than CO2, as long
as it is stable.
5.2 The atmospheric lifetime of CO2 (Archer; Solomon)
5.3 Some implications and what people seem to get wrong a lot
5.3.1 Policy implications
5.3.2 "Energy intensity" is a meaningless measure for
environmental policy
5.3.3 Presentation of why, at this point, the need for
some kind of CLIMATE REPAIR seems inevitable
5.4 HOMEWORK: (Handout of problem data)  Try to calculate for
yourself the cost of reducing atmospheric CO2 by 100 ppm.
6. Structuring of the Atmosphere, Lapse Rate, and Energy
Transfers by CO2 and Water.
6.1 Energy transfers among CO2 and other atmospheric species.
6.2 What is the lapse rate?
6.3 Lapse rate and the greenhouse effect.
6.4 Surface and atmospheric water.
6.4.1 Water as a greenhouse gas.
6.4.2 Water as a heat transfer pump.
6.4.3 Clausius-Clapeyron.
6.5 HOMEWORK: Consider having a warmer atmosphere and more
water vapor aloft as a result of climate change. What
might you think are some of the implications for
weather?
7. Atmosphere, Oceans, and Land; Weather and Climate;
Slow Response
7.1 General behavior of fluids on a spinning Earth (or
any other spinning planet)
7.2 The Oceans.
7.2.1 Why oceans flow as currents
7.2.2 Circulation time
.
.
.
8. Ice sheets.
.
.
.
9. The Idea of a Feedback; Examples on Earth, Such as
Albedo and Otherwise.
9.1 Remember white rocks and black rocks?
.
.
.
10. Kinds of Carbon; Kinds of Oxygen; the Carbon Cycle.
10.1 Evidence for human tampering.
10.2 Carbon isotopes.
10.3 Oxygen isotopes
10.4 What plants do, and why.
10.5 What shellfish do and why.
10.6 Shellfish and tectonic cycles.
10.7 Carbon-14.
10.8 Fossil Carbon.
10.9 The Keeling Curves.
10.10 HOMEWORK: ...(to be provided)...
11. Perturbed Carbon Cycle, and our CO2 Legacy.
11.1 An aside about the Keeling Curve for CO2.
11.2 Long choices and our CO2 legacy
.
.
.
12. Options for Avoiding Further Impacts: Mitigation
and its Costs.
.
.
.
13. How Bad Can Things Get? How Fast? Some Reasons
for Optimism.
.
.
.
14. Choices and Options if Things All Go Wrong.
.
.
.
15. Personal Choices versus Collective Action.
.
.
.
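Two of the outline’s quantitative punchlines can be computed directly: the “bare rock” energy balance of Sections 2 and 3 gives Earth’s temperature without a greenhouse, and the logarithmic forcing law of Section 5 says each CO2 doubling adds the same increment of warming. Both are standard textbook computations; the climate sensitivity parameter below is an illustrative round number, not a settled figure:

```python
import math

# (1) Bare-rock equilibrium temperature (Sections 2-3):
# absorbed solar flux = emitted blackbody flux, i.e.
#   S * (1 - albedo) / 4 = sigma * T**4
SIGMA = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0                # solar constant at Earth, W m^-2 (approximate)
ALBEDO = 0.30             # planetary albedo (approximate)

T_bare = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(round(T_bare, 1))   # ~254.6 K, about -18.5 C; the observed mean surface
                          # temperature is ~288 K -- the gap is the greenhouse effect

# In the n-layer greenhouse model the surface warms as (n + 1)**(1/4):
for n in (1, 2):
    print(n, round(T_bare * (n + 1) ** 0.25, 1))   # roughly 303 K, then 335 K

# (2) Logarithmic CO2 forcing (Section 5), Myhre et al. (1998) approximation:
def co2_forcing(c_ppm, c0_ppm=278.0):
    """Radiative forcing (W m^-2) of CO2 at c_ppm relative to baseline c0_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

LAMBDA = 0.8   # illustrative sensitivity parameter, K per W m^-2, chosen
               # for illustration only
for c in (278, 415, 556):   # pre-industrial, ~2019, doubled CO2
    dF = co2_forcing(c)
    print(c, round(dF, 2), round(LAMBDA * dF, 2))
# Doubling CO2 gives ~3.7 W m^-2, i.e. about 3 K at this illustrative sensitivity.
```

Note how the forcing from 278 to 415 ppm is already more than half of a full doubling’s worth: that is the logarithm at work.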


While the course will be based upon Professor Archer’s book and course, it will be less quantitative, less technical, and will touch more upon policy than his science course. However, there will be homework assigned, and I will comment upon these, even if saying I’ll “grade them” is a bit strong.

Sessions are anticipated to be an hour apiece, with 20 minutes or so for discussion and questions thereafter.  All will be done on Zoom.us. Details on that will accompany an announcement. There is already a Zoom room for general discussions, although holding a meeting requires my participation.

Number of sessions per week and duration will depend upon the class and levels of interest regarding various parts. I estimate there will be at least 10 sessions, and at most 17. I am planning to run the class if I get at least two people interested. 10-17 sessions may seem like a lot, but attending all isn’t necessary. I would recommend at least attending

• Section 1, “Overview and Why”.
• Section 2, “Heat, Light, Energy, Blackbody Radiation, and Atmospheric Transfers”.
• Section 5, “Perturbations of a Steady Atmosphere”.
• Section 8, “Ice Sheets”.
• Section 12, “Options for Avoiding Further Impacts: Mitigation and its Costs”.

and then electing based upon interest. I will hold a session even if no one shows up, because I’ll record it and make it available for viewing later. Of course, students won’t get the benefit of interaction, and there are only so many questions I can answer by email or at the start of the next session. On the other hand, particularly if the class size is small, I hope we can be flexible about when each session is held. It’s not as if it must be, say, Wednesday evenings at 7:00 p.m. each week. We can work out times together, much as committee meeting times are negotiated.

Someone suggested a more compact course format, but I want to avoid that.

First, there are a lot of compact or “crash courses” on climate out there and, if that’s done, ultimately the student needs to take something on authority. I want to avoid that. I want students to have a deep enough understanding that they can see why certain recommendations by the IPCC or U.S. NCA or deep policy people are made.

For example, if you understand the material of Section 2 (“Heat, Light, Energy, Blackbody Radiation, and Atmospheric Transfers”) and Section 4 (“The Steady Atmosphere and the Historical Role of Natural Greenhouse Gases”), I hope you’ll understand why there’s beginning to be some talk of “Climate Repair”. As Section 12 (“Options for Avoiding Further Impacts: Mitigation and its Costs”) will explain, if greenhouse gas emissions are zeroed, deterioration in climate conditions will be arrested, but conditions won’t get better for hundreds of years. (Even this is a bit oversimplified: because of climate inertia, the climate’s response lags emissions, typically by a decade or two.)

Second, the reaction I sometimes get from the “crash course” approach is that students are overwhelmed. That’s the last thing I want to do. I want to go slow enough so people can grok the material.

Finally, as mentioned, I very much intend to do this again — this is not a one-off run — and hope that if someone is interested they’ll tune in sometime.

## Handel, 2018, “As the seas rise, can we restore our coastal habitats?”

Professor Steven Handel presents:

Hint, hint: A subtle plug for allowing evolutionary dominance to advance, including permitting hardy invasive species to Do Their Thing.

Indeed, it is my opinion that the supposed plague of “invasive species” and the associated regulations are yet another manipulation by herbicide-making companies to enshrine a market for their products in legislation and so-called “sustainable practice”.

By the way, Kill Your Lawn.

## David Snowball, at Mutual Fund Observer: Part 1. Part 2. Thoughts from BlackRock.

### “I amar prestar aen, the world is changed. Han mathon ne nen, I feel it in the water. Han mathon ne chae, I feel it in the Earth. A han noston ned gwilith, I smell it in the air. Much that once was is lost, for none now live who remember it.”

Prologue, from “The Fellowship of the Ring”, The Lord of the Rings, J. R. R. Tolkien; film adaptation by P. Jackson

## plastic-hating environmentalists as pawns and collaborative distractors of the Trump administration

Andrew Wheeler, 45’s head of the Environmental Protection Agency, former coal industry attorney and legal advisor to Senator Inhofe, the famed climate denier of the U.S. Senate, has stated it quite simply:

### Clean drinking water is a higher priority for the Trump administration than climate change, according to Andrew Wheeler, the top US environment regulator, who called for scientific debate about the models used to assess global warming. In an interview with the Financial Times, Mr Wheeler expressed concern that the focus on climate change and the need to limit warming were detrimental to other big environmental challenges, such as potable water and affordable electricity. “We cannot lose sight of the other environmental issues facing the world,” said Mr Wheeler, confirmed in March as head of the Environmental Protection Agency. “Water issues are the number one environment crisis.” . . . Ahead of a meeting of G7 environment ministers this weekend in the French city of Metz, Mr Wheeler called for more focus on issues such as clean drinking water, access to electricity in the developing world, and the proliferation of plastic in the oceans. “I’m afraid that internationally, when people think about environmental issues, they only focus on climate change. They do not look at the other issues,” he said. Polls show climate change ranks relatively low on US voters’ list of concerns. Forty-four per cent of Americans thought climate change was a political priority, according to a Pew survey conducted this year, while 56 per cent said the environment as a whole was a priority.

##### [Emphasis added. Quote is from an article in the Financial Times. (Probable paywall.)]

So there you have it: The administration is using plastics-in-oceans to distract people from the absolute, hair-on-fire preeminent environmental issue of our time.

Okay, plastic bag ban people, how does it feel to work for Wheeler and 45‘s administration? You’ve been tricked by following the bright shiny emotionally satisfying thing.

## What’s good for each subgroup can be bad for the group: Simpson’s Paradox

There’s actually nothing odd about this. While interpretation depends upon the semantics of the individual measurements, it should be expected that, at times, a policy which improves things for the group overall will leave some subgroups less well off. Conversely, in some circumstances, if policy insists every subgroup be made better off, the result can be that the group overall is worse off.

The obvious case is vaccination. For some subgroups, the risk from vaccination itself is higher than the risk they would face from disease merely circulating in the surrounding population. However, that disease risk increases if there is substantial abstinence from vaccination in some subgroups.
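A concrete numeric instance makes this vivid. The numbers below are the classic kidney-stone treatment data (Charig et al., 1986), the textbook example of Simpson’s paradox: treatment A outperforms B within each subgroup, yet B outperforms A in the pooled totals:

```python
from fractions import Fraction

# Classic kidney-stone data: (successes, cases) for two treatments,
# split by stone size (Charig et al., 1986).
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, cases):
    return Fraction(successes, cases)

# Within EACH subgroup, treatment A beats treatment B ...
for arms in data.values():
    assert rate(*arms["A"]) > rate(*arms["B"])

# ... yet pooled over both subgroups, B beats A, because A was
# assigned disproportionately to the harder (large-stone) cases.
totals = {t: (sum(d[t][0] for d in data.values()),
              sum(d[t][1] for d in data.values())) for t in ("A", "B")}
print(totals)   # {'A': (273, 350), 'B': (289, 350)}
assert rate(*totals["B"]) > rate(*totals["A"])
```

The reversal is driven entirely by how cases were distributed across the subgroups, which is exactly why policy judged only at the subgroup level, or only at the group level, can mislead.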