667-per-cm.net: The Podcast. Episode 1.

Commencing today, I’m offering another channel of this blog, a podcast.

This will range over the interface between people, their behavior, and the natural world. It’s primarily an opportunity for a less structured and more personal presentation of my experience of the world.

No doubt I’ll get better with the audio technology and content as I go on. It certainly isn’t starting out at the standard of any commercial podcast. I like the idea of recording while I am outdoors, and that poses special challenges.

Anyway, here it is.

And, yeah, maybe someday I’ll pick a better name for this thing.

And next time, I’m using a better microphone.

Posted in biology, earth, global warming, Nature, podcast, science | Leave a comment

Prof Nic Lewis, Reason, and a claimed criticism of Resplandy, et al

Updated 2018-11-14: See at bottom

Professor Nic Lewis has criticised the Resplandy, Keeling, et al report in Nature which I previously mentioned. A summary of his criticism appears in the somewhat libertarian ezine Reason. I have responded there, but their commenting policy limits a thorough response. Not all things can be answered in less than 150 or for that matter 2000 characters. Accordingly I have posted the response in full here, below the horizontal line.

I apologize to the readership for the poor formatting, such as the lack of \LaTeX formatting, which Reason, as ostentatious as its name sounds, is incapable of supporting in its comments. I didn’t feel it worth revising these here, even if WordPress is perfectly capable of doing that.


I preface by saying I’ve not read the preceding comments, so I apologize if someone has already said what I’m going to say here. I have, of course, read the article above, which claims to represent Professor Lewis’ critique of Resplandy, et al (2018) fairly. I have had a quick read of the critique, although I have not, for reasons that will become evident, invested the time to reproduce the calculations. And I have had a careful read of Resplandy, Keeling, et al (2018), the research paper in NATURE which is the subject of Professor Lewis’ critique.

In particular, being a quantitative engineer practiced in stochastic methods, in addition to the new use of atmospheric chemistry in the Resplandy, et al paper, I was also interested in the Delta-APO-observed uncertainty analysis described in their Methods section where, as is reported, they generated a million time series “with noise scaled to the random and systematic errors of APO data detailed in Extended Data Table 3”. Later, in the calculation Professor Lewis is apparently criticizing, Resplandy, et al report they computed the Delta-APO-climate trend using the standard deviation of these million realizations, arriving at the 1.16 ± 0.15 per meg per year value Professor Lewis so objects to. I can’t really tell from his mental-arithmetic report and his least-squares trend report whether or not he reproduced the million-realization calculation, but, as that is a major feature of the calculation, I rather doubt it. That’s because there are so many ways such a simulation could be set up, details which deserve reporting but are missing from his criticism. So either he did not calculate the result in the same way, or, if he did, he is not sharing the details in sufficient depth for us or Resplandy, et al to tell whether or not he did it the same way.
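To make concrete the shape of the procedure being described, here is a minimal sketch in R with entirely made-up numbers: perturb an annual series with noise, fit a trend to each realization, and take the standard deviation of the fitted trends. It illustrates the style of the calculation only; it is neither Resplandy, et al’s actual error model nor Professor Lewis’ calculation.

# A minimal sketch, not the paper's error model: noise-perturbed realizations,
# a fitted trend per realization, and the spread of those trends.
set.seed(1)
years <- 1991:2016
apo <- -1.16 * (years - years[1])                 # a notional underlying trend, per meg
nReal <- 10000                                    # 10^6 in the paper; fewer here so the sketch runs quickly
trends <- replicate(nReal,
  {
    noisy <- apo + rnorm(length(years), sd = 1)   # stand-in for random and systematic errors
    coef(lm(noisy ~ years))[2]
  })
c(mean = mean(trends), sd = sd(trends))           # central trend estimate and its spread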

Given that this is the origin of Professor Lewis’ critique, and given the rather casual complaint about “anthropogenic aerosol deposition”, which is more present in the above (mis?)characterization of Lewis than in the original (it only appears in footnote 8, and as explanation, not criticism), the rest of Lewis’ pile-on founders if this part is done wrong.

That’s the substance.

But what is really problematic is that Lewis’ critique is improper science. The way this gets done in peer review and in NATURE or SCIENCE or any other journals, including JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION or JOURNAL OF THE ROYAL STATISTICAL SOCIETY, with which I assume Professor Lewis is familiar, is that a letter is sent to the editors, with full technical details, almost akin to a research paper. Generally, the original authors and the critic, in that setting, are in contact, and they agree to write a joint response, resolving the objection with more detail, or the critic presents in detail — far more than Professor Lewis did in his one-off PDF — why they believe the original to be mistaken, and then the original authors get to respond.

This is why I don’t really take Professor Lewis’ criticism seriously. He hasn’t allowed the assembled, including NATURE’s technical audience, to fully criticize his own criticism, because he fails to document essential details. He is relying solely on his authority as a “statistician”.

In fact, there are other instances where Professor Lewis’ authority is circumscribed. For example, in 2013, Professor Lewis published a paper in JOURNAL OF CLIMATE titled “An objective Bayesian improved approach for applying optimal fingerprint techniques to estimate climate sensitivity” (vol 26, pages 2414ff) wherein he insists upon using a noninformative prior for the calculation of interest. That is certainly a permissible choice, and there is nothing technically wrong with the conclusion thus derived. However, by using citations to justify the practice, Lewis misrepresents the position of Kass and Wasserman (1996), who squarely identify proper Bayesian practice with using proper, non-uniform priors and, moreover, identify several pitfalls with using uniform ones, pitfalls which, if Professor Lewis were faithful to his self-characterization of pursuing a Bayesian approach, he should address. He does not do so in that paper and, so, invites the question of why not. There Professor Lewis is questioning a calculation of a higher climate sensitivity from fingerprinting techniques. It appears that he’s seeking a rationale for why that might not be so. Surely invoking a device which admits uniform priors might produce such a rationale, but it is hardly good Bayesian practice.

Accordingly, I wonder — for I cannot tell given what Professor Lewis has recorded in his cited objection — whether the result of Resplandy, et al is Professor Lewis’ real problem, and whether he exploits the subtle difference between doing an on-the-face-of-it least-squares fit and doing one based upon a million-fold stochastic simulation, a difference which the readers of REASON, for example, as erudite as they are, might not catch.

In my technical opinion, until Professor Lewis does the full work of a full scientific or statistical criticism, his opinion is not worth much and Resplandy, et al, have every right to ignore him.


Dr Ralph Keeling describes the smudge in the original study, and credits Prof Lewis for setting them on the right track. The details are included in a snap from the RealClimate summary below:

The revision is being submitted to Nature. Apparently, the problem is that the errors in the ensemble realization were correlated, and they did not account for this. I’ll reserve judgment until I see their corrected contribution.

One thing I’d say, however, is that if the ensemble was generated using something like a bootstrap, there’s no reason for the resulting errors to be correlated. I can’t say until I see the actual details. But, if I am correct, they could use a Politis-Romano stationary bootstrap instead, and this would have taken care of that. Note, in addition, the remark by Nordstrom.
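For concreteness, here is a minimal sketch of the kind of thing I mean, assuming the boot package: its tsboot function with sim="geom" implements the Politis-Romano stationary bootstrap, resampling blocks of geometrically distributed length and so preserving serial correlation. The series and statistic below are toy stand-ins, not the APO data.

# A minimal sketch, assuming the boot package; toy data, not the APO series.
library(boot)

set.seed(3)
y <- as.numeric(arima.sim(model = list(ar = 0.6), n = 200))  # an autocorrelated toy series

trendStat <- function(z) coef(lm(z ~ seq_along(z)))[2]        # trend per time step

bs <- tsboot(y, statistic = trendStat, R = 2000, l = 20, sim = "geom")
sd(bs$t)   # a trend standard error which respects the autocorrelation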

Posted in Anthropocene, climate, climate change, global warming, luckwarmers | 3 Comments

Watch!

There’s a slew of bad news which has hit the scientific journals, the most notable being

L. Resplandy, R. F. Keeling, Y. Eddebbar, M. K. Brooks, R. Wang, L. Bopp, M. C. Long, J. P. Dunne, W. Koeve, A. Oschlies, “Quantification of ocean heat uptake from changes in atmospheric O2 and CO2 composition“, Nature, 2018, 563(7729), 105–108.

Dr Jim White puts this in context. Pay attention to what he says about what the long run temperature record says about how quickly temperatures can change, in either direction.

Addendum, 2018-11-01, 23:46 EDT

LA Times coverage of the subject. I like the quote,

Still, the system’s large number of direct measurements means any individual errors are averaged out, said Pelle Robbins, a researcher with the Massachusetts-based Woods Hole Oceanographic Institution’s department of physical oceanography, who works with the Argo program.

“The power of Argo is that we have so many instruments that we’re not reliant on any one of them,” he said. “When you average over things, you beat down the error.”
…
Robbins said the new approach is “bold,” but he still believes strongly in the accuracy of the Argo program.

“It’s an intriguing new clue,” he said, “but it’s certainly not the case that this study alone suggests that we have been systematically under-representing the oceanic warming.”

Resplandy said her discovery is not intended to replace the Argo system but rather to complement it. “In science, we want several methods to measure things, to have several methods that converge.”

(Emphasis added in bold.)

Also, from the same article, there’s the assessment:

The new report found that emissions levels in coming decades would need to be 25% lower than laid out by the IPCC to keep warming under that 2 degree cap.

Addendum, 2018-11-02, 11:19 EDT

It’s like I just can’t put this post down. The fact is that once the Resplandy, et al (2018) paper appeared, follow-up research, interviews with climate scientists, and other observations began percolating up, and we are seeing the beginning of what Dr Neil deGrasse Tyson calls an “emerging scientific truth”, that warming is not only much larger than estimated, it is accelerating. Consider this interview with Dr Lijing Cheng:

Yale Climate Connections summarized a spectrum of increasing risk, and quotes Dr White as well.

Resplandy, Keeling, et al have another article in Nature which more directly addresses the Carbon budget question. They infer that land and ocean sinks are not as large as previously estimated. This is something which has been suspected by scientists at the Global Carbon Project, but this is the first solid quantitative indication.


I should note that I, with my wife, Claire, am a strong financial supporter of Woods Hole Oceanographic Institution (WHOI), through their 1930 Society and their Fye Society.

Posted in American Association for the Advancement of Science, being carbon dioxide, carbon dioxide, children as political casualties, climate, climate change, climate disruption, global blinding, global warming, Hyper Anthropocene | 3 Comments

Our Children’s Trust

We are going to trial!

https://www.youthvgov.org/trial

Event hugely successful! 100+ people, media coverage. Photos at bottom

Update, 2018-10-19

Youth Climate Case Vs. U.S. Government Will Head to Trial in October.

Trump admin again asks Supreme Court to stop youth climate lawsuit.

Update, 2018-10-21, 2345 EDT

The Savory Tort weighs in.

Update, 2018-10-29, 19:11 EDT

Posted in Anthropocene, climate change, global warming, Hyper Anthropocene | 1 Comment

Fraunhofer ISE assessment of practicality and cost of reducing emissions by 80% in Germany by 2050

The report.

From the Summary:

Figure 1 summarizes the main results of the analysis. A future energy scenario emitting 85% less CO2 emissions than 1990 levels is compared with a reference scenario, which assumes that the German energy system operates in 2050 the same way as it does today. Results show that ii) the primary energy in the minus 85-percent scenario will drop 42 % below today’s values by 2050. iii) Assuming that no penalty is imposed on CO2 emissions and the price of fossil energy remains constant, calculations show that the cumulative total costs to maintain and operate today’s energy system will be 27% less than transforming the energy system to the targeted minus 85 percent scenario. iv) On the other hand, if the penalty for CO2 emissions increases to €100/ton by 2030 and thereafter remains constant and given that fossil fuel prices increase annually by 2 percent, then the total cumulative costs of today’s energy system (Reference) are 8% higher than the costs required for the minus 85 percent scenario up to 2050.

From the report, regarding electrical energy storage:

Electrical energy storage systems in the form of stationary and mobile (in vehicles) batteries or pumped-storage power plants are used as storage systems. Hydrogen storage systems and thermal hot water storage systems in different orders of magnitudes are considered in addition.
With respect to methane storage system, the simplified assumption is made that currently already existing storage capacities (including grid, approx. 210 TWh [9]) will also be available to the system in the future. Thus, they are not considered in the optimisation.

Pumped storage plants are not included in the optimisation. Based on current values of an installed power of approx. 6.3 GW, and storage capacity of approx. 40 GWh, [26, 27] an increase to 8.6 GW and/or 70 GWh is assumed until 2050 for the dimensions of these plants (power and electric storage capacity) (own assumptions based on [28]).

(Note in the above how little change is assumed in the amount of pumped storage.)

That’s it. Feasible. Germany.

Related U.S. references:

And, quoting from the MIT report above:

The present trend toward widespread availability and decreasing cost of distributed generation and storage results in the possibility of grid defection—that is, complete disconnection from the grid. Grid defection may be motivated by physical conditions such as the ability to install some embedded generation within a residence or business, and economic considerations such as the desire to avoid network costs. Grid defection represents an extreme form of price elasticity and must be considered—from an efficiency perspective—in tariff design and in decisions about which regulated costs are to be included in electricity tariffs.

Pumped hydro energy storage and molten salt thermal storage account for the vast majority of installed energy storage capacity to date, but these technologies are poorly suited to distributed applications (DOE 2015).

Now, I’m done. I have real work to do.

Posted in bridge to somewhere, the energy of the people, the green century, the stack of lies | Leave a comment

Numbers, feelings, and imagination

“But numbers don’t make noises. They don’t have colours. You can’t taste them or touch them. They don’t smell of anything. They don’t have feelings. They don’t make you feel. And they make for pretty boring stories.” That’s from here, and it’s well-intended, but it is also wrong.

For people appropriately trained in science, engineering, and especially maths, numbers carry imagination, even if they don’t make noises, don’t have smells, and don’t have feelings. And I strongly disagree that they don’t make you feel. They make me feel, depending upon the context.

Consider the flow of the AMOC, known locally and colloquially as the Gulf Stream. That flow is measured in a unit called a Sverdrup. A Sverdrup corresponds to a flow of one million cubic meters per second, typically of water. To give you some idea and comparison, and to provide a feeling, the combined flow into the oceans of all the rivers on Earth is about 1.2 Sverdrups.

The flow in the Gulf Stream varies with season and place, but it is between 30 and 150 Sverdrups.
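Here’s a small worked comparison, in R, using just the figures quoted above:

# A small worked comparison using the figures quoted above.
sverdrup <- 1e6                          # cubic metres per second
allRivers <- 1.2 * sverdrup              # all the world's rivers flowing into the oceans
gulfStream <- c(30, 150) * sverdrup      # low and high ends of the quoted range
gulfStream / allRivers                   # roughly 25 to 125 times all rivers combined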

Astronomy and Astrophysics probably have thousands of amazing, mind-stretching insights related to numbers. But here’s one that connects the previous oceanographic example with these: all the water on Earth.

Or how about the fact that the thermal capacity of the oceans is about a thousand times that of the atmosphere. This has implications:

That’s a figure from the American Chemical Society.

Numbers. It’s how you really know anything.

Curious? Take a math course. Take a physics course. These days, take a biology course. See what I mean.

Posted in mathematics, maths, numbers, numerics, oceanography | Leave a comment

Alex Steffen on Climate Defeatism

On 31st July 2018, Alex Steffen wrote (on Twitter) that:

Reminder that climate defeatism—arguing that we are already so screwed that there’s no real point in acting to limit climate emissions or ecological damage—is absolutely a form of denialism, and one that directly aids those profiting off continued destruction.

He quoted a 2017 tweet titled “The apocalyptic is itself a form of denialism” citing what he describes as the “most popular thing he has ever written”, an essay titled Putting the Future Back in the Room.

I agree.

I agree and need to say something because I hear, directly or not, from environmentalists and non-environmentalists alike, that some consider the problem too hard, the work done so far too small, the costs too high, the magnitude of the risk too great to contemplate doing anything about climate change, and that it is better now to prepare oneself for the end, withdrawing, so to speak, into a seemingly spiritual cocoon.

Frankly, that kind of thing gives the spiritual a bad name.

I also hear from environmental activists of old, that they are unwilling to compromise on their ethical integrity, and refuse to have anything at all to do with corporations, or the well-heeled, or with compromises on the environment, endangered species, social justice, or anything else, even if these compromises could be basis of progress. In this respect I share the opinion and distaste for progressives which Bill Maher sometimes expresses, and I have said so. Progressives are also often reluctant to do anything in cooperation with the military or military people.

(This is taken from a blog report of a survey of inland flooding from Hurricane Florence by Air WorldWide.)

This kind of closed-mindedness is a failure to realize, particularly in light of the recent report from the UNFCCC, that it is necessary to triage now. This doesn’t mean compromising on emissions, or natural gas, or other aspects which Science says we cannot have if we are going to succeed in containing this enormous problem. But it does mean looking seriously at distant hydropower, sensibly routed, and looking at nuclear power, as long as it can be built cheaply and quickly. (Nuclear power along the lines of the present power station models cannot be.) If that means bringing back the breeder reactor proposals of the Clinton-Gore era, maybe it does. (More on this from Dr James Hansen.)

As Stewart Brand argues, there’s no time left to be an environmentalist. It’s well nigh time to be an ecopragmatist, even if I don’t heartily agree with him, e.g., with his promotion of Paul Hawken’s Drawdown ideas, which disregard biological reality.

But the direction and spirit are right. Bill Nye-style engineering is what’s needed. We can do that. Those who don’t know, should learn. There are many places to go, such as Environmental Business Council of New England, or the Institute for Local Self-Reliance. There are plenty of leaders, like Michael Bloomberg, and Mark Carney, and Richard Branson, and Professor Tony Seba.

Posted in American Association for the Advancement of Science, American Solar Energy Society, Anthropocene, anti-science, attribution, being carbon dioxide, Berkeley Earth Surface Temperature project, Bill Maher, Bill Nye, Bloomberg, Bloomberg New Energy Finance, BNEF, Buckminster Fuller, Bulletin of the Atomic Scientists, climate business, climate change, climate economics, corporations, denial, engineering, global warming, greenhouse gases, Hyper Anthropocene, investing, investment in wind and solar energy, investments, James Hansen, John Farrell, Kerry Emanuel, klaus lackner, liberal climate deniers, Mark Jacobson, Massachusetts Clean Energy Center, Mathematics and Climate Research Network, Michael Bloomberg, reason, reasonableness, science denier, secularism, Stewart Brand, the green century, the right to know, the tragedy of our present civilization, Tony Seba, tragedy of the horizon, unreason, zero carbon | 6 Comments

No, Senator Marco Rubio and Larry Kudlow, we know how much humans contribute to climate change, at least precisely enough for Congress and an administration

On 14th October 2018, the Washington Post, quoting Senator Marco Rubio and White House economic advisor Larry Kudlow, reported that they each claimed the recent UN report was an `overestimate`:

“I think they overestimate,” Kudlow said of the U.N. report, which found that policy changes must proceed at an unprecedented pace in the next 12 years to stop temperatures from rising more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) above Earth’s preindustrial temperature.

“I’m not denying any climate-change issues,” Kudlow said on ABC’s “This Week.” “I’m just saying, do we know precisely . . . things like how much of it is manmade, how much of it is solar, how much of it is oceanic, how much of it is rain forest and other issues?”

And:

Rubio (R-Fla.), speaking on CNN about the effects of Hurricane Michael, said that sea levels and ocean temperatures have risen in a “measurable” way and that humans have played some role. But he questioned how big that role is.

“I think many scientists would debate the percentage of what is attributable to man versus normal fluctuations,” Rubio said on “State of the Union.”

Well, actually, it is known, surely with some uncertainty, but the contribution of human emissions and activity is three times that of the nearest competitor. Moreover, in the case of volcanic activity, solar irradiance, and ENSO, their forcings are not consistent, the volcanic one being intermittent and the solar one being cyclical. Moreover, volcanic forcings overwhelmingly cool Earth, not heat it, and ENSO can do the same. As is the case for most environmental problems, these kinds of considerations demand that assessments be quantitative, not merely qualitative, and not ascribe the accuracy of a number to the supposed reputation of the person or group stating it.

Here are the comparisons from Judith Lean and David Rind in 2008 (“How natural and anthropogenic influences alter global and regional surface temperatures: 1889 to 2006”, Geophysical Research Letters, 35, L18701, doi:10.1029/2008GL034864):

(Click on image to see a larger figure and use browser Back Button to return to blog.)

That is, in fact, the go-to source used by, for instance NASA in its site concerning the matter (Riebeek and Simmon, 2010). The NASA page also describes methods used, and provides explanation.

Even at the most extreme positive forcings, human activity is about 1.8 times the combined effects, but that can only happen when solar and ENSO forcings align, which is uncommon. That margin is well beyond measurement uncertainty.

So, sure, these numbers are not known perfectly. But economic and governing policies are seldom based upon perfect or even complete information. This science is far more complete in knowledge than most. And these bounds on uncertainty ought to be more than enough for people to set policy.

And, frankly, the arguments from Rubio and Kudlow are sick examples of the rhetorical fallacies known as Argumentum ad Ignorantiam (Argument from Ignorance) and the Continuum Fallacy (Line Drawing, Bald Man Fallacy).

It is at least disingenuous for Rubio and Kudlow to make these claims. And, given their backgrounds and training, it is more likely to be simply untruthful.

(Click on image to see a larger figure and use browser Back Button to return to blog.)

As a statistician, I see another aspect to these opinions which is being wholly neglected. Generally speaking, in order to make a rational decision about anything quantitative and complicated, the losses on either side of the question need to be considered. Indeed, the recent Nobel laureate in Economics, William Nordhaus, won his Nobel precisely for working to identify these costs and losses.
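To illustrate the decision-theoretic point only, here is a toy expected-loss comparison in R, with made-up numbers; the probabilities and relative losses are assumptions for the sake of the sketch, not estimates of anything.

# A toy expected-loss comparison; every number here is an assumption for illustration.
pHuman <- c(0.7, 0.9, 0.99)        # assumed probabilities that warming is largely human-caused
lossActNeedlessly <- 1             # assumed relative loss if we mitigate and it wasn't needed
lossFailToAct <- 20                # assumed relative loss if we don't mitigate and it was needed

expectedLoss <- rbind(act       = (1 - pHuman) * lossActNeedlessly,
                      doNothing = pHuman * lossFailToAct)
colnames(expectedLoss) <- paste0("p=", pHuman)
expectedLoss                       # acting has the lower expected loss across all three assumptions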

Given that specific, large losses are being realized, losses which are unprecedented in United States (and world!) history, even controlling for additional development, it would be open-minded and fair for a Rubio or a Kudlow to consider that their assessments of the benefits of inaction might be incorrect. And what this means is that the arguments, methods, and processes which led to the recent UN report are demonstrating predictive strength, including attribution of cause.

A quick review of history shows Rubio and Kudlow are typical members of a pack of voices who, until recently, denied anything unusual was happening at all.

Posted in American Association for the Advancement of Science, American Meteorological Association, American Statistical Association, anomaly detection, anti-intellectualism, anti-science, being carbon dioxide, Berkeley Earth Surface Temperature project, bollocks, carbon dioxide, changepoint detection, children as political casualties, climate change, climate data, evidence, global warming, Humans have a lot to answer for, Hyper Anthropocene, Juliana v United States, leaving fossil fuels in the ground, physics, radiative forcing, science, science denier, tragedy of the horizon, UNFCCC, unreason | Leave a comment

`significance testing`

Image | Posted on by | Leave a comment

Sampling: Rejection, Reservoir, and Slice

An article by Suilou Huang for catastrophe modeler AIR-WorldWide of Boston about rejection sampling in CAT modeling got me thinking about pulling together some notes about sampling algorithms of various kinds. There are, of course, books written about this subject, including Yves Tillé’s Sampling Algorithms, 2006, Springer, which covers reservoir sampling in its section 4.4.5. Tillé does not cover rejection sampling or slice sampling, so I thought I’d put some notes and figures here, not so much to teach as for reference. And, to some degree, for balance.

Note the Wikipedia article regarding slice sampling is, at least as of this date and in my opinion, not up to Wikipedia‘s usual quality standards for statistical articles.

This is in large measure a technical blog, although I have a lot of material about climate activism and promotion of solar energy. Sure, I very much care about these things, but I’m at heart a quantitative problem solver, with a fondness for engineering quantitative software, and, so, this blog doesn’t reflect me fully if I leave it at activism.

The theoretical underpinnings of many of these algorithms are recounted in C. P. Robert and G. Casella’s textbook Monte Carlo Statistical Methods, 2nd edition, 2004. Slice sampling receives an extended and worthwhile treatment in their Chapter 8. Robert and Casella also treat algorithms like importance sampling, not touched here, in their section 3.3, and rejection sampling in their section 2.3.
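As a refresher, here is a minimal rejection sampling sketch in R, along the lines of the treatments cited: draw from a proposal g, and accept with probability f(x)/(M g(x)), where M bounds f/g. The Beta(2, 5) target and Uniform(0, 1) proposal are arbitrary illustrative choices, not anything from Huang’s article.

rejectionSample <- function(n, f, rg, dg, M)
{
  # Accept-reject sampling from target density f using proposal g,
  # with rg drawing from g, dg its density, and M an envelope constant.
  out <- numeric(n)
  i <- 1
  while (i <= n)
  {
    x <- rg(1)                            # propose
    if (runif(1) <= f(x) / (M * dg(x)))   # accept with probability f/(M g)
    {
      out[i] <- x
      i <- i + 1
    }
  }
  out
}

f <- function(x) dbeta(x, 2, 5)
M <- optimize(f, c(0, 1), maximum = TRUE)$objective   # a tight envelope constant
draws <- rejectionSample(2000, f, rg = runif, dg = dunif, M = M)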

Huang covers rejection sampling pretty well. Fewer people know about slice sampling, so the remaining references will cover that.

The idea was introduced by Professor Radford Neal of the University of Toronto in 2003 (also see the accompanying discussion) and it has undergone several innovations since. He offers R code to do it. There is an R package, MfUSampler, which offers various sampling algorithms under a supervisory framework, including slice sampling. These are intended to be used in the context of a Markov Chain Monte Carlo (MCMC) stochastic search, typically exploring a Bayesian posterior density, but the cautions of Robert and Casella regarding adaptive sampling schemes for this purpose, in their section 7.6.3, which they credit Neal for identifying, are worth a look.
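For concreteness, here is a minimal univariate slice sampler in R, following the stepping-out and shrinkage procedures of Neal (2003); it is written for illustration, not production use, and the bimodal mixture target is an arbitrary example of my own.

sliceSample <- function(logf, x0, n, w = 1, maxSteps = 50)
{
  # Univariate slice sampling with stepping out and shrinkage (Neal, 2003).
  out <- numeric(n)
  x <- x0
  for (i in seq_len(n))
  {
    logy <- logf(x) - rexp(1)            # log of the auxiliary height, u ~ Unif(0, f(x))
    # Step out to find an interval bracketing the slice
    L <- x - runif(1) * w
    R <- L + w
    j <- floor(runif(1) * maxSteps)
    k <- (maxSteps - 1) - j
    while (j > 0 && logf(L) > logy) { L <- L - w ; j <- j - 1 }
    while (k > 0 && logf(R) > logy) { R <- R + w ; k <- k - 1 }
    # Sample uniformly from the interval, shrinking it on each rejection
    repeat
    {
      x1 <- runif(1, L, R)
      if (logf(x1) >= logy) break
      if (x1 < x) L <- x1 else R <- x1
    }
    x <- x1
    out[i] <- x
  }
  out
}

logf <- function(x) log(0.4 * dnorm(x, -2, 0.6) + 0.6 * dnorm(x, 2, 0.8))
draws <- sliceSample(logf, x0 = 0, n = 5000)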

There are also at least a couple of instances of code for Python, including nice tutorials by Kristiadi, and Professor Rahul Dave’s AM207 course notes from 2017.

Update, 2018-09-29, 14:44 EDT

Additional study revealed

M. M. Tibbits, M. Haran, J. C. Liechty, ``Parallel multivariate slice sampling'', Statistics and Computing, 2011, 21(13), 415-430.

which, in turn, traces the origins of this idea back to

R. M. Neal, ``Markov chain Monte Carlo methods based on 'slicing' the density function'', Technical Report, Department of Statistics, University of Toronto, 1997.

and

P. Damien, J. Wakefield, S. Walker, ``Gibbs sampling for Bayesian non-conjugate and hierarchical models by using auxiliary variables'', Journal of the Royal Statistical Society, Series B (Statistical Methods), 1999, 61(2), 331-344.

and

A. Mira, L. Tierney, ``Efficiency and convergence properties of slice samplers'', Scandinavian Journal of Statistics, 2002, 29(1), 1-12.

There is also the doctoral dissertation of M. B. Thompson, “Slice Sampling with multivariate steps”, from the Graduate Department of Statistics, University of Toronto, 2011, under the supervision of Professor Radford Neal.

Also, note the R package MCMCpack uses slice sampling at selected points for implementations.

Update, 2018-09-30, 16:54 EDT

A more careful read of Professor Rahul’s page on slice sampling reveals the claim:

One of the methods that we’ll look at is Slice Sampling. Slice sampling was invented by Radfor Neal and John Skilling. It is a very elegant and simple way to solve the problems of MH and Gibbs sampling. As Pavlos would say, its conceptually so easy you would think his (very capable) grandma invented it …

(Emphasis added.)

First — and I’m admittedly nitpicking — Professor Neal’s first name is “Radford” not “Radfor”.

Second, and more seriously, I don’t see any evidence to justify the claim that Dr John Skilling co-invented slice sampling. Skilling did invent another kind of sampling, nested sampling, which has been useful in some applications, even though it has received serious criticism from some experts, technically addressed in

N. Chopin, C. P. Robert, ``Properties of nested sampling''. Biometrika, 2010, 97(3), 741-755.

E. Higson, W. Handley, M. Hobson, A. Lasenby, ``Sampling errors in Nested Sampling parameter estimation'', Bayesian Analysis, 2018, 13(3), 873-896.

Skilling and MacKay did supply a comment on Neal’s original paper in its Discussion, proposing to use integer arithmetic for a special version of slice sampling. Neal addresses that in his Rejoinder.

Third, the discussion surrounding the Python code Professor Rahul offers is a bit weak, particularly regarding the multimodal case. This is unfortunate, because that’s the interesting one. This suggests I’m talking myself into doing a tutorial on slice sampling in R some time down the road, remedying these problems. When I do so, I should really include finding the area of the Batman shape as previously promised.

Posted in accept-reject methods, American Statistical Association, Bayesian computational methods, catastrophe modeling, data science, diffusion processes, empirical likelihood, Gibbs Sampling, insurance, Markov Chain Monte Carlo, mathematics, Mathematics and Climate Research Network, maths, Monte Carlo Statistical Methods, multivariate statistics, numerical algorithms, numerical analysis, numerical software, numerics, percolation theory, Python 3 programming language, R statistical programming language, Radford Neal, sampling, slice sampling, spatial statistics, statistics, stochastic algorithms, stochastic search | Leave a comment

No ordinary lawsuit: Levi Draheim, UUMFE Guardian of the Future

No Ordinary Lawsuit


Federal climate change lawsuit plaintiff Levi Draheim in Washington, D.C. (Photo credit: Robin Loznak/Our Children’s Trust)

The full story, and in audio from PRI.

UU Ministry for Earth description of his award.

Levi is a plaintiff in the federal lawsuit which will go to trial on 29th October 2018 in Eugene, OR. The Complaint is available here, as are briefings.

In greater Boston, and in Massachusetts we are holding a rally and Meeting of Witness at the Moakley Federal Courthouse on the 29th of October, beginning at 11:30 a.m. sharp.

Go here if interested in finding out more and attending. Or find and attend a rally closer to you.

(Click on book cover for more details.)

Update, 2018-10-04

The latest news of the trial and progress is reported here.

More …

Posted in Anthropocene, atmosphere, being carbon dioxide, bridge to somewhere, carbon dioxide, carbon dioxide capture, children as political casualties, clear air capture of carbon dioxide, climate, climate change, climate disruption, climate justice, environment, environmental law, First Parish Needham, fossil fuels, global warming, Hyper Anthropocene, James Hansen, Juliana v United States, Levi Draheim, New England, Our Children's Trust, Unitarian Universalism, UU, UU Mass Action, UU Ministry for Earth, UU Needham, UUMFE | Leave a comment

Seven degrees, whaddya get, a century older and deeper in Carbon debt …

In an amazing report, the Trump administration forecasts a +7°F rise in global temperatures by 2100, insisting nothing can be done to prevent it from happening. In the associated report, the administration claimed that the deep cuts in emissions needed to prevent this outcome “would require substantial increases in technology innovation and adoption compared to today’s levels and would require the economy and the vehicle fleet to move away from the use of fossil fuels, which is not currently technologically feasible or economically feasible.”

Details at the Washington Post.

Quoting the pertinent section:

The emissions reductions necessary to keep global emissions within this carbon budget could not be achieved solely with drastic reductions in emissions from the U.S. passenger car and light truck vehicle fleet but would also require drastic reductions in all U.S. sectors and from the rest of the developed and developing world. In addition, achieving GHG reductions from the passenger car and light truck vehicle fleet to the same degree that emissions reductions will be needed globally to avoid using all of the carbon budget would require substantial increases in technology innovation and adoption compared to today’s levels and would require the economy and the vehicle fleet to substantially move away from the use of fossil fuels, which is not currently technologically feasible or economically practicable.


[From T. A. Carleton, S. M. Hsiang, “Social and economic impacts of climate”, Science, 9 September 2016.]

(Click on image to see a larger figure, and use browser Back Button to return to blog.)

http://www.popularyoutube.com/video/__Kt_oU9iss/How-would-a-warmer-world-affect-your-Saturday

Horatio Algeranon | April 28, 2014 at 6:54 pm |

“What A Carbonful World”
— Horatio’s version of What a Wonderful World (written by Bob Thiele and George Weiss and made famous by Louis Armstrong)

I see trees of brown,
Hockey sticks too.
I see dim gloom
for me and you.
And I think to myself,
what a carbonful world.

I see Hadley CRU,
And sea-ice flight.
The blistering day,
The hot muggy night.
And I think to myself,
What a carbonful world.

The ppm’s of carbon,
Increasing in the sky.
Are warming all the faces,
Of people who will die,
I see storms shaking hands.
Saying, “How do you do?”
They’re really saying,
“I’ll get you”.

I hear Stevies cry,
I watch them blow,
They’ll learn much less,
Than I already know.
And I think to myself,
What a carbonful world.

Yes, I think to myself,
What a carbonful world.

Oh yeah.

Music.

(Click on image to see a larger figure, and use browser Back Button to return to blog.)

Source.

Posted in adaptation, American Association for the Advancement of Science, Anthropocene, anti-intellectualism, anti-science, being carbon dioxide, bridge to nowhere, capitalism, carbon dioxide, climate change, climate disruption, climate economics, corruption, Cult of Carbon, dump Trump, global warming | Leave a comment

`If we don’t protect Nature, we cannot protect ourselves’

Harrison Ford‘s speech, at the Climate Action Summit, below:

“… Don’t forget Nature.”

“If we don’t stop the destruction of our natural world, nothing else will matter.”

“We need to include Nature in every corporate, state, and national climate goal.”

Posted in American Association for the Advancement of Science, Anthropocene, bridge to somewhere, carbon dioxide, climate, climate change, climate disruption, climate education, corporate responsibility, David Suzuki, ecology, Ecology Action, global warming, Harrison Ford, Hyper Anthropocene, science | Leave a comment

Press conference from Global Climate Action Summit, by the U.S. Climate Alliance

Executive Summary report:

FightingForOurFuture–GrowingOurEconomiesAndProtectingOurCommunitiesThroughClimateLeadership–UnitedStatesClimateAlliance2018–ExecutiveSummary

Full Report:

FightingForOurFuture–GrowingOurEconomiesAndProtectingOurCommunitiesThroughClimateLeadership–UnitedStatesClimateAlliance2018

Precis:

Climate-Alliance-FactSheet-June_2018

Posted in American Association for the Advancement of Science, Anthropocene, Arnold Schwarzennegger, Bloomberg, Bloomberg New Energy Finance, carbon dioxide, climate, climate business, climate change, climate disruption, climate economics, decentralized electric power generation, fossil fuel divestment, Hyper Anthropocene, Jerry Brown, Michael Bloomberg, mitigation, moral leadership | 2 Comments

Another reason we need to stop developing: `If the cement industry were a country, it would be the third largest emitter in the world.’

Much of the focus on reducing Carbon Dioxide emissions is upon reduction and elimination of fossil fuels. Many do not realize that reducing emissions to zero also means offsetting emissions from agriculture, and especially curbing use of cement. Cement production yields an enormous amount of CO2:

In 2015, it generated around 2.8bn tonnes of CO2, equivalent to 8% of the global total – a greater share than any country other than China or the US.

More recent numbers here.

Zeroing it either means curbing development or finding substitutes, and probably, given how things go, a combination of the two. This is one reason why I am so virulently anti-development, in addition to the ecosystem destruction and the placing of people, assets, and revenue base at risk in a serious flood plain.

These figures are from R. M. Andrew: Global CO2 emissions from cement production, Earth Syst. Sci. Data, 10, 195–217, 2018.

Apart from concerns for climate impacts, there are documented health effects from the production of cement near where limestone is mined and the cement is produced, and for incidental emissions, such as non-exhaust particulate matter from the heavy road traffic which carries it. See, for instance,

Thorpe, A., & Harrison, R. M. (2008). Sources and properties of non-exhaust particulate matter from road traffic: A review. Science of The Total Environment, 400(1-3), 270–282.

This is a matter pertinent to the quality of life in the Town of Westwood, Massachusetts, where I live, and nearby towns through which Route 109 (“High Street”) and Hartford Street pass:

In particular, there is a cement production facility in Medfield which accounts for a significant portion of heavy truck traffic:

The Town of Westwood is already exposed to unhealthy particulates from the nearby Interstate 95/Route 128 traffic which streams 24/7. This additional burden of particulates, produced by trucks passing several times per five minutes in both directions on High Street, travelling in excess of the 30 mph speed limit, poses an unnecessary risk to the people of the Town, particularly children.

It is not appropriate for this post, but, in the long term, I intend to measure this traffic, the exceedance of cement truck speeds over posted limits, and the impacts upon the care of roads, and to estimate health effects. That traffic travels faster than legal speed limits is no surprise (MIT), but in the case of cement trucks, this practice can be particularly dangerous, setting aside risks of particulate pollution.

Posted in Anthropocene, attribution, bridge to nowhere, carbon dioxide, cement production, civilization, climate, climate disruption, climate economics, development as anti-ecology, economic trade, emissions, extended producer responsibility, global warming, greenhouse gases, Humans have a lot to answer for, Hyper Anthropocene, planning, pollution, sustainability, the right to know, the tragedy of our present civilization, unreason, Westwood, zero carbon | Leave a comment

Dr Glen Peters on “Stylised pathways to ‘well below 2°C’”, and some solutions from Dr Steven Chu (but it’s late!)

Stylized pathways to “well below 2°C”

Dr Peters has also written about “Can we really limit global warming to `well below’ two degrees centigrade?” An excerpt and abstract:

Commentary: Yes, but only in a model. We have essentially emitted too much carbon dioxide already, and the most feasible pathways to stay “well below” two degrees all require removing carbon dioxide from the atmosphere at an unprecedented scale.

See the article for more details. And, note, I’ve written about how extraordinarily expensive negative emissions industry is, in two articles. Even assuming the engineering technology and practical rollout for such a huge global project are developed, something which might take decades by itself, we’re talking about multiples of Gross World Product to make an appreciable dent in atmospheric CO2 at projected prices. Dr Peters is not optimistic either.

And see the `stylized pathways’ article for how hard it is to keep emissions below some threshold — and so mean global temperature below some threshold — if there are delays in emissions reduction.


Dr Steven Chu gave a presentation recently at Amherst College on the risks, and concerning possible solutions:

(Dr Chu begins at time index 445.)

Notably, apart from reforestation and improvements in agricultural practice, Dr Chu does not address negative emissions technology as a feasible solution but he speaks to the difficulty. Reforestation and improvements in agriculture can help with 8% of CO2 emissions, specifically:

Restoring Carbon in soils has the potential to sequester 20 Gt of CO2 (~ 8% of cumulative CO2 emissions …)

Posted in American Association for the Advancement of Science, an ignorant American public, an uncaring American public, Anthropocene, atmosphere, being carbon dioxide, bridge to somewhere, carbon dioxide, carbon dioxide capture, carbon dioxide sequestration, clear air capture of carbon dioxide, climate, climate disruption, climate economics, emissions, Glen Peters, Global Carbon Project, global warming, greenhouse gases, Hyper Anthropocene, Kevin Anderson, rationality, reasonableness, risk, science, Science magazine, Stephen Chu, sustainability, The Demon Haunted World, the tragedy of our present civilization, zero carbon | Leave a comment

A quick note on modeling operational risk from count data

The blog statcompute recently featured a proposal encouraging the use of ordinal models for difficult risk regressions involving count data. This is actually a second installment of a two-part post on this problem, the first dealing with flexibility in count regression.

I was drawn to comment because of a remark in the most recent post, specifically that

This class of models require to simultaneously estimate both mean and variance functions with separate sets of parameters and often suffer from convergence difficulties in the model estimation. All four mentioned above are distributions with two parameters, on which mean and variance functions are jointly determined. Due to the complexity, these models are not even widely used in the industry.

Now, admittedly, count regression can have its issues. The traditional methods of linear regression don’t smoothly extend to the non-negative integers, even when counts are large and bounded away from zero. But in the list of proposals offered, there was a stark omission of two categories of approaches. There was also no mention of drawbacks of ordinal models, and the author’s claim that the straw man distributions offered are “not even widely used in industry” may be true, but not for the reasons that paragraph implies.

I post a brief reaction here because the blog also does not offer a facility for commenting.

First of all, as any excursion into literature and textbooks will reveal, a standard approach is to use generalized linear models (GLMs) with link functions appropriate to counts. And, in fact, the author goes there, offering a GLM version of standard Poisson regression. But dividing responses into ordinal buckets is not a prerequisite for doing that.
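For instance, here is a minimal sketch of count regression as a GLM in R, on simulated data; the variables are illustrative, not those of the post under discussion.

# A minimal sketch on simulated data; illustrative only.
set.seed(42)
n <- 500
x <- rnorm(n)
y <- rpois(n, lambda = exp(0.3 + 0.8 * x))        # Poisson counts under a log link

fit.pois <- glm(y ~ x, family = poisson(link = "log"))
summary(fit.pois)

# If overdispersion is a worry, quasi-Poisson or negative binomial GLMs are
# drop-in alternatives (the latter via MASS::glm.nb), with no bucketing required.
fit.quasi <- glm(y ~ x, family = quasipoisson(link = "log"))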

GLMs are useful to know about for many reasons, including smooth extensions to logistic regression and probit models. Moreover, such an approach is thoroughly modern, because it leaves behind the idea that there is a unique distribution for every problem, however complicated it might be, and embraces the idea that few actual “real world” problems or datasets will be accurately seen as drawn from some theoretical distribution. That is an insight from industrial practice. Understanding logistic regression and GLMs has other important benefits beyond applicability to binary and ordinal responses, including understanding new techniques like boosting and generalized additive models (GAM).

Second, the presentation completely ignores modern Bayesian computational methods. In fact, these can use Poisson regression as the core model of counts, but posing hierarchical priors on the Poisson means drawn from hyperpriors is an alternative mechanism for representing overdispersion (or underdispersion). Naturally, one needn’t restrict regression to the Poisson, so Negative Binomial or other core models can be used. There are many reasons for using Bayesian methods but, to push back from the argument of the blog post as represented by the quote above, allaying fear of having too many parameters is one of the best and most pertinent. To a Bayesian, many parameters are welcome, and each is seen as a random variable contributing to a posterior density, and, in modern approaches, linked together with a network of hyperpriors. While specialized methods are available, the key technique is Markov Chain Monte Carlo.
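As a minimal sketch, assuming the MCMCpack package is available, here is Bayesian Poisson regression fit by Markov Chain Monte Carlo on simulated data; the data and settings are illustrative only.

# A minimal sketch, assuming MCMCpack; simulated data, illustrative settings.
library(MCMCpack)

set.seed(7)
n <- 300
x <- rnorm(n)
y <- rpois(n, lambda = exp(0.2 + 0.6 * x))

posterior <- MCMCpoisson(y ~ x, burnin = 2000, mcmc = 20000)
summary(posterior)   # posterior means, standard deviations, and quantiles
plot(posterior)      # trace and density plots, for convergence checking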

There are many methods available in R for using these techniques, including the arm, MCMCpack, and MCMCglmm packages. In addition, here are some references from the literature using Bayesian methods for count regression and risk modeling:

  1. İ. Özmen, H. Demirhan, “A Bayesian approach for zero-inflated count regression models by using the Reversible Jump Markov Chain Monte Carlo Method and an application”, Communications in Statistics: Theory and Methods, 2010, 39(12), 2109-2127
  2. L. N. Kazembe, “A Bayesian two part model applied to analyze risk factors of adult mortality with application to data from Namibia”, PLoS ONE, 2013, 8(9): e73500.
  3. W. Wu, J. Stamey, D. Kahle, “A Bayesian approach to account for misclassification and overdispersion in count data”, Int J Environ Res Public Health, 2015, 12(9), 10648–10661.
  4. J.M. Pérez-Sánchez, E. Gómez-Déniz, “Simulating posterior distributions for zero-inflated automobile insurance data”, arXiv:1606.00361v1 [stat.AP], 16 Nov 2015 10:50:40 GMT.
  5. A. Johansson, “A Comparison of regression models for count data in third party automobile insurance”, Department of Mathematical Statistics, Royal Institute of Technology, Stockholm, Sweden, 2014.

(The above is a reproduction of Figure 2 from W. Wu, J. Stamey, D. Kahle, “A Bayesian approach to account for misclassification and overdispersion in count data”, cited above.)

Third, the author of that post uses the rms package, a useful and neat compendium of regression approaches and methods. The author of the companion textbook, Regression Modeling Strategies (2nd edition, 2015) by Professor Frank Harrell, Jr, cautions in it that

It is a common belief among practitioners … that the presence of non-linearity should be dealt with by chopping continuous variables into intervals. Nothing could be more disastrous.

See Section 2.4.1 (“Avoiding Categorization”) in the text. The well-justified diatribe against premature and unwarranted categorization spans three full pages.

Indeed, this caution appears in literature:

  1. D. G. Altman, P. Royston, “The cost of dichotomising continuous variables”, The BMJ, 332(7549), 2006, 1080.
  2. J. Cohen, “The cost of dichotomization”, Applied Psychological Measurement, 7, June 1983, 249-253.
  3. P. Royston, D. G. Altman, W. Sauerbrei, “Dichotomizing continuous predictors in multiple regression: a bad idea.”, Statistics in Medicine, 25(1), January 2006, 127-141.

Note that Professor Harrell considers count data to be continuous, and deals with it by transformation, if necessary by applying splines.

Posted in American Statistical Association, Bayesian, Bayesian computational methods, count data regression, dichotomising continuous variables, dynamic generalized linear models, Frank Harrell, Frequentist, Generalize Additive Models, generalized linear mixed models, generalized linear models, GLMMs, GLMs, John Kruschke, maximum likelihood, model comparison, Monte Carlo Statistical Methods, multivariate statistics, nonlinear, numerical software, numerics, premature categorization, probit regression, statistical regression, statistics | Tagged , , , | Leave a comment

Today, now, and what of the future?

From Aldo Leopold in his A Sand County Almanac:

One of the penalties of an ecological education is that one lives alone in a world of wounds. Much of the damage inflicted on land is quite invisible to laymen. An ecologist must either harden his shell and make believe that the consequences of science are none of his business, or he must be the doctor who sees the marks of death in a community that believes itself well and does not want to be told otherwise.

(Emphasis added.)

Among many other notable efforts, Aldo Leopold attempted to reconcile Ecology with economic imperatives. I still, however, cannot stomach the encouragement of hunting which Leopold’s post mortem Foundation pursues.

Posted in adaptation, agroecology, Aldo Leopold, American Association for the Advancement of Science, argoecology, being carbon dioxide, biology, Boston Ethical Society, Botany, Buckminster Fuller, Charles Darwin, climate, climate change, David Suzuki, Earle Wilson, Ecological Society of America, Ecology Action, ethics, George Sughihara, Glen Peters, global warming, Grant Foster, Humans have a lot to answer for, Hyper Anthropocene, population biology, quantitative biology, quantitative ecology, Spaceship Earth, sustainability, The Demon Haunted World, the right to be and act stupid, the right to know, the tragedy of our present civilization, tragedy of the horizon, unreason, UU Humanists | 2 Comments

Fast means, fast moments (originally devised 1984)

There are many devices available for making numerical calculations fast. Modern datasets and computational problems exploit stylized architectures, and use approaches including special algorithms for calculating just the dominant eigenvectors, or non-classical statistical mechanisms like shrinkage for estimating correlation matrices well. But sometimes it’s simpler than that.

In particular, some calculations can be preconditioned. What’s done varies with the problem, but often there is a one-time investment in creating a numerical data structure which is then used repeatedly to obtain fast performance of some important kind.

This post is a re-presentation of a technique I devised in 1984 for finding sums of contiguous submatrices of a given matrix in constant time. The same technique can be used to calculate moments of such submatrices, in order to support estimating statistical moments or moments of inertia, but I won’t address those in this post. Students of these problems, given this technique, can imagine how that might be done. If there’s an interest and a need, expressed in comments or by email, I’ll some day post the algorithm and code for doing moments.

The Algorithm

Assume there is a square matrix, \mathbf{A}, having flonums. Accordingly it has both m rows and m columns. Produce from it a transformed matrix \mathbf{A_{1,2}} in two steps, first producing a transformed matrix, \mathbf{A_{1}}, and then the transformed matrix, \mathbf{A_{1,2}}. Do this by first forming \mathbf{A_{1}} as \mathbf{A} but with its j^{\text{th}} column replaced by the cumulative sum of the j^{\text{th}} column of \mathbf{A}. That is,

\mathbf{A_{1}}_{i,j} = \sum_{k=1}^{i} \mathbf{A}_{k,j}

To form \mathbf{A_{1,2}} do something similar, except with rows:

\mathbf{A_{1,2}}_{i,j} = \sum_{k=1}^{j} \mathbf{A_{1}}_{i,k}

\mathbf{A_{1,2}} is then the preconditioned version of {A}. Specifically, if the sum of any contiguous rectangular submatrix of \mathbf{A} is sought, say, with its upper left coordinate within \mathbf{A} being (i,j) and its lower right being (k,l), then lookup the value of \mathbf{A_{1,2}}_{k,l} and subtract from it the value of \mathbf{A_{1,2}}_{i-1,l} and the value of \mathbf{A_{1,2}}_{k,j-1}, and then add back in, because of the principle of inclusion and exclusion, the value of \mathbf{A_{1,2}}_{i-1,j-1}. If any of these are off the premises of \mathbf{A_{1,2}} because i-1 or j-1 are too small, treat the corresponding subtrahends as zero.

Because only a fixed, small number of additions and subtractions is involved in the calculation of the sum of any submatrix, the algorithm does this in constant time, irrespective of the sizes of the submatrices. There are dynamic range tricks that can be applied should the sums of values in the preconditioned matrix get very large in magnitude.

R code is below. I’ll put it up in my Github repository some time.


###############################################################################################
# This section of code up to the next group of #### marks is owned and copyrighted by #
# Jan Galkowski, who has placed it in the public domain, available for any purpose by anyone. #
###############################################################################################

fastMeansFastMomentsPrecondition<- function(X)
{
  # # (From Jan Galkowski, ``Fast means, fast moments'', 1984,
  # # IBM Federal Systems Division, Owego, NY. Released into the public domain, 1994.)
  stopifnot( is.matrix(X) )
  M<- nrow(X)
  stopifnot( M == ncol(X) )
  AX1<- apply(X=X, MARGIN=2, FUN=cumsum)
  # AX2<- t(apply(X=X, MARGIN=1, FUN=cumsum))
  AX12<- t(apply(X=AX1, MARGIN=1, FUN=cumsum))
  return(AX12)
}

fastMeansFastMomentsBlock<- function(P, iUL, jUL, iLR, jLR)
{
  # # (From Jan Galkowski, ``Fast means, fast moments'', 1984,
  # # IBM Federal Systems Division, Owego, NY. Released into the public domain, 1994.)
  #
  # P is the preconditioned AX12 from above.
  #
  stopifnot( is.matrix(P) )
  M<- nrow(P)
  stopifnot( M == ncol(P) )
  stopifnot( (1 <= iUL) && (iUL <= M) )
  stopifnot( (1 <= jUL) && (jUL <= M) )
  stopifnot( (1 <= iLR) && (iLR <= M) )
  stopifnot( (1 <= jLR) && (jLR <= M) )
  #
  iUL1<- iUL-1
  jUL1<- jUL-1
  iLR1<- iLR-1
  jLR1<- jLR-1
  #
  if (0 == iUL1)
  {
    W.AL<- 0
    W.A<- 0
    if (0 == jUL1)
    {
      W.L<- 0
    } else
    {
      W.L<- P[iLR,jUL1]
    }
  } else if (0 == jUL1)
  {
    W.AL<- 0
    W.L<- 0
    if (0 == iUL1)
    {
      W.A<- 0
    } else
    {
      W.A<- P[iUL1,jLR]
    }
  } else
  {
    W.AL<- P[iUL1,jUL1]
    W.A<- P[iUL1,jLR]
    W.L<- P[iLR,jUL1]
  }
  #
  W<- P[iLR,jLR] + W.AL - W.A - W.L
  #
  return(W)
}

# Self-test FMFM ...
cat("Fast means, fast moments self-test ...\n")
Z<- matrix(round(runif(100, min=1, max=100)), 10, 10)
Z.P<- fastMeansFastMomentsPrecondition(Z)
stopifnot( sum(Z[1:4,1:5]) == fastMeansFastMomentsBlock(Z.P, 1, 1, 4, 5) )
stopifnot( sum(Z[8:10, 8:9]) == fastMeansFastMomentsBlock(Z.P, 8, 8, 10, 9) )
stopifnot( sum(Z[4:7, 3:5]) == fastMeansFastMomentsBlock(Z.P, 4, 3, 7, 5) )
rm(list=c("Z", "Z.P"))
cat("... Self-test completed.\n")

###############################################################################################
# End of public domain code. #
###############################################################################################

randomizeSeed<- function()
{
  #set.seed(31415)
  # Futz with the random seed
  E<- proc.time()["elapsed"]
  names(E)<- NULL
  rf<- E - trunc(E)
  set.seed(round(10000*rf))
  # rm(list=c("E", "rf"))
  return( sample.int(2000000, size=sample.int(2000, size=1), replace=TRUE)[1] )
}

wonkyRandom<- randomizeSeed()

So, there’s a little story of how this came to be public domain.

I used to work for IBM Federal Systems Division, in Owego, NY. I worked as both a software engineer and later, in a much more fun role, as a test engineer specializing in hardware-software which did quantitative calculations. IBM, as is common, had a strong intellectual property protection policy and framework.

Well, in 1994, Loral bought Federal Systems from IBM. As former employees, we were encouraged to join the new organization, but, naturally, were asked to sign a bunch of paperwork. To my personal surprise, there was nothing in the paperwork which had to do with IBM’s rights to intellectual property we might have originated or held. All they wanted was for us to sign onto the new division, part of Loral.

Before I signed, therefore, I approached corporate counsel and pointed out there was no restriction on intellectual property or constraint upon its disposition. They, interested in making the transition happen, said “Yes”. I went off and prepared a document listing all the material and algorithms and such which I thought I had developed on IBM time but in which IBM hadn’t expressed any interest, despite offers to put them up for patenting or whatever. It was a reasonably thick document, maybe 100 pages, and I still have a copy. I asked the attorneys to sign over the intellectual property rights in these to me, and they did. It was witnessed. I had made my coming to Loral contingent upon doing this, and they seemed happy to do it. I signed once I had that agreement in hand.

I did not know it at the time, but Loral’s interest in purchasing Federal Systems was short term: they had every intention of “flipping it”, as one might a house, to another buyer, and so they apparently didn’t care about any of this intellectual property.

But, as a result, this algorithm, for fast means and fast moments, which I had developed in 1984 while doing some image processing work, became mine. And I have always treated it as public domain, available to anyone for any purpose. And, here, with this post, I put it out there for your public use, for any purpose whatsoever, without constraints. Very open source. Commercial or otherwise.

Enjoy.

It would be nice to credit me, but you don’t have to do that.

Posted in image processing, mathematics, numerical algorithms, numerical software, numerics | 2 Comments

Buckminster Fuller: Spaceship Earth. This is the Future. And it will be, here.

Posted in an ignorant American public, an uncaring American public, Buckminster Fuller, Global Carbon Project, green tech, science, Spaceship Earth, Star Trek - The Next Generation, Techno Utopias, technology, the energy of the people, the green century, the value of financial assets, tragedy of the horizon, UU Humanists | 1 Comment

Heat has no hair (from Eli Rabett)

See Eli’s post.

Excerpt:

We can summarize the data in the figure above adding that ~40 W/m2 go directly from the surface to space as IR radiation of the 398 W/m2 leaving the surface. In and out in the table … [AT THE POST] … means into and out the surface the atmosphere and space respectively. In is taken as a positive addition to the heat content and negative a decrease …

… The important point is to realize that surface IR radiation absorbed in the atmosphere is rapidly (10 μs) thermalized and converted into random motion of the molecules in the atmosphere, just as is latent heat from condensation of water vapor and from sensible heat. Very little, less than a part per million, is directly radiated back to the surface and we can neglect that.

The 342 W/m2 of back radiation is OBSERVED, so this ain’t a model or a theory, where does it come from? It comes from ALL of the sources pushing heat into the atmosphere, from the convective and radiative heat transfer from the surface.

(Emphasis in the original.)

See the original for more detail.

Perhaps it’s just my physics training, but I never understood the primacy some (even scientists) assign to convection in matters of climate. I mean, sure, a bunch of energy can come into the convection-dominated part of the climate system, and it might reside there for a while, perhaps even a long while, but, really, that doesn’t matter. Eli’s point is that if a different bunch of the same amount doesn’t leave the climate system, it’ll warm. And it doesn’t matter how long the first bunch stays in the climate system, or what path it takes through it, or anything of the kind.

So, to me, the idea that the various oscillations, like NAO or PDO or ENSO, somehow have something substantial to do with the overall climate problem is specious. Yeah, there are big energy flows from one part of the system to another, just as there are big flows of Carbon between oceans and atmosphere, but that’s just slosh, and the point is the net balance. And human emissions of (about) 10 GtC per annum are affecting that a lot.
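To make the slosh-versus-net-balance point concrete, here is a toy two-box sketch in R, entirely my own illustration with arbitrary units and a made-up ENSO-like exchange term, not anything from Eli’s post: internal transfers move heat back and forth between the boxes, but only the net top-of-system imbalance changes the total.

# Toy two-box "slosh" illustration: two reservoirs exchange heat internally
# while a constant net imbalance trickles in.
toyTwoBox<- function(nYears=50, imbalance=0.7, sloshAmplitude=5)
{
  H1<- numeric(nYears)
  H2<- numeric(nYears)
  H1[1]<- 100
  H2[1]<- 100
  for (t in (2:nYears))
  {
    slosh<- sloshAmplitude*sin(2*pi*t/10)   # internal exchange between the boxes
    H1[t]<- H1[t-1] - slosh + imbalance/2
    H2[t]<- H2[t-1] + slosh + imbalance/2
  }
  return( data.frame(year=seq_len(nYears), box1=H1, box2=H2, total=H1+H2) )
}

B<- toyTwoBox()
range(diff(B$total))   # always equal to the imbalance: the slosh cancels in the total

However big the slosh amplitude is made, the total heat content climbs at exactly the rate set by the imbalance.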

Posted in American Association for the Advancement of Science, American Meteorological Association, an ignorant American public, an uncaring American public, atmosphere, Blackbody radiation, carbon dioxide, chemistry, climate change, climate education, Eli Rabett, global warming, physics, science | Leave a comment

Blackbody radiation and the greenhouse effect, via plates (from Eli Rabett)

See Eli’s post.

Excerpt:

Eli can keep on adding plates, Ms. Rabett has gone out to buy some extras. Here is the red plate special. If somebunny works it through they will find that b’=3/4 a, go another plate and, as Christian pointed out, now b’ has increased to 4/5 a and so on.

Eli has not said anything about how the heat is being transferred, radiation, convection or conduction but since heat transfer, no matter the mechanism, is always proportional to temperature, the temperature of the blue plate must increase as more plates are added.

See the original for more detail.
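For those who want to check the arithmetic, here is a minimal numerical sketch of the plate bookkeeping as I read Eli’s post (my own illustration, not his code): every plate is a blackbody radiating equally from both faces, and the innermost (`blue') plate absorbs a fixed flux a on its sunlit side.

plateEmissions<- function(n, a=400)
{
  # Per-face emissions e[1] (blue plate) .. e[n+1] (outermost plate) at equilibrium:
  #   blue plate:      a + e[2]        == 2*e[1]
  #   interior plates: e[i-1] + e[i+1] == 2*e[i]
  #   outermost plate: e[n]            == 2*e[n+1]
  stopifnot( n >= 1 )
  m<- n+1
  A<- matrix(0, m, m)
  rhs<- numeric(m)
  A[1,1]<- 2
  A[1,2]<- -1
  rhs[1]<- a
  if (2 < m)
  {
    for (i in (2:(m-1)))
    {
      A[i,i-1]<- -1
      A[i,i]<- 2
      A[i,i+1]<- -1
    }
  }
  A[m,m-1]<- -1
  A[m,m]<- 2
  return( solve(A, rhs) )
}

# Per-face emission of the blue plate, as a fraction of the absorbed flux:
sapply(1:3, function(n) plateEmissions(n)[1]/400)   # 0.6667, 0.7500, 0.8000

Each added plate nudges the blue plate’s per-face emission upward, from 2/3 a to 3/4 a to 4/5 a and so on, just as the excerpt says, so the blue plate, like the surface, must warm as plates are added.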

Posted in Blackbody radiation, carbon dioxide, climate change, Eli Rabett, energy, energy flux, global warming, lapse rate, physics, science | Leave a comment

The Democrats have no plan to address Climate Change (either)

Recall an article from the 15th November 2017 issue of The Atlantic:

… [T]he Democratic Party does not have a plan to address climate change. This is true at almost every level of the policy-making process: It does not have a consensus bill on the issue waiting in the wings; it does not have a shared vision for what that bill could look like; and it does not have a guiding slogan—like “Medicare for all”—to express how it wants to stop global warming.

Many people in the party know that they want to do something about climate change, but there’s no agreement about what that something may be.

This is not for lack of trying. Democrats have struggled to formulate a post-Obama climate policy because substantive political obstacles stand in their way. They have not yet identified a mechanism that will make a dent in Earth’s costly, irreversible warming while uniting the many factions of their coalition. These problems could keep the party scrambling to face the climate crisis for years to come.

This remains true. The only Democrats in the national view who keep mitigation of climate change in focus are Senator Bernie Sanders and Senator Sheldon Whitehouse. In fact, Senators Sanders and Whitehouse are the only ones with plans (this being Senator Sanders’ and this being Senator Whitehouse’s), quite contrary to the impression The Atlantic article gives. Also, the claim that “Unlike Clinton’s policies, Sanders would surely have required a Democratic Congress to enshrine his policies” is completely disingenuous. Only the most limited policies can be enacted without Congress, but that should never be a reason for failing to champion them, or for making excuses about why they can’t be done, like President Obama’s insistence that we cannot sacrifice economic growth in the pursuit of climate mitigation.

Others have observed the same.

So, I would suggest that what The Atlantic and I mean here is that the standard, vanilla-flavored Democratic Party has no idea about what to do, and it doesn’t really care. What it cares about is winning, and it will compromise on policy in order to make that happen.

This is predominantly why Claire and I are so supportive of Bob Massie as Democratic candidate for governor of Massachusetts. See his position on climate change.

It’s more tiring to say it again than it is to listen to it, but we are running out of time, and the economic costs of doing something real in time to stop enormous, recurring harm from climate change increase by the month.


We determine the point of no return (PNR) for climate change, which is the latest year to take action to reduce greenhouse gases to stay, with a certain probability, within thresholds set by the Paris Agreement. For a 67% probability and a 2K (Kelvin) threshold, the PNR is the year 2035 when the share of renewable energy rises by 2% per year. We show the impact on the PNR of the speed by which emissions are cut, the risk tolerance, climate uncertainties and the potential for negative emissions.


In short, both political parties — and especially the Democrats, since they claim to know better — are failing the United States Constitution and the people of the United States:

Preamble. We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.

(Emphasis added.)

Amendment XIV (Ratified July 9, 1868)

Section 1.
All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

(Emphasis added.)

Indeed, given this situation, as I’ve mentioned before, I really wonder whether the Constitution of the United States is up to this challenge, because it lacks any mechanism to compel such action. Of course, Congresses and Presidents disregard the Constitution anyway, notably

Article. VI.

… This Constitution, and the Laws of the United States which shall be made in Pursuance thereof; and all Treaties made, or which shall be made, under the Authority of the United States, shall be the supreme Law of the Land; and the Judges in every State shall be bound thereby, any Thing in the Constitution or Laws of any State to the Contrary notwithstanding.

(Emphasis added.)

with respect to, say, the United Nations Framework Convention on Climate Change, which remains one of the `treaties in force’ recognized by the U.S. State Department (page 515).

This is why Juliana v United States is so essential. See the compact details.

Posted in American Meteorological Association, an ignorant American public, an uncaring American public, Anthropocene, anti-intellectualism, anti-science, being carbon dioxide, bridge to nowhere, carbon dioxide, Carbon Tax, children as political casualties, climate change, climate disruption, climate economics, consumption, corporate responsibility, Cult of Carbon, destructive economic development, environment, environmental law, games of chance, global blinding, global warming, greenhouse gases, Hyper Anthropocene, Juliana v United States | Tagged | 6 Comments

“Why we need Jean-Luc Picard in 2018”

Admiral Picard is returning.

See the story, by Daniel W Drezner. On CBS All Access.

Yes, “Make it so.”

Posted in American Association for the Advancement of Science, Bloomberg New Energy Finance, Buckminster Fuller, humanism, Jean-Luc Picard, Mathematics and Climate Research Network, open source scientific software, Our Children's Trust, Patrick Stewart, Principles of Planetary Climate, reason, reasonableness, science, Spaceship Earth, Star Trek, Star Trek - The Next Generation, STNG, The Demon Haunted World, the Final Frontier, tragedy of the horizon | Leave a comment

What will happen to fossil fuel-fired electric bills everywhere, eventually, including those fired by natural gas

See Cost of Coal: Electric Bills Skyrocket in Appalachia as Region’s Economy Collapses, by James Bruggers at Inside Climate News. Excerpt:

The common denominator is American Electric Power, one of the nation’s largest utilities. It owns Kentucky Power, along with two subsidiaries in neighboring West Virginia, Wheeling Power and Appalachian Power.

In May, Wheeling Power and Appalachian Power requested permission from the Public Service Commission of West Virginia to boost their monthly residential bill 11 percent because of declining sales. That was on top of a 29 percent increase between 2014 and 2018.

Customers in both states are furious that the regulators are going along.

“Our jobs available in this state are not a living wage, and many are working two, three jobs just to make it,” wrote Elizabeth Bland of Beckley, West Virginia, in her protest letter to the commission. “Please turn down this request from Appalachian Power for the sake of all West Virginians.”

Rising rates are just part of the problem.

Kentucky Power’s monthly bill also includes surcharges, and a line for each customer’s share of the utility’s fixed costs. These add up in precious dollars.

`They’re doubling down on coal at a time when coal is not competitive,’ said James M. Van Nostrand, a professor at the West Virginia University College of Law with decades of experience in the energy field. `It’s really tragic.’

The average bill per customer at Kentucky Power has been among the highest in the nation for an investor-owned utility, according to 2016 numbers from the U.S. Energy Information Agency, the most recent comparisons available.

`We’re hit hard,’ Alice Craft, a Whitesburg-area resident, told InsideClimate News. `The power companies, they are just greedy, greedy, greedy.’

This will inevitably happen to all regions depending primarily upon fossil-fuel fired electricity, including Massachusetts, with consequences for the public, for utility shareholders, for local real estate property values, and for local business expansion. Accordingly, the actions of the Massachusetts House on recent energy legislation are incredibly myopic, to say the least, and do not support the stated goals of House leadership, especially those of Democratic House Speaker Robert DeLeo to `look out for the little guy’. His actions say he’s looking out for utility and energy companies, and the interests of AIM, whatever he says his motivations are.

But, it’s only a matter of time.

Posted in adaptation, American Solar Energy Society, Amory Lovins, an uncaring American public, Bloomberg New Energy Finance, BNEF, bridge to nowhere, bridge to somewhere, Carbon Worshipers, clean disruption, corporate responsibility, Cult of Carbon, decentralized electric power generation, decentralized energy, destructive economic development, electricity, electricity markets, energy utilities, engineering, exponential growth, fossil fuels, grid defection, Hermann Scheer, ILSR, investment in wind and solar energy, ISO-NE, John Farrell, Joseph Schumpeter, local generation, local self reliance, marginal energy sources, Massachusetts Clean Energy Center, natural gas, pipelines, public utility commissions, PUCs, rate of return regulation, rationality, reason, reasonableness, rights of the inhabitants of the Commonwealth, solar democracy, solar domination, solar energy, solar power, Sonnen community, Spaceship Earth, stranded assets, sustainability, the energy of the people, the green century, the right to be and act stupid, the right to know, the tragedy of our present civilization, the value of financial assets, tragedy of the horizon, unreason, utility company death spiral, wind energy, wind power, zero carbon | Leave a comment

Censorship of Science by the administration of President Donald Trump

See work by the Columbia Sabin Center for Climate Change Law.

… President Trump has directed EPA and DOI to reconsider regulations adopted to control greenhouse gas emissions, despite the wealth of data showing that those emissions are the key cause of climate change. Faced with this contradiction, both agencies have sought to downplay the science, including by restricting the availability of information (for example, by removing climate data from websites and deleting references to humans’ role in climate change from reports). Similar action has also been taken by a raft of other entities, with the SST indicating that at least 20 different federal bodies, including both Congress and the White House, have attempted to restrict access to scientific information or otherwise silence science …

See also reporting here on work anticipating this kind of action.

Posted in American Association for the Advancement of Science, an ignorant American public, an uncaring American public, anti-intellectualism, anti-science, Azimuth Backup Project, citizen data, Climate Science Legal Defense Fund, Donald Trump, dump Trump, Ecological Society of America, environmental law, epidemiology, global blinding, Neill deGrasse Tyson, open data, rationality, reason, reasonableness, science, secularism, The Demon Haunted World, the right to be and act stupid, the right to know, the tragedy of our present civilization, tragedy of the horizon, unreason | Leave a comment

“All of Monsanto’s problems just landed on Bayer” (by Chris Hughes at Bloomberg)

See Chris Hughes’ article.

Monsanto has touted Roundup (its active ingredient being glyphosate, more properly \textbf{\texttt{N-(phosphonomethyl)glycine}}) as a safe remedy for weed control, often in the taming of so-called “invasive species”. It’s used on playing fields where children are exposed to it, including, apparently, in my home town of Westwood, Massachusetts.

There are more than 450 court cases in progress alleging harm from the product, and a jury in one, DEWAYNE JOHNSON VS. MONSANTO COMPANY ET AL (Case Number: CGC16550128), has found Monsanto, et al liable, with a US$289 million award. Glyphosate has long been known to affect fish and amphibians, and recently physicians have grown concerned as well, particularly about its connection with cancer in humans.

(Image by Benjah-bmm27, own work, public domain.)

This has repercussions for Bayer, as Hughes explains.

But it is perhaps most foolish to think wishful environmental management justifies releasing such toxins where kids, adults, pets, and wildlife are exposed.

For more, check out Beyond the War on Invasive Species: A Permaculture Approach to Ecosystem Restoration by Orion and Holmgren, 2015.

Posted in agroecology, an uncaring American public, business, corporate responsibility, ecology, Ecology Action, environment, environmental law, epidemiology, evidence, invasive species, open data, Peter del Tredici, quantitative biology, quantitative ecology, rights of the inhabitants of the Commonwealth, risk, statistics, sustainability, sustainable landscaping, the right to know, Uncategorized, unreason, Westwood | Leave a comment

Local Energy Rules!

As John Farrell says, Keep your energy local. If you want to take back control of your democracy, a priority is taking back control of your energy supply. Centralized energy centralizes political power and influence.

Listen to more from a recent podcast:

There are now 52 podcasts about the primacy of local energy at ILSR.

Posted in adaptation, bridge to somewhere, Buckminster Fuller, clean disruption, climate economics, decentralized electric power generation, decentralized energy, demand-side solutions, efficiency, electricity markets, energy efficiency, energy utilities, feed-in tariff, force multiplier, fossil fuel divestment, grid defection, ILSR, investment in wind and solar energy, local generation, local self reliance, public utility commissions, solar democracy, solar domination, solar energy, solar power, Spaceship Earth, stranded assets, sustainability, the energy of the people, the green century, utility company death spiral, wind energy, wind power, zero carbon | Leave a comment

Erin Gallagher’s “#QAnon network visualizations”

See her most excellent blog post, a delve into true Data Science.


Hat tip to Bob Calder and J Berg.

Posted in data science, jibber jabber, networks | Leave a comment

On lamenting the state of the Internet or Web

From time to time, people complain about the state of the Internet or of the World Wide Web. They are sometimes parts of governments charged with mitigating crime, sometimes privacy advocates, sometimes local governments or retailers lamenting loss of tax revenues, sometimes social crusaders charging it with fostering personal isolation, bullying, vice, and other communal maladies.

Certain people have made pointing out the Web’s ills their principal theme. Jaron Lanier has long done so, and has written several books on the matter. Cathy O’Neil is a more recent critic, not only of the Web but of businesses which employ it and other data-collecting mechanisms to mine, imperfectly, people’s behavior, and to abuse the intrinsically imperfect pictures that result for profit.

Others have underscored the effects of what is predominantly sampling bias. The thing about that is that it should be no surprise. What is a surprise is that the companies involved don’t see and respond to this as the statistical problem it is. How representative a sample actually is of the population of interest is perhaps the key question in any statistical study. That these companies settle for samples of convenience rather than validated ones shows they are practicing very weak data methods, no matter how many people with doctorates are associated with these projects.
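A toy illustration of the convenience-sample problem, in R, with numbers entirely made up for the purpose: estimate a population mean from (a) whoever happens to be “on the platform”, where presence falls off with age, versus (b) a random sample of the same size.

set.seed(7)
age<- pmax(13, round(rnorm(100000, mean=47, sd=18)))
pOnPlatform<- plogis((35 - age)/8)            # younger people far more likely to be present
convenienceSample<- age[runif(length(age)) < pOnPlatform]
randomSample<- sample(age, length(convenienceSample))
c(population=mean(age),
  convenience=mean(convenienceSample),
  random=mean(randomSample))
# The convenience sample can be enormous and still be badly biased low;
# the random sample of the same size is not.

Size does not cure bias; only attention to how the sample relates to the population does.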

There is also the criticism from Professor Lawrence Lessig, who understood early the social and legal ramifications of how the Internet and Web are built, particularly in such incisive books as Code and Other Laws of Cyberspace, Code: Version 2.0, and Remix. In Code: Version 2.0 Lessig continued and reemphasized the warnings issued in Code and Other Laws of Cyberspace that, given the way the Internet and Web were technically structured and funded, the idea of a free market of ideas was slipping away: the network was becoming more regulable, more subject to influence by a few big groups, and more prone to the user-as-product problem with which Powazek, among others, has taken issue.

Perhaps Lessig has come closest to it, but what the audience of these critics should understand is that the shortcomings they articulate are inherently implied by the technical design of the Internet and Web as they are. The Internet and Web, at the experiential level, are constructed using the Big Ball Of Mud design anti-pattern. Accordingly, as when any ball of wet mud is subjected to sufficiently large outside forces, it deforms into arbitrary shapes. Given their present size, however, such deformation is having big social implications, whether China’s aggressive censorship, foreign influence on United States elections, probably from Russia, or sales of private user data, whether wittingly or not, by large Internet presences.

The thing of it is, there were people who thought carefully and long about how such all-connecting networks should operate and devised specific and careful design principles for them. While there were several (see history), one of the least known, particularly today, is Theodor Holm Nelson. “Ted” Nelson conceived of a non-anonymous network of producers and consumers of content whereby, through a technical device he termed transclusion (coined in Literary Machines), readers would make micropayments to read or otherwise access content produced by others, with the bulk of these going as compensation to producers and some used for managing the network apparatus. This was termed Xanadu, and Nelson and colleagues made several attempts to realize it, along with several technically related and useful ideas.
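Here is a toy, base-R sketch of the sort of bookkeeping Nelson’s transclusion and micropayment ideas imply; the names, documents, and rates are all made up, and this is an illustration of the concept, not Xanadu itself. A composite document is a list of references into other people’s documents, and each access generates a small payment routed back to the original producers.

sources<- data.frame(id=c("doc-A", "doc-B"),
                     producer=c("alice", "bob"),
                     ratePerChar=c(1e-5, 1e-5),
                     stringsAsFactors=FALSE)

composite<- data.frame(id=c("doc-A", "doc-B", "doc-A"),      # transcluded spans
                       from=c(1, 10, 200),
                       to=c(120, 58, 260),
                       stringsAsFactors=FALSE)

transclusionCharges<- function(composite, sources)
{
  # Tally what a single reading of the composite owes each producer.
  j<- match(composite$id, sources$id)
  chars<- composite$to - composite$from + 1
  return( aggregate(list(owed=chars*sources$ratePerChar[j]),
                    by=list(producer=sources$producer[j]), FUN=sum) )
}

transclusionCharges(composite, sources)
#   producer    owed
#      alice 0.00181
#        bob 0.00049

Nothing in this toy is hard; what is hard is making the identities, the references, and the payments trustworthy at network scale, which is the next point.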

This is a difficult problem. Such a structure, if it is not to be defeated or subjugated, needs these mechanisms built into its technical fabric, along with strong authentication (public key cryptography?) to both prevent theft and identify the parties sending and accepting payments and content. The Internet and Web grew up, and continue to grow, in a combination of deliberate, careful crafting and haphazard, business-driven choices. Just study how the companies operating their innards are paid, and how it got that way. Imposing a rigorous design would have made growth expensive, slow, and difficult, demanding a large base of producers and consumers before there was much of anything to read. Accordingly, Xanadu not only didn’t happen, it couldn’t happen.

However, look where the Internet and Web are now. Spam, malicious attacks, election interference, theft of credit card information, identity theft, viruses, cryptocurrency-driven consumption of excess electrical energy, tracking of individuals by their phones, and targeted advertising are some of the places we’ve gone.

What’s intriguing to me is the possibility that Ted Nelson was right all along, and the kind of careful design he had in mind for Xanadu may one day become necessary if the Internet and Web are to survive, and not just splinter into a hundred subsidiary networks each controlled by the biggest Local Thug, whether that is a government or telecommunications giant. Nelson himself believes we can still learn many things from Xanadu.

So, in many senses, the Internet and Web did not have to be the way they are. There were other, better ideas. In fact, considering that, and considering what we’re doing to Earth’s climate through our unmitigated worship of Carbon and growth, if humanity ever needs an epitaph, I think it ought to be:

We did a lot of things, but unfortunately we didn’t think them through.

Posted in American Association for the Advancement of Science, an ignorant American public, an uncaring American public, Anthropocene, being carbon dioxide, bollocks, Boston Ethical Society, bridge to nowhere, Buckminster Fuller, capricious gods, Carbon Worshipers, card games, civilization, climate change, consumption, corporate responsibility, Cult of Carbon, Daniel Kahneman, data centers, David Suzuki, denial, design science, ethical ideals, Faster Forward, Hyper Anthropocene, hypertext, ignorance, Internet, Joseph Schumpeter, making money, Mathbabe, networks, organizational failures, superstition, Ted Nelson, the right to know, the tragedy of our present civilization, transclusion, Xanadu, ZigZag | Leave a comment