Simple. I bring up the latest, and listen to Professor Tony Seba of Stanford University.
Not much else needs to be said here. Hat tip to WaPo.
Thunberg accused leaders of speaking only about “green eternal economic growth because you are too scared of being unpopular.”
“You only talk about moving forward with the same bad ideas that got us into this mess even when the only sensible thing to do is pull the emergency brake,” she said. “You are not mature enough to tell it like it is.”
Thanksgiving in 2018 was cold, but it was also sunny. That means the 150,000 solar installations in Massachusetts could deliver on their combined 2.7 GW promise, and they actually delivered 1.5 GW towards the 15.5 GW needed. Of course, production varied during the day. With additional solar and wind, and additional storage, generation could have been banked to offset the “duck curve” seen.
What’s striking is that, despite the availability of wind and solar resources, wind generation is, at times, curtailed by direction and order, since neither ISO-NE nor utilities have a way of routing excess power elsewhere or storing it when they don’t need it. This has been identified as the surest sign of a grid which is falling behind the needs of modern distributed generation. Called Do-Not-Exceed orders, they are a vestige of an attitude of central command-and-control grid management, rather than planning and moving to a design where the grid more-or-less manages itself, based upon technical window-ahead signals and predictions. Note ISO-NE has championed a market system to achieve this. That is a form of feedback control system, but it is one having appreciable lags in response. If the system errs in its predictions, whether on demand or on supply, there are energy assets wasted or shortfalls in provisioning. Such errors are why conventional grid managers put so much emphasis upon “dispatchable resources”.
People who can move predominantly off the grid, whether now or in the next 10 years, can insulate themselves from this kind of administrative fragility. Some can’t, and need to live with it, at least until the Massachusetts Department of Public Utilities and their Governor see the need to modernize.
Thanksgiving wasn’t the only banner day. Here was 21st April 2018:
Our own solar generation on those days looked as depicted in the following figures. Note there are two PV arrays reported, a 10 kW one and a 3.4 kW one. They of course both feed our home and the neighborhood grid but, as they were installed at different times, are monitored separately. The 3.4 kW one wasn’t online yet for 21 April 2018.
The difference in generation intensity is primarily due to length of solar day and tree shading at low sun angles in Autumn. On Thanksgiving, many of our trees still had leaves on them. I’ve included a look at 2nd August for contrast.
As @dumboldguy pointed out in a comment, while the contribution of renewables to Massachusetts electrical energy looks impressive in these figures, the “Y axis of the first graph begins at” 9 GW, not zero. This shows how far Massachusetts needs to go to get any appreciable amount of renewables for electricity. And, moreover, it also shows, as I responded, how silly it is to claim that renewables are destabilizing the grid in Massachusetts: They are barely making a dent.
Also, and something which @dumboldguy did not say, but I insist upon saying again, the Massachusetts public’s dislike of onshore wind essentially means they are opting for natural gas as an electricity-generating source and this, necessarily, means they are opting for the new pipelines that come with it. That’s because:
The price differential between onshore and offshore wind, together with the only slightly higher price of natural gas, means that, kWh for kWh, offshore wind won’t compete with natural gas any time soon. Here’s the Lazard Levelized Cost of Energy (unsubsidized) analysis from 2018:
I have annotated it to point out price of onshore wind versus offshore, and various gas generation prices. Note that solar is expensive (see top), without subsidies. Note, too, that because the grid is antiquated in Massachusetts, using wind and solar means relying upon peaking gas generation plants, if only for part of the time. Note how expensive they are, ignoring greenhouse gas effects.
Remember who it was that told you it would be okay, just fine, to continue to emit CO2 as we have been, despite over 50 years of science and scientists, from physicists to chemists to engineers and biologists, saying it won’t be, it can’t be.
The AAAS recounts this. So does the American Chemical Society, and do you seriously think they are going to push some agenda when their members are employed predominantly by industry? Even if you think “How can it be?”, go and discover.
When your wealth depletes, and your kids and grandkids suffer and even die, remember who told you it would be okay and who kept this once great country from doing something about this.
Hold them accountable. This is more important than nearly anything else.
There are people who say that climate change does not cause big disasters. I find that highly disingenuous.
“There is extraordinary frustration,” a U.S. intelligence official said. The CIA and other agencies continue to devote enormous “time, energy and resources” to ensuring that accurate intelligence is delivered to Trump, the official said, but his seeming imperviousness to such material often renders “all of that a waste.”
How do people think the United States and international scientific communities feel about government and public responses to their repeated warnings?
Influential people working to implement greenhouse gas mitigation continue to indulge in Magical Thinking. But, unfortunately, a modest fee on Carbon is worse than none. It needs to be stiff enough to hurt and change behavior: Beginning at a couple of hundred dollars per metric tonne CO2. Also, although it is more difficult and costly to enforce, it would be better to apply this to the consumption end than the source end. In any case, a source end fee of $200/tonne would increase the price of gasoline in the United States by about US$1.80 per gallon, and of natural gas by about US$11 per thousand cubic feet.
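A back-of-envelope sketch of that arithmetic. The emission factors below are standard EPA-style figures and are my assumptions for illustration, not numbers given in this post:

```python
# Pass-through of a source-end carbon fee to fuel prices.
# Assumed emission factors (roughly EPA figures, not from this post):
#   gasoline:    ~8.89 kg CO2 per gallon
#   natural gas: ~54.9 kg CO2 per thousand cubic feet (Mcf)

def price_increase(kg_co2_per_unit, fee_per_tonne=200.0):
    """US$ added per unit of fuel by a fee of fee_per_tonne $/tonne CO2."""
    return fee_per_tonne * kg_co2_per_unit / 1000.0

print(round(price_increase(8.89), 2))   # gasoline: about $1.78 per gallon
print(round(price_increase(54.9), 2))   # natural gas: about $10.98 per Mcf
```

The same function shows why a modest fee barely registers: at $40/tonne, gasoline rises only about $0.36 per gallon.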
Less frequent than I originally intended, but here’s today’s:
I’m a member of the House, and just visited it with Claire. See photos below.
The Gorey Store has a lot of exquisite items for sale for the holidays, sure to bring a smile.
And I think most of them are great for kids. See, for evidence, the Jones and Ponton Killing Monsters.
My relation with Gorey’s work began in sophomore year of high school, 1968, 50 years ago, when I came across a small work of drawings on one of those crowded shelves in a bookstore in Harvard Square. I got the notion of getting a copy for my teacher of literature and debating coach. I did. He seemed delighted. Mr Gorey’s book reminded me of him.
I, of course, have no direct experience of Edward Gorey, even to make a first impression. Docents at the Gorey House suggest Mr Gorey was shy or, if not shy, someone who thought “Why would anyone want to know me?” For comparison to Dery’s biography, there is the slim volume by Mr Gorey’s good friend Alexander Theroux, The Strange Case of Edward Gorey. I have little reason to doubt Ms Acocella’s remarks about the inconsistencies of Dery’s analysis of Mr Gorey. A senior docent at the Gorey House, which sells Mr Dery’s book in their gift shop, implied there were shortcomings, but was nevertheless appreciative that there was, at last, some biography.
While Ms Acocella’s review of Dery’s attempt might show a fondness for Mr Gorey, shortcomings are present in her treatment, too. She doesn’t mention Theroux at all. She doesn’t mention the existence of the Gorey House in Yarmouth Port, let alone giving it a plug. And she’s incomplete in her assessment of Mr Gorey’s bequests, for example, to “animal-welfare societies”: she chooses to highlight only Bat Conservation International. (That was too cute.) It was in The New Yorker, and she’s a major writer for them, so I presume Ms Acocella did have room to mention the contrast between Mr Gorey’s animal welfare interests and cats, and his (early?) fondness for raccoon coats. (I know, “It was another time ….”) A list of such societies is presented in the photo below, from the House.
I also disagree with Ms Acocella’s ready agreement that Mr Gorey had somehow “lost his talent”. Gorey continued to produce, to struggle to find expression, to be himself. Note the remark he made upon ink and papers:
I think it remiss, too, to omit that, as minor as it might be, Mr Gorey has a small, quiet, cultish following, of which I consider myself a part. No doubt he’ll eventually be mythologized, like Tolkien, as chroniclers of any life are bound to do. It’s inevitable, since most of what made up that life never made it into the record. But it’s a way to continue Mr Gorey’s joy.
The curator of the Edward Gorey House kindly recommended to me another review of Dery’s biography, this one by Evan Kindley at The New Republic. (I would gladly credit the curator’s name, but I haven’t asked permission, so I don’t want to presume.) I had a read.
Its author has offered many interesting columns. I was drawn to his profile of Kurt Vonnegut’s years at General Electric, during the time Vonnegut wrote science fiction. While I respect Mr Vonnegut’s books and ideas (but not, I think, as much as my wife, Claire, does), the important thing is to know who is writing a review of a biography. Mr Edward Gorey was, to me, a vastly more important artist than, say, Mr Vonnegut. I’ll say why before the end. Kindley picked, for his choice of review, another’s book on Mr Vonnegut’s time at GE. It’s clear he thought that connection both curious, even exotic, given Mr Vonnegut’s later views, and formative. Accordingly, there is a notion of some homuncular model at work in Kindley’s head, perhaps of a preformatory artist. That’s relevant.
I like the Kindley review. It feels more honest, committed in some ways, than the view-from-afar of Acocella. But:
You can feel him pushing the limits of his chosen medium — the illustrated book — just as Stein and Queneau pushed the novel, Beckett the play, or Duchamp the painting … He is at once essentially limited and infinitely ambitious.
I don’t buy it. That’s a major puzzle-solver being described. Mr Gorey, and again I am no expert, seems to me more the essence of the genius, which is the child forever at play, walking down a beach, picking up a shell and getting all excited about it. Then, in an hour, or a day, becoming bored with it, and moving on. I think he’s more someone who erects a frame, builds a building, and tears the frame away — and, incidentally, some of the building — leaving it stable, but barely so, and also leaving its admirers wondering how it stands up.
I think Mr Kindley’s analysis of the Dery Gorey-was-gay proposition is spot on. I see it as a statistician: How can someone legitimately infer Mr Gorey’s interests there by simple association of friends? I’d wager the circles he encountered had a higher-than-average propensity of declared same-gender-preferring people, and, so, if he picked friends at random, that’s what he’d get. Or bisexual. Or queer. I think Mr Gorey’s own characterization should suffice. What did he really have to gain by suppressing such?
There are also minor quibbles:
Why is Mr Gorey an important artist to me?
One of the poets my sophomore literature teacher (the Gorey booklet recipient) introduced was one Wallace Stevens. This would be a life-changing introduction, and Mr Stevens has always coupled me into thought and feeling closer than nearly any formal religion. I was brought up Catholic, including a thorough Catholic education. I turned pantheist, then agnostic, then converted to Judaism. I dwelt there for years, raising two sons in the tradition. I was intrigued by Buddhism, practiced being a Jew-Bu, and then I blasted out to where I felt most at home, an atheist, nay, physical materialist. It’s not that, for instance, Catholicism or Judaism were “wrong”. It is a path. I’m (now) happily affiliated with the Unitarian Universalist congregation of Needham, Massachusetts. (You need to know who’s writing this, too.)
A singular excerpt from one of Mr Stevens’ poems (The Idea of Order at Key West) goes:
Oh! Blessed rage for order, pale Ramon,
The maker’s rage to order words of the sea,
Words of the fragrant portals, dimly-starred,
And of ourselves and of our origins,
In ghostlier demarcations, keener sounds.
Now those are words to live by. And, I think at core, Kindley didn’t miss this about Gorey when he observed about what readers and viewers might think:
He put all that work into this?
It’s true, my guidewords hew closer to those of Milton who, while being fully critical, not praising, wrote (Paradise Lost):
… or if they list to try
Conjecture, he his Fabric of the Heav’ns
Hath left to thir disputes, perhaps to move
His laughter at thir quaint Opinions wide
Hereafter, when they come to model Heav’n
And calculate the Starrs …
In one way or another, those words, and to some extent, Stevens’ Idea, are the story of my personal life.
So, Ms Acocella quotes Edward Gorey near the start of her review, and Mr Kindley underscores in his:
I’m beginning to feel that if you create something, you’re killing a lot of other things. And the way I write, since I do leave out most of the connections, and very little is pinned down, I feel that I am doing a minimum of damage to other possibilities that might arise in a reader’s mind.
And that’s it. Vonnegut is less a lover of ambiguity. He doesn’t let it flow. His stories have a point. That’s a problem.
In the end, ambiguity is all we have. Whether it’s what’s left out of a story, or what a scientific calculation implies but does not say, or whether Mr Stevens was a poet or an insurance company executive, or whether Mr Gorey was a goth or not, these are unanswerable. (Well, nearly so: Gorey referred to the gothic as a costume, like his raccoon coat, as quoted by Mr Kindley.) No, not that. They should not be answered. For, art is, if anything, as the comic Gilda Radner said in a famous quote:
I wanted a perfect ending. Now I’ve learned, the hard way, that some poems don’t rhyme, and some stories don’t have a clear beginning, middle, and end. Life is about not knowing, having to change, taking the moment and making the best of it, without knowing what’s going to happen next.
So Mr Gorey reminds us with every page of sketches, every attempt at play. And we badly need reminding. It is for me, at least, a refuge, and a source of meaning.
I’ve abandoned Github to store my own code and have, instead, opted to simply dump it into a shared read-only Google Drive folder.
Too much trouble, and I don’t really need deep source control, even though Google offers it with its Google Drive, for free.
What really set me onto this choice was the apparent bias Atlassian Bitbucket has for retaining code-like material rather than datasets, and their 2 GB ceiling. I was willing to live with that, but then fell into the difficulty of trying to expunge some largish datasets and having to prune them from the repository history, doing all of this from Windows Sourcetree, or, rather, the command line support it offers, which is hobbled.
I use Sourcetree to grab and keep up with others’ Github repositories.
David Suzuki aptly calls the corner we’ve painted ourselves into “the climate crunch”.
Why a “crunch”?
Had we heeded early warnings and had political representatives done more than talk, we likely could have addressed the problem with minimal societal disruption. But the industry-funded denial machine, which continues today, has been effective. Concern about climate change and other environmental issues has diminished as the problems have intensified. Politicians continue to think in terms of brief election cycles, focusing on short-term gains from exploiting fossil fuels rather than long-term benefits of conserving energy and shifting to cleaner sources.
Meanwhile, greenhouse gas emissions continue to rise and carbon sinks like forests and wetlands are still being destroyed. Even if we stopped using fossil fuels tomorrow, we’ve emitted so much carbon dioxide and other greenhouse gases that we wouldn’t be able to avert worsening of the consequences already happening. But we still have time — albeit very little — to ensure the problem doesn’t become catastrophic. The Intergovernmental Panel on Climate Change, which is conservative in its estimates, gives us about 12 years to take decisive action.
The thing is, circumstances are so bad now that fixing this will take large, industrial scale measures, and be triply costly: (a) to make a rapid transition away from fossil fuels, (b) to adapt to the impacts that are ever increasing and weren’t anticipated to come this quickly, and (c) to remove Carbon Dioxide from the climate system so as to limit further deterioration.
Even those who accept the science and the urgency are, in my opinion, pursuing pipe dreams. Some think we can jettison capitalism and solve this. Some think we need to make environmental justice our primary constraint. Some think we can solve this by pursuing marketplace measures for solar energy (which includes wind). Some think we can protect all ecosystems while rolling out the measures we need to take to fix the situation.
We need to do this fast. We don’t have a lot of time. The kind of future I see is one where the world as an economy does Carbon Dioxide removal as the central economic activity, akin to what building the tombs of pharaohs was for ancient Egypt. Corporations can and must exist because, frankly, we don’t have the decades or centuries available to create an alternative structure. Government planning doesn’t work. (Look at the administrative nightmares that are the U.S. EPA or the Army Corps of Engineers as described in Mary Christina Wood’s Nature’s Trust.) We need global scale engineering and technical skills. We need capital.
Quick take from Professor Richard Alley:
Full interview with Professor Alley:
The Fourth National Climate Assessment (NCA4) fulfills that mandate in two volumes. This report, Volume II, draws on the foundational science described in Volume I, the Climate Science Special Report (CSSR). Volume II focuses on the human welfare, societal, and environmental elements of climate change and variability for 10 regions and 18 national topics, with particular attention paid to observed and projected risks, impacts, consideration of risk reduction, and implications under different mitigation pathways. Where possible, NCA4 Volume II provides examples of actions underway in communities across the United States to reduce the risks associated with climate change, increase resilience, and improve livelihoods.
This assessment was written to help inform decision-makers, utility and natural resource managers, public health officials, emergency planners, and other stakeholders by providing a thorough examination of the effects of climate change on the United States.
Considering the collective effort and review put into preparing this report, complete with a review by the National Academies, and a public comment period, you would think digital and visual media would spend more time on it. But no. Well, at least that’s what I thought. Actually, print and online media didn’t do too badly.
I was alerted to this by Peter Sinclair’s blog Climate Denial Crock of the Week.
Moreover, perhaps because of blowback or second thoughts, AC360 on CNN did carry an interview with Hayhoe.
The Washington Post only treated the report as part of their continuing conversation regarding President Trump.
Commonwealth Magazine offered two op-ed pieces, one by Craig Altemose on a “Green New Deal” for Massachusetts, and the other by Eric Wilkinson on how Boston needs to do more on climate change. Both are excellent, but neither alluded to the National Climate Assessment. Rather they cited Massachusetts’ own evaluations of needs and risks. The Magazine, on the other hand, carried a story written by Bruce Mohl featuring, once again, Gordon van Welie of ISO-NE on the challenges of running a New England-wide power grid over the next several years, and Dan Dolan of the New England Power Generators Association lamenting the “existential crisis” that faces New England wholesale markets for electricity. Unlike past articles, neither came out in favor of expanding the role of natural gas. That’s being done, in part, by the governments of Maine and New Hampshire. Were Altemose and Wilkinson the “balanced reporting”?
And that’s fortunate.
The Economist carries quite a few articles regarding climate change, its impacts, and its mitigation, but these are primarily from an international perspective. They hardly mentioned NCA4. However, there was this.
I have already commented on how FiveThirtyEight covered the NCA4. Their parent, ABCNews, mentioned the report but principally focussed upon its being delivered from an Executive whose head immediately disparaged its findings.
It’s odd that 538 only accepts comments from people with Facebook accounts, despite being associated with ABCNews, which has its own user accounting system. So I’m commenting here instead. #fivethirtyeight
Anyway, per this post, a recent article and podcast at 538 demonstrate there is a poor understanding regarding global warming, climate change, its consequences, and these assessments, even by educated Democrats. Taking the last first, the latest National Climate Assessment is the 4th, and it’s authorized and required by an act of Congress, once every 4 years. However, there are basically two volumes produced: an updated assessment of climate science and then, in the next year, an updated assessment of impacts. These reports are hardly produced in isolation: In addition to being compiled and written by a large team of scientists, they are each independently reviewed by the National Academies of Sciences, Engineering, and Medicine. Moreover, there is a period during which the public can comment on the reports. Comments by the Academies and by the public are addressed by the team from the U.S. Global Change Research Program producing the reports, and those responses are available at the site.
All that said, there is also a misunderstanding about the scope of climate change. CO2 is not like most other pollutants in that it has a very long life. That means it accumulates, and, not only is the USA a major producer of CO2, it owns a substantial chunk of the accumulated emissions. Moreover, because of CO2’s long life and other physical aspects, such as 90% of the excess heat going into oceans, the trouble is that if we collectively stop emitting, we’ll keep damage and change from getting worse, but it won’t reverse, not for at least centuries. Further, there is a lag between the forcings and causes of additional energy and their manifestations as effects. This is a system with a great deal of inertia and, even if we stop, things will keep getting worse for a decade or more. Some systems on Earth, like ice sheets, respond even more slowly. It’s agreed by many glaciologists, for example, that the West Antarctic Ice Sheet (WAIS) is doomed to collapse, even if that will take a couple of centuries to be realized.
To the question of why warming is bad: the historical record, which by now is much better established than it was for NCA2 or even NCA3, shows humanity has never lived in a time when temperatures overall were this extreme. And it isn’t just temperature: it’s energy available to weather systems and moisture aloft that matters, not to mention things like loss of ice.
Also, because of temperatures and oceanic acidification, while primary productivity of oceans and forests may increase for a time, ultimately these will be limited and reverse. Experiments show that plants get used to having an abundance of CO2 and aren’t as effective sinks. There are some controlled experiments which even suggest forests and plantings could be net CO2 sources, if plant respiration exceeds rate of CO2 consumption. A lot depends upon the microbial mix in soils where plants grow, and this is sensitive to temperature, CO2 concentration in atmosphere, and available moisture. For example, arid conditions aren’t conducive to CO2 take-up. It is believed, too, that enhanced growth is limited by available Nitrogen.
And there are other impacts anticipated by the science as well, such as changes in oceanic circulation, which could have major consequences for regional weather and distribution of moisture. The trouble with these kinds of perturbations is that they are beyond direct experience by people, even if there is really solid evidence they’ve happened before.
I think the posture of the present administration, that the NCA is a report produced by some fringe group, really is at odds with the process and its depth. The report is hardly a surprise. It’s produced on a regular schedule. It’s possible for anyone to engage with it. And the emergent understanding available on climate change and global warming is breathtaking in depth as well as breadth: It’s understood by ecologists and biologists as well as geophysicists. Even doctors and epidemiologists are seeing its effects.
FiveThirtyEight‘s political podcast on this report missed a lot of these aspects. In that respect, their journalism was disappointing here.
By the way, to the claim of 45 that the United States is among the cleanest of countries on emissions, it just ain’t:
And, since cumulative emissions are what matter, the United States has a lot it’s responsible for:
But this doesn’t prevent 45 or Forbes, for that matter, pointing their fingers elsewhere:
https://www.bbc.co.uk/programmes/m0001b1k (from 28th November 2018)
I’m afraid I need to agree with Krugman’s conclusion:
While Donald Trump is a prime example of the depravity of climate denial, this is an issue on which his whole party went over to the dark side years ago. Republicans don’t just have bad ideas; at this point, they are, necessarily, bad people.
There can be no excusing a systematic denial of reality, or of our single best means of understanding it, Science, no matter what the perceived economic consequences.
Understand, of course, I have no uncritical love for Democrats either, because they are not actual climate champions, and because they have simply assumed climate hawks, like myself, have no other choice than to support them, given the travesty that’s the Republican Party. Even Senator Elizabeth Warren supports paying people to live in high risk coastal areas and opposes properly assessing risk of re-flooding and damage.
Am I supposed to support her?
This is called denial. It’s a psychological condition.
And a providential warning …
I’ll be discussing the ramifications of:
for several posts here. Some introduction and links to proofs and explications will be provided, as well as more recent work. For instance,
There are two parts to the Lemma. Let $Q$ be a $k \times p$ random projection matrix for a dataset matrix $X \in \mathbb{R}^{n \times p}$. Generally speaking $p$ is large, perhaps even $p \gg n$. (This is sometimes called a “small $n$, big $p$ problem” in terms of problem characterization.) A random projection matrix mapping onto two dimensions ($k = 2$) looks like:

$$Q = \begin{bmatrix} q_{1,1} & q_{1,2} & \cdots & q_{1,p} \\ q_{2,1} & q_{2,2} & \cdots & q_{2,p} \end{bmatrix}$$

It is derived from an initial matrix, $R = [r_{i,j}]$,

where the $r_{i,j}$ are each independently drawn from a $\mathcal{N}(0,1)$, or standard Normal, distribution, and then

$$q_{i} = \frac{r_{i}}{\lVert r_{i} \rVert_{2}},$$

or, in other words, the $i$-th row of $Q$ is produced from the $i$-th row of $R$ by dividing it by its length or $\ell_{2}$ norm, often called the Euclidean norm because of its coinciding with the Euclidean distance calculation.
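As a sketch of that construction in Python (the function name and the seed are my own; the post itself gives no code):

```python
import numpy as np

def random_projection_matrix(k, p, seed=None):
    """Draw a k-by-p matrix R with i.i.d. standard Normal entries,
    then divide each row by its Euclidean (l2) norm, as described
    above. Returns the row-normalized projection matrix Q."""
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((k, p))
    return R / np.linalg.norm(R, axis=1, keepdims=True)

# A projection from 7 dimensions onto 2, as in the illustration below:
Q = random_projection_matrix(2, 7, seed=42)
print(Q.shape)                      # (2, 7)
print(np.linalg.norm(Q, axis=1))    # each row has unit length
```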
Then, the first part is given by a Theorem 1, which bounds distortion of pairwise distances of points: given $0 < \epsilon < 1$ and $n$ points in $\mathbb{R}^{p}$, for $k \geq O(\epsilon^{-2} \log{n})$ there is a linear map $f : \mathbb{R}^{p} \rightarrow \mathbb{R}^{k}$

which preserves metric continuity, so that for all $u, v$ among the $n$ points,

$$(1 - \epsilon) \lVert u - v \rVert^{2} \leq \lVert f(u) - f(v) \rVert^{2} \leq (1 + \epsilon) \lVert u - v \rVert^{2}.$$
A full proof won’t be given here (see references), but there is a norm-preserving intermediate result which is useful in itself:

$$\mathbb{P}\left[ (1 - \epsilon) \lVert x \rVert^{2} \leq \lVert f(x) \rVert^{2} \leq (1 + \epsilon) \lVert x \rVert^{2} \right] \geq 1 - 2 e^{-k (\epsilon^{2} - \epsilon^{3})/4},$$

where $\mathbb{P}[A]$ denotes the probability of the event $A$.
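A quick Monte Carlo check of that norm-preserving behavior can be run as below. The $\sqrt{p/k}$ rescaling is the standard choice that makes squared norms correct in expectation; it, and all of the sizes chosen here, are my assumptions for illustration, not details from the references:

```python
import numpy as np

rng = np.random.default_rng(0)
p, k, eps = 2_000, 400, 0.3
x = rng.standard_normal(p)          # a fixed high-dimensional vector

trials, hits = 100, 0
for _ in range(trials):
    R = rng.standard_normal((k, p))
    Q = R / np.linalg.norm(R, axis=1, keepdims=True)
    y = np.sqrt(p / k) * (Q @ x)    # rescaled k-dimensional image of x
    ratio = (y @ y) / (x @ x)       # squared-norm distortion
    hits += (1 - eps) <= ratio <= (1 + eps)

print(hits / trials)                # near 1: the bound holds w.h.p.
```

Shrinking $k$ or $\epsilon$ drives the hit rate down, just as the exponential bound predicts.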
The second part of the JL Lemma is given by a Theorem 2, which bounds distortion of pairwise inner products: for unit vectors $u, v$, with high probability,

$$\left| \langle f(u), f(v) \rangle - \langle u, v \rangle \right| \leq \epsilon.$$
This Lemma is in many ways remarkable. When I asked my son, Professor Jeff Galkowski of Northeastern University, what concentration of measure in high dimensions means, he replied:
The concentration of measure is (essentially) just the fact that in high dimensions most of the volume of a sphere is very close to the origin in the $\ell^{\infty}$ norm.
In other words, a point drawn uniformly from the unit ball in $\mathbb{R}^{p}$ has, with high probability, every one of its coordinates small when $p$ is large.
This Lemma is a subject reviewed in many courses. For example, Professor Zhu at the University of Wisconsin (Madison) covers the subject in his CS731 (2011), “Random Projection” (Advanced Artificial Intelligence). There are also many good Web expositions of it, like this one by Mathematics doctoral student Renan Gross. Mathematician Hein Hundal has a post from 2013 which introduces it and, more importantly, cites its connections to Machine Learning and Compressed Sensing.
Also, the JL Lemma is an example of what Schneider and Gupta describe as a data oblivious method. In particular:
In its assessment of alternative approaches to dimensionality reduction, the Committee on the Analysis of Massive Data (National Research Council of the National Academies, 2013) labels random projections approaches “data oblivious“, in that the dimensionality reduction mapping can be computed without any knowledge of or use of the data. This is in contrast to “data aware” methods such as principal components analysis and its refinements, where the mapping is dependent on a given dataset. The report also identifies key benefits of random projections as follows (p. 77): “… the projection is guaranteed to work (in the sense that it preserves the distance structure or other properties) for arbitrary point-sets. In addition, generating such projections requires very little resources in terms of space and/or time, and it can be done before the data are even seen. Finally, this approach leads to results with provable guarantees”.
Here’s an illustration. Take a 7-variate Gaussian (7-dimensional), having a mean of
and a covariance matrix of:
10,000 points were drawn from this distribution, and then it was randomly projected to two dimensions in the manner described for Johnson-Lindenstrauss above. The resulting 10000-by-2 matrix of points was first subjected to a kernel density analysis on a 500-by-500 grid giving the result:
The k-NN (“k-Nearest Neighbors”) clustering algorithm was used on the projection to produce the following summary:
Note the projection is only from 7 dimensions to 2. The random projections technique really comes into its own when the initial number of dimensions is large. The figure below shows a k-NN result from an initial dataset having 5038 rows and 20230 columns:
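The projection step of such an experiment can be sketched as follows, assuming stand-in values for the mean and covariance (the actual values used appear in the figures) and omitting the kernel density and clustering stages:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 7, 10_000

# Stand-in mean vector and a random positive-definite covariance;
# the post's actual values are shown in its figures, not here.
mean = np.linspace(-1.0, 1.0, p)
A = rng.standard_normal((p, p))
cov = (A @ A.T) / p

X = rng.multivariate_normal(mean, cov, size=n)   # 10000-by-7 dataset

# Project to two dimensions with a row-normalized Gaussian matrix:
R = rng.standard_normal((2, p))
Q = R / np.linalg.norm(R, axis=1, keepdims=True)
Y = X @ Q.T                                      # 10000-by-2 points

print(X.shape, Y.shape)    # (10000, 7) (10000, 2)
```

The same few lines scale directly to the 5038-by-20230 case: only $p$ changes, and the projection matrix is still generated without looking at the data.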
I’ll be giving an illustration of an application in a second post. It will address other open questions, too, such as how to pick a value for the k-NN clustering.
However, I want to close with the note that there’s a recent connection to, or application in, Climate Science, given in the reference below:
This is the second installment of the Podcast here, hopefully with better sound quality.
Commencing today, I’m offering another channel of this blog, a podcast.
This will range over the interface between people, their behavior, and the natural world. It’s primarily an opportunity for a less structured and more personal presentation of my experience of the world.
No doubt I’ll get better with the audio technology and content as I go on. These beginnings sure aren’t up to the standards of any commercial podcast. I like the idea of recording while I am outdoors, and that poses special challenges.
Anyway, here it is.
And, yeah, maybe someday I’ll pick a better name for this thing.
And next time, I’m using a better microphone.
Professor Nic Lewis has criticised the Resplandy, Keeling, et al report in Nature which I previously mentioned. A summary of his criticism appears in the somewhat libertarian ezine Reason. I have responded there, but their commenting policy limits a thorough response. Not all things can be answered in less than 150 or for that matter 2000 characters. Accordingly I have posted the response in full here, below the horizontal line.
I apologize to the readership for the poor formatting, such as the lack of formatting which Reason, as ostentatious as its name sounds, is incapable of supporting in its comments. I didn’t feel it worth revising these here, even though WordPress is perfectly capable of doing so.
I preface by saying I’ve not read the preceding comments, and, so, I apologize if someone has already said what I’m going to say here. I have, of course, read the article above, which claims to represent Professor Lewis’ critique of Resplandy, et al (2018) fairly. I have had a quick read of the critique, although I have not, for reasons that will become evident, invested the time to reproduce its calculations. And I have had a careful read of Resplandy, Keeling, et al (2018), the research paper in NATURE which is the subject of Professor Lewis’ critique.
In particular, being a quantitative engineer practiced in stochastic methods, in addition to the new use of atmospheric chemistry in the Resplandy, et al paper, I was also interested in the Delta-APO-observed uncertainty analysis described in their Methods section where, as is reported, they generated a million time series “with noise scaled to the random and systematic errors of APO data detailed in Extended Data Table 3”. Later, in the calculation Professor Lewis is apparently criticizing, Resplandy, et al report they computed the Delta-APO-climate trend using the standard deviation of these million realizations, arriving at the 1.16 ± 0.15 per meg per year value Professor Lewis so objects to. I can’t really tell from his mental-arithmetic report and his least-squares trend report whether or not he did the million-realization reproduction, but, as that is a major feature of the calculation, I rather doubt it. That’s because there are so many ways that calculation could be set up which deserve reporting but are missing from his criticism. So either he did not calculate the result in the same way, or, if he did, he is not sharing the details in sufficient depth for us or Resplandy, et al to tell whether he did it the same way.
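(For readers unfamiliar with this kind of calculation, here is a hedged sketch in Python of what a realization-ensemble trend estimate generically looks like. The series span, noise scale, and trend value are illustrative stand-ins, not the actual Resplandy, et al setup, and I use 10,000 realizations rather than a million for speed.)

```python
import numpy as np

rng = np.random.default_rng(42)

# Sketch of a realization-ensemble trend calculation (NOT the authors'
# actual code): perturb an annual series with noise scaled to stated
# errors, fit a least-squares trend to each realization, and summarize.
years = np.arange(1991, 2017)        # hypothetical span of observations
true_trend = -1.16                   # stand-in slope, per meg per year
base = true_trend * (years - years[0])

n_real = 10_000                      # a million in the paper; fewer here
noise_sd = 1.0                       # stand-in for the tabulated errors
sims = base + rng.normal(scale=noise_sd, size=(n_real, years.size))

# Least-squares slope for every realization at once.
t = years - years.mean()
slopes = sims @ t / (t @ t)

print(f"trend = {slopes.mean():.2f} +/- {slopes.std():.2f} per meg/yr")
```

The ensemble mean recovers the underlying slope, and the spread of the fitted slopes supplies the quoted uncertainty; the details that matter, and that are missing from the critique, are exactly how the noise is structured across realizations.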
Given that this is the origin of Professor Lewis’ critique, and given the rather casual complaint about “anthropogenic aerosol deposition”, which is more prominent in the above (mis?)characterization of Lewis than in the original (it appears only in footnote 8, and by way of explanation, not criticism), the rest of Lewis’ pile-on founders if this is done wrong.
That’s the substance.
But what is really problematic is that Lewis’ critique is improper science. The way this gets done in peer review, in NATURE or SCIENCE or any other journal, including the JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION or the JOURNAL OF THE ROYAL STATISTICAL SOCIETY, with which I assume Professor Lewis is familiar, is that a letter is sent to the editors, with full technical details, almost akin to a research paper. Generally, the original authors and the critic, in that setting, are in contact, and they agree to write a joint response, resolving the objection with more detail; or the critic presents in detail — far more than Professor Lewis did in his one-off PDF — why they believe the original to be mistaken, and then the original authors get a chance to respond.
This is why I don’t really take Professor Lewis’ criticism seriously. He hasn’t allowed the assembled, including NATURE’s technical audience, to be able to fully criticize his own criticism, by failing to document essential details. He is relying solely on his authority as a “statistician”.
In fact, there are other instances where Professor Lewis’ authority is circumscribed. For example, in 2013, Professor Lewis published a paper in JOURNAL OF CLIMATE titled “An objective Bayesian improved approach for applying optimal fingerprint techniques to estimate climate sensitivity” (vol 26, pages 2414ff) wherein he insists upon using a noninformative prior for the calculation of interest. That is certainly a permissible choice, and there is nothing technically wrong with the conclusion thus derived. However, by using citations to justify the practice, Lewis misrepresents the position of Kass and Wasserman (1996), who squarely identify proper Bayesian practice with using proper, non-uniform priors, and, moreover, identify several pitfalls with using uniform ones, pitfalls which, if Professor Lewis were faithful to his self-characterization of pursuing a Bayesian approach, he should address. He does not in that paper and, so, invites the question of why. There, Professor Lewis is questioning a calculation of a higher climate sensitivity from fingerprinting techniques. It appears he is seeking a rationale why that might not be so. Surely invoking a device which admits uniform priors might obtain such, but it is hardly good Bayesian practice.
Accordingly, I wonder — for I cannot tell, given what Professor Lewis has recorded in his cited objection — whether the result of Resplandy, et al is Professor Lewis’ real problem, one where he exploits the subtle difference between doing an on-the-face-of-it least-squares fit and doing one based upon a million-fold stochastic simulation, a difference which the readers of REASON, for example, as erudite as they are, might not catch.
In my technical opinion, until Professor Lewis does the full work of a full scientific or statistical criticism, his opinion is not worth much and Resplandy, et al, have every right to ignore him.
Dr Ralph Keeling describes the smudge in the original study, and credits Prof Lewis for setting them on the right track. The details are included in a snapshot from the RealClimate summary below:
The revision is being submitted to Nature. Apparently, the problem is that the errors in the ensemble realization were correlated, and they did not account for this. I’ll reserve judgment until I see their corrected contribution.
One thing I’d say, however, is that if the ensemble was generated using something like a bootstrap, there’s no reason for the resulting errors to be correlated. I can’t say until I see the actual details. But, if I am correct, they could use a Politis-Romano stationary bootstrap instead, and this would have taken care of that. Note, in addition, the remark by Nordstrom.
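To make the suggestion concrete, here is a sketch of the Politis-Romano stationary bootstrap. The toy series and parameters are illustrative, not anyone’s actual analysis; the point is that resampling in blocks of geometrically distributed length, with wraparound, preserves the serial dependence that an i.i.d. bootstrap would destroy:

```python
import numpy as np

def stationary_bootstrap(x, p=0.1, rng=None):
    """One Politis-Romano stationary-bootstrap resample of the series x.

    Blocks have geometrically distributed lengths with mean 1/p, and
    indices wrap around the end of the series, which is what keeps the
    resampled process stationary.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)
    for t in range(1, n):
        if rng.random() < p:                 # start a new block
            idx[t] = rng.integers(n)
        else:                                # continue the current block
            idx[t] = (idx[t - 1] + 1) % n
    return x[idx]

# Example: bootstrap the standard error of the mean of a correlated series.
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=200)) * 0.05   # toy autocorrelated series
reps = np.array([stationary_bootstrap(x, p=0.1, rng=rng).mean()
                 for _ in range(500)])
print(reps.std())                            # bootstrap SE of the mean
```

With mean block length 1/p chosen to match the correlation scale of the errors, the resulting ensemble carries the dependence structure through to the trend uncertainty.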
There’s a slew of bad news which has hit the scientific journals, the most notable being
L. Resplandy, R. F. Keeling, Y. Eddebbar, M. K. Brooks, R. Wang, L. Bopp, M. C. Long, J. P. Dunne, W. Koeve, A. Oschlies, “Quantification of ocean heat uptake from changes in atmospheric O2 and CO2 composition“, Nature, 2018, 563(7729), 105–108.
Dr Jim White puts this in context. Pay attention to what he says about what the long run temperature record says about how quickly temperatures can change, in either direction.
LA Times coverage of the subject. I like the quote,
Still, the system’s large number of direct measurements means any individual errors are averaged out, said Pelle Robbins, a researcher with the Massachusetts-based Woods Hole Oceanographic Institution’s department of physical oceanography, who works with the Argo program.
“The power of Argo is that we have so many instruments that we’re not reliant on any one of them,” he said. “When you average over things, you beat down the error.”
Robbins said the new approach is “bold,” but he still believes strongly in the accuracy of the Argo program.
“It’s an intriguing new clue,” he said, “but it’s certainly not the case that this study alone suggests that we have been systematically under-representing the oceanic warming.”
Resplandy said her discovery is not intended to replace the Argo system but rather to complement it. “In science, we want several methods to measure things, to have several methods that converge.”
Also, from the same article, there’s the assessment:
The new report found that emissions levels in coming decades would need to be 25% lower than laid out by the IPCC to keep warming under that 2 degree cap.
It’s like I just can’t put this post down. The fact is that once the Resplandy, et al (2018) paper appeared, research into related works, interviews with climate scientists, and other observations began percolating up, and we are seeing the beginning of what Dr Neil deGrasse Tyson calls an “emerging scientific truth”: that warming is not only much larger than estimated, it is accelerating. Consider this interview with Dr Lijing Cheng:
Yale Climate Connections summarized a spectrum of increasing risk, and quotes Dr White as well.
From the Summary:
Figure 1 summarizes the main results of the analysis. A future energy scenario emitting 85% less CO2 emissions than 1990 levels is compared with a reference scenario, which assumes that the German energy system operates in 2050 the same way as it does today. Results show that ii) the primary energy in the minus 85-percent scenario will drop 42 % below today’s values by 2050. iii) Assuming that no penalty is imposed on CO2 emissions and the price of fossil energy remains constant, calculations show that the cumulative total costs to maintain and operate today’s energy system will be 27% less than transforming the energy system to the targeted minus 85 percent scenario. iv) On the other hand, if the penalty for CO2 emissions increases to €100/ton by 2030 and thereafter remains constant and given that fossil fuel prices increase annually by 2 percent, then the total cumulative costs of today’s energy system (Reference) are 8% higher than the costs required for the minus 85 percent scenario up to 2050.
From the report, regarding electrical energy storage:
Electrical energy storage systems in the form of stationary and mobile (in-vehicle) batteries or pumped-storage power plants are used as storage systems. Hydrogen storage systems and thermal hot water storage systems of different orders of magnitude are considered in addition.
With respect to methane storage systems, the simplified assumption is made that currently existing storage capacities (including grid, approx. 210 TWh) will also be available to the system in the future. Thus, they are not considered in the optimisation.
Pumped storage plants are not included in the optimisation. Based on current values of an installed power of approx. 6.3 GW and storage capacity of approx. 40 GWh [26, 27], an increase to 8.6 GW and/or 70 GWh is assumed until 2050 for the dimensions of these plants (power and electric storage capacity) (own assumptions based on ).
That’s it. Feasible. Germany.
And, quoting from the MIT report above:
The present trend toward widespread availability and decreasing cost of distributed generation and storage results in the possibility of grid defection—that is, complete disconnection from the grid. Grid defection may be motivated by physical conditions such as the ability to install some embedded generation within a residence or business, and economic considerations such as the desire to avoid network costs. Grid defection represents an extreme form of price elasticity and must be considered—from an efficiency perspective—in tariff design and in decisions about which regulated costs are to be included in electricity tariffs.
Pumped hydro energy storage and molten salt thermal storage account for the vast majority of installed energy storage capacity to date, but these technologies are poorly suited to distributed applications (DOE 2015).
Now, I’m done. I have real work to do.
“But numbers don’t make noises. They don’t have colours. You can’t taste them or touch them. They don’t smell of anything. They don’t have feelings. They don’t make you feel. And they make for pretty boring stories.” That’s from here, and it’s well-intended, but it is also wrong.
For people appropriately trained in science, engineering, and especially maths, numbers carry imagination, even if they don’t make noises, don’t have smells, and don’t have feelings. And I strongly disagree that they don’t make you feel. They make me feel, depending upon the context.
Consider the flow of the AMOC, known locally and colloquially as the Gulf Stream. That flow is measured in a unit called a Sverdrup. A Sverdrup is a flow of one million cubic meters per second, typically of water. To give you some idea and comparison, and to provide a feeling, the combined flow into the oceans of all the rivers on Earth is about 1.2 Sverdrups.
The flow in the Gulf Stream varies with season and place, but it is between 30 and 150 Sverdrups.
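A toy calculation makes the comparison vivid, using the rounded figures quoted above:

```python
# Back-of-envelope comparison of the flows quoted above.
SVERDRUP = 1.0e6          # cubic meters per second

rivers = 1.2 * SVERDRUP   # roughly all river inflow to the oceans, combined
gulf_stream_low = 30 * SVERDRUP
gulf_stream_high = 150 * SVERDRUP

# Even at its weakest, the Gulf Stream carries about 25 times the
# combined flow of every river on Earth.
print(gulf_stream_low / rivers)    # 25.0
print(gulf_stream_high / rivers)   # 125.0
```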
Astronomy and Astrophysics probably have thousands of amazing, mind-stretching insights related to numbers. But here’s another that connects the previous oceanographic one with these: all the water on Earth.
Or how about that the thermal capacity of the oceans is about a thousand times that of the atmosphere. This has implications:
That’s a figure from the American Chemical Society.
Numbers. It’s how you really know anything.
Curious? Take a math course. Take a physics course. These days, take a biology course. See what I mean.
Reminder that climate defeatism—arguing that we are already so screwed that there’s no real point in acting to limit climate emissions or ecological damage—is absolutely a form of denialism, and one that directly aids those profiting off continued destruction.
He quoted a 2017 tweet, “The apocalyptic is itself a form of denialism”, citing what he describes as the most popular thing he has ever written, an essay titled Putting the Future Back in the Room.
I agree, and need to say something, because I hear, directly or not, from environmentalists and others that some consider the problem too hard, the work done so far too small, the costs too high, and the magnitude of the risk too great to contemplate doing anything about climate change, and that it is better now to prepare oneself for the end, withdrawing, so to speak, into a seemingly spiritual cocoon.
Frankly, that kind of thing gives the spiritual a bad name.
I also hear from environmental activists of old that they are unwilling to compromise on their ethical integrity, and refuse to have anything at all to do with corporations, or the well-heeled, or with compromises on the environment, endangered species, social justice, or anything else, even if these compromises could be the basis of progress. In this respect I share the opinion and distaste for progressives which Bill Maher sometimes expresses, and I have said so. Progressives are also often reluctant to do anything in cooperation with the military or military people.
This kind of close-mindedness is a failure to realize, particularly in light of the recent report from the UNFCCC, that it is necessary to triage now. This doesn’t mean compromising on emissions, or natural gas, or other aspects which Science says we cannot have if we are going to succeed in containing this enormous problem. But it does mean looking seriously at distant hydropower, sensibly routed, and looking at nuclear power, as long as it can be built cheaply and quickly. (Nuclear power along the lines of the present power station models cannot be.) If that means bringing back the breeder reactor proposals of the Clinton-Gore era, maybe it does. (More on this from Dr James Hansen.)
As Stewart Brand argues, there’s no time left to be an environmentalist. It’s well nigh time to be an ecopragmatist, even if I don’t heartily agree with him, e.g., with his promotion of Paul Hawken’s Drawdown ideas, which disregard biological reality.
But the direction and spirit are right. Bill Nye-style engineering is what’s needed. We can do that. Those who don’t know, should learn. There are many places to go, such as Environmental Business Council of New England, or the Institute for Local Self-Reliance. There are plenty of leaders, like Michael Bloomberg, and Mark Carney, and Richard Branson, and Professor Tony Seba.
On 14th October 2018, the Washington Post reported that Senator Marco Rubio and White House economic advisor Larry Kudlow each claimed that the recent UN report was an “overestimate”:
“I think they overestimate,” Kudlow said of the U.N. report, which found that policy changes must proceed at an unprecedented pace in the next 12 years to stop temperatures from rising more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) above Earth’s preindustrial temperature.
“I’m not denying any climate-change issues,” Kudlow said on ABC’s “This Week.” “I’m just saying, do we know precisely . . . things like how much of it is manmade, how much of it is solar, how much of it is oceanic, how much of it is rain forest and other issues?”
Rubio (R-Fla.), speaking on CNN about the effects of Hurricane Michael, said that sea levels and ocean temperatures have risen in a “measurable” way and that humans have played some role. But he questioned how big that role is.
“I think many scientists would debate the percentage of what is attributable to man versus normal fluctuations,” Rubio said on “State of the Union.”
Well, actually, that percentage is known, surely with some uncertainty, and the contribution of human emissions and activity is three times that of its nearest competitor. Moreover, in the case of volcanic activity, solar irradiance, and ENSO, their forcings are not consistent, the volcanic one being intermittent and the solar one cyclical. Moreover, volcanic forcings overwhelmingly cool Earth, not heat it, and ENSO can do the same. As is the case for most environmental problems, these kinds of considerations demand that assessments be quantitative, not merely qualitative, and not ascribe the accuracy of a number to the supposed reputation of the person or group stating it.
Here are the comparisons from Judith Lean and David Rind in 2008 (“How natural and anthropogenic influences alter global and regional surface temperatures: 1889 to 2006”, Geophysical Research Letters, 35, L18701, doi:10.1029/2008GL034864):
That is, in fact, the go-to source used by, for instance, NASA on its site concerning the matter (Riebeek and Simmon, 2010). The NASA page also describes the methods used, and provides explanation.
Even at the most extreme positive forcings, human activity is about 1.8 times the combined natural effects, but that can only happen when solar and ENSO forcings align, which is uncommon. That margin is well beyond measurement uncertainty.
So, sure, these numbers are not known perfectly. But economic and governing policies are seldom based upon perfect or even complete information. This science is far more complete in knowledge than most. And these bounds on uncertainty ought to be more than enough for people to set policy.
And, frankly, the arguments from Rubio and Kudlow are sick examples of the rhetorical fallacies known as Argumentum ad Ignorantiam (Argument from Ignorance) and the Continuum Fallacy (Line Drawing, Bald Man Fallacy).
It is at least disingenuous for Rubio and Kudlow to make these claims. And, given their backgrounds and training, it is more likely to be simply untruthful.
As a statistician, I see another aspect of these opinions which is being wholly neglected. Generally speaking, in order to make a rational decision about anything quantitative and complicated, the losses on either side of the question need to be considered. Indeed, the recent Nobel laureate in Economics, William Nordhaus, won his Nobel precisely for working to identify these costs and losses.
Given that specific, large losses are being realized, losses which are unprecedented in United States (and world!) history, even controlling for additional development, it would be open-minded and fair for a Rubio or a Kudlow to consider that their assessments of the benefits of inaction might be incorrect. And what this means is that the arguments and methods and processes which led to the recent UN report are demonstrating predictive strength, including attribution of cause.
A quick review of history shows Rubio and Kudlow are typical members of a pack of voices who, until recently, denied anything unusual was happening at all.
An article by Suilou Huang for catastrophe modeler AIR Worldwide of Boston about rejection sampling in CAT modeling got me thinking about pulling together some notes about sampling algorithms of various kinds. There are, of course, books written about this subject, including Yves Tillé’s Sampling Algorithms, 2006, Springer, which covers reservoir sampling in its section 4.4.5. Tillé does not cover rejection sampling or slice sampling, so I thought I’d put some notes and figures here, not so much to teach as for reference. And, to some degree, for balance.
This is in large measure a technical blog, although I have a lot of material about climate activism and promotion of solar energy. Sure, I very much care about these things, but I’m at heart a quantitative problem solver, with a fondness for engineering quantitative software, and, so, this blog doesn’t reflect me fully if I leave it at activism.
The theoretical underpinnings of many of these algorithms are recounted in C. P. Robert and G. Casella’s textbook Monte Carlo Statistical Methods, 2nd edition, 2004. Slice sampling receives an extended and worthwhile treatment in their Chapter 8. Robert and Casella also treat algorithms like importance sampling, not touched here, in their section 3.3, and rejection sampling in their section 2.3.
Huang covers rejection sampling pretty well. Fewer people know about slice sampling, so the remaining references will cover that.
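Still, for completeness, here is a minimal sketch of rejection sampling, not drawn from Huang’s article; the Beta(2, 2) target and uniform proposal are simply a convenient illustration:

```python
import numpy as np

def rejection_sample(target_pdf, draw_proposal, proposal_pdf, M, n, seed=None):
    """Rejection sampling: draw from target_pdf using a proposal density
    satisfying target_pdf(x) <= M * proposal_pdf(x) for all x.
    Each proposal x is accepted with probability target/(M * proposal)."""
    rng = np.random.default_rng(seed)
    out = []
    while len(out) < n:
        x = draw_proposal(rng)
        if rng.random() < target_pdf(x) / (M * proposal_pdf(x)):
            out.append(x)
    return np.array(out)

# Example: sample Beta(2, 2), whose pdf 6x(1-x) is bounded by 1.5 on [0, 1],
# using a Uniform(0, 1) proposal with envelope constant M = 1.5.
beta22 = lambda x: 6.0 * x * (1.0 - x)
samples = rejection_sample(beta22,
                           lambda rng: rng.random(),
                           lambda x: 1.0,
                           M=1.5, n=5000, seed=3)
print(round(samples.mean(), 2))   # near the Beta(2,2) mean of 0.5
```

The efficiency is 1/M, so the whole game is finding a proposal whose envelope constant is close to one.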
The idea was introduced by Professor Radford Neal of the University of Toronto in 2003 and it has undergone several innovations since. He offers R code to do it. There is an R package, MfUSampler, which offers various sampling algorithms under a supervisory framework, including slice sampling. These are intended to be used in the context of a Markov Chain Monte Carlo (MCMC) stochastic search, typically exploring a posterior distribution, but the cautions of Robert and Casella regarding adaptive sampling schemes for this purpose in their section 7.6.3, which they credit Neal for identifying, are worth a look.
There are also at least a couple of instances of code for Python, including nice tutorials by Kristiadi, and Professor Rahul Dave’s AM207 course notes from 2017.
Additional study revealed
M. M. Tibbits, M. Haran, J. C. Liechty, “Parallel multivariate slice sampling”, Statistics and Computing, 2011, 21(3), 415-430.
which, in turn, traces the origins of this idea back to
R. M. Neal, “Markov chain Monte Carlo methods based on ‘slicing’ the density function”, Technical Report, Department of Statistics, University of Toronto, 1997.
P. Damien, J. Wakefield, S. Walker, “Gibbs sampling for Bayesian non-conjugate and hierarchical models by using auxiliary variables”, Journal of the Royal Statistical Society, Series B (Statistical Methodology), 1999, 61(2), 331-344.
A. Mira, L. Tierney, “Efficiency and convergence properties of slice samplers”, Scandinavian Journal of Statistics, 2002, 29(1), 1-12.
There is also the doctoral dissertation of M. B. Thompson, “Slice Sampling with multivariate steps”, from the Graduate Department of Statistics, University of Toronto, 2011, under the supervision of Professor Radford Neal.
Also, note the R package MCMCpack uses slice sampling at selected points for implementations.
A more careful read of Professor Rahul’s page on slice sampling reveals the claim:
One of the methods that we’ll look at is Slice Sampling. Slice sampling was invented by Radfor Neal and John Skilling. It is a very elegant and simple way to solve the problems of MH and Gibbs sampling. As Pavlos would say, its conceptually so easy you would think his (very capable) grandma invented it …
First — and I’m admittedly nitpicking — Professor Neal’s first name is “Radford” not “Radfor”.
Second, and more seriously, I don’t see any evidence to justify the claim that Dr John Skilling co-invented slice sampling. Skilling did invent another kind of sampling, nested sampling, which has been useful in some applications, even though it has received serious criticism from some experts, technically addressed in:
N. Chopin, C. P. Robert, “Properties of nested sampling”, Biometrika, 2010, 97(3), 741-755.
E. Higson, W. Handley, M. Hobson, A. Lasenby, “Sampling errors in Nested Sampling parameter estimation”, Bayesian Analysis, 2018, 13(3), 873-896.
Skilling and MacKay did supply a comment on Neal’s original paper in its Discussion, proposing to use integer arithmetic for a special version of slice sampling. Neal addresses that in his Rejoinder.
Third, the discussion surrounding the Python code Professor Rahul offers is a bit weak, particularly regarding the multimodal case. This is unfortunate, because that’s the interesting one. This suggests I’m talking myself into doing a tutorial on slice sampling in R some time down the road, remedying these problems. When I do so, I should really include finding the area of the Batman shape as previously promised.
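Until that tutorial materializes, and purely as a sketch (in Python rather than the R I have in mind), here is the univariate version of Neal’s stepping-out-and-shrinkage procedure, which is the core of slice sampling:

```python
import math
import random

def slice_sample(logf, x0, n, w=1.0, rng=None):
    """Univariate slice sampler with Neal's stepping-out and shrinkage.

    logf: log of an (unnormalized) target density; x0: starting point;
    w: initial bracket width. Returns a list of n samples.
    """
    rng = rng or random.Random(0)
    x, out = x0, []
    for _ in range(n):
        # Draw the auxiliary "height" defining the horizontal slice.
        logy = logf(x) + math.log(1.0 - rng.random())
        # Step out a bracket [lo, hi] containing the slice.
        lo = x - w * rng.random()
        hi = lo + w
        while logf(lo) > logy:
            lo -= w
        while logf(hi) > logy:
            hi += w
        # Sample uniformly from the bracket, shrinking on rejection.
        while True:
            x1 = lo + (hi - lo) * rng.random()
            if logf(x1) > logy:
                x = x1
                break
            if x1 < x:
                lo = x1
            else:
                hi = x1
        out.append(x)
    return out

# Example: sample a standard normal via its log-density.
draws = slice_sample(lambda x: -0.5 * x * x, x0=0.0, n=4000)
print(len(draws))   # 4000
```

Note this unimodal example dodges exactly the multimodal case I complain about above; handling well-separated modes takes more care than the basic bracket can provide.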
UU Ministry for Earth description of his award.
In greater Boston, and in Massachusetts we are holding a rally and Meeting of Witness at the Moakley Federal Courthouse on the 29th of October, beginning at 11:30 a.m. sharp.
The latest news of the trial and progress is reported here.
In an amazing report, the Trump administration forecasts a +7°F rise in global temperatures by 2100, insisting nothing can be done to prevent it from happening. In the associated report, the administration claimed that the deep cuts in emissions needed to prevent this outcome “would require substantial increases in technology innovation and adoption compared to today’s levels and would require the economy and the vehicle fleet to move away from the use of fossil fuels, which is not currently technologically feasible or economically feasible.”
Details at the Washington Post.
Quoting the pertinent section:
The emissions reductions necessary to keep global emissions within this carbon budget could not be achieved solely with drastic reductions in emissions from the U.S. passenger car and light truck vehicle fleet but would also require drastic reductions in all U.S. sectors and from the rest of the developed and developing world. In addition, achieving GHG reductions from the passenger car and light truck vehicle fleet to the same degree that emissions reductions will be needed globally to avoid using all of the carbon budget would require substantial increases in technology innovation and adoption compared to today’s levels and would require the economy and the vehicle fleet to substantially move away from the use of fossil fuels, which is not currently technologically feasible or economically practicable.
[From T. A. Carleton, S. M. Hsiang, “Social and economic impacts of climate”, Science, 9 September 2016.]
Horatio Algeranon | April 28, 2014 at 6:54 pm |
“What A Carbonful World” — Horatio’s version of What a Wonderful World (written by Bob Thiele and George Weiss and made famous by Louis Armstrong)

I see trees of brown,
Hockey sticks too.
I see dim gloom for me and you.
And I think to myself,
what a carbonful world.

I see Hadley CRU,
And sea-ice flight.
The blistering day,
The hot muggy night.
And I think to myself,
What a carbonful world.

The ppm’s of carbon,
Increasing in the sky.
Are warming all the faces,
Of people who will die,
I see storms shaking hands.
Saying, “How do you do?”
They’re really saying, “I’ll get you”.

I hear Stevies cry,
I watch them blow,
They’ll learn much less,
Than I already know.
And I think to myself,
What a carbonful world.
Yes, I think to myself,
What a carbonful world.
Oh yeah.
Harrison Ford‘s speech, at the Climate Action Summit, below:
“… Don’t forget Nature.”
“If we don’t stop the destruction of our natural world, nothing else will matter.”
“We need to include Nature in every corporate, state, and national climate goal.”
Executive Summary report: