By Carl Safina.
I have made an important update to an earlier post here, Getting back to 350 ppm CO2: You can’t go home again.
The message, essentially based upon recent work by Tokarska and Zickfeld on the one hand, and by the Global Carbon Project on the other, makes the calculation of geoengineering through clear air capture of CO2 far more pessimistic than even the whopping cost numbers suggested before. (Thanks to Glen Peters for pointing me to the annual Carbon budget compiled by the Global Carbon Project.)
See the post, but, in short, I forgot to account for CO2 “dissolved” in oceans and terrestrial ecosystems, which will come back into the atmosphere to restore pCO2 equilibrium once atmospheric concentrations are reduced.
You really can’t go home again ….
This all also reminds me of something I have written before. Forget the monuments, and the civilization, and the going to the Moon or Mars, and the libraries. In the long run, humanity’s legacy to Earth and the Universe will be the Carbon Dioxide we are emitting into the climate system. No other single action we can imagine doing in the foreseeable future will have such a widespread or a long-lasting impact. Right now, we are Carbon Dioxide.
There is an excellent piece in Ars Technica about why scientific measurements need to be adjusted, and the implications of this for climate data. It is written by Scott K Johnson and is called “Thorough, not thoroughly fabricated: The truth about global temperature data.”
Mr Johnson writes:
… In fact, removing these sorts of background influences is a common task in science. As an incredibly simple example, chemists subtract the mass of the dish when measuring out material. For a more complicated one, we can look at water levels in groundwater wells. Automatic measurements are frequently collected using a pressure sensor suspended below the water level. Because the sensor feels changes in atmospheric pressure as well as water level, a second device near the top of the well just measures atmospheric pressure so daily weather changes can be subtracted out.
If you don’t make these sorts of adjustments, you’d simply be stuck using a record you know is wrong.
This is the kind of thing that’s learned in Physics and Chemistry classes in high school these days. (Well, at least AP Physics and Chemistry, not to mention Statistics.)
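Mr Johnson’s groundwater-well example can be sketched in a few lines. The figures below are illustrative assumptions of mine, not values from his article:

```python
# Sketch of the groundwater-well adjustment described above (illustrative
# numbers only): a submerged sensor reads total pressure, a surface sensor
# reads atmospheric pressure, and the difference gives the water column.

RHO_WATER = 999.0   # kg/m^3, fresh water (assumed)
G = 9.81            # m/s^2

def water_level_m(total_pressure_pa, atmospheric_pressure_pa):
    """Height of water above the submerged sensor, after subtracting
    the atmospheric 'background' the sensor also feels."""
    return (total_pressure_pa - atmospheric_pressure_pa) / (RHO_WATER * G)

# A passing weather system changes atmospheric pressure; the adjusted
# water level stays put even though the raw sensor reading moved.
calm  = water_level_m(131_325.0, 101_325.0)   # about 3.06 m of water
storm = water_level_m(129_325.0,  99_325.0)   # same column, lower barometer
```

Without the subtraction, the “water level” would appear to fall every time a storm rolled through.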
Mr Johnson provides a nice sketch of the several datasets used to estimate Earth surface temperature data. A similar story attends sea-surface temperatures, which have their own dramas, also described here, from water inadvertently heated by ships’ engines, which Mr Johnson mentions, to thermal bias and microcode errors in measurement instruments.
These are experimental adjustments, made for good reason. There are also statistical adjustments which can improve representations of datasets, like smoothing, which I have written about earlier.
But the point is, many people, encouraged by a sound-bite-oriented media, don’t know about or understand these complications, and so it is easy for people like Representative Lamar Smith to prey on their ignorance. Is it his fault? Partly. But it’s also the fault of a public which embraces representative democracy but doesn’t “want to go to school and learn their lessons” well enough to be able to fulfill their responsibility.
Interesting that Dr Schmidt has some gentle criticism of the PBS program NOVA.
Whole talk is here:
This was posted in February 2015.
It’s heading towards year’s end, so it’s natural to think about perspective.
In a post from last July, Joseph Heath asks semi-rhetorically, “Why are [proposed] carbon taxes so low?”, and then he and commenters go on to answer that, essentially, the cost of damage is discounted to the present to obtain estimates of the Social Cost of Carbon.
Except that none of the methodologies I see incorporates the full cost of perhaps someday needing not only to decarbonize but to do clear air capture of carbon dioxide, sequestering it (effectively) permanently. They are making estimates of damage from climate disruption.
However, the costs of clear air capture include an up-front cost of decarbonizing first, since capture is two or three times as expensive if we continue to pollute.
See my blog post where the estimate for one such exercise puts the price at US$1800 trillion in constant 2010 dollars. Even if there is no inflation in the price, and one additionally applies a discounting rate of 4%, after 100 years that’s still roughly US$36 trillion. Worse, the scenario is sensitive to when we start, to where we want to reduce to, and to whether or not emissions are first zeroed, let alone invoking a technology we do not yet have. This is why my view has now aligned strongly with those of Glen Peters and Kevin Anderson.
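For the curious, the discounting arithmetic is a one-liner. The US$1800 trillion figure and the 4% rate are from the post; the rest is just a sketch:

```python
# Back-of-envelope discounting of the clear air capture price tag
# (figures assumed from the post: US$1800 trillion, 4% discount rate).

def present_value(future_cost, rate, years):
    """Discount a cost incurred `years` from now back to the present."""
    return future_cost / (1.0 + rate) ** years

pv = present_value(1800e12, 0.04, 100)
# Comes out near US$36 trillion: still enormous after a century of discounting.
```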
I also, and personally, am very pessimistic about the wealthy nations (OECD) of the world doing enough, fast enough, in changing their behavior and their economies to make a significant difference. And I say that despite being extremely enthusiastic about the potential of zero Carbon energy technology, especially solar.
The OECD countries will eventually do it, because:
What kind of substantial things am I doing?
And, while I will engage with people online and elsewhere stating and shouting incorrect things regarding the environment, or climate science, or zero Carbon energy, or who is responsible for all this, I am disengaging emotionally, because it does not matter. Science and engineering facts, on the other hand, do matter, and are worth defending with some ferocity. These are the only hold we have on reality, as opposed to a so-called reality TV show.
And I am sorry that the people who, at least initially, are being hurt and harmed by climate disruption are the people who have the least responsibility for the problem. I cannot control people in my world so that they begin to shed the behaviors responsible. I can entice them with the wonders and efficiencies of zero Carbon energy. And while climate justice is important, I fear it can be counterproductive compared to, say, campaigns to boycott use of fossil fuel energy. Pursuing climate justice might be simply a nice, and very white, way of soothing piqued consciences, one which stops further progress where it is more important.
Whether it is to be Utopia or Oblivion will be a touch-and-go relay race right up to the final moment…. Humanity is in ‘final exam’ as to whether or not it qualifies for continuance in Universe.
Hat tip to … And Then There’s Physics for the motive to write this post.
From Katharine Hayhoe, whom I deeply respect, and from John Cook (*), scientists and the quantitative community have been scolded that the reason they don’t make headway with the public and the science-denier community is that their explanations are too quantitative, too wrapped up with physical processes, models, and mechanisms. Instead, some of these experts at communication argue, stories should be told, stories which reach across what was once called the two-cultures divide.
Yet, even when that is pursued, voices familiar with the quantitative and with what, to them, are sounds shrieking danger at its highest (Hosanna!), find themselves adrift in a murky sea of counter-stories.
Nothing. They will not listen.
And, to me, the only remaining event to which people might pay attention is if, on otherwise fine days along the East coast of the United States, people with expensive properties, in Miami Beach, in Boston, in the Carolinas and Maryland, find those properties suddenly awash in salty water, twice a day, their property values rushing towards zero, beyond rescue by insurance, or FEMA, or the Biggert-Waters Act.
Then, as the great doctor Neil deGrasse Tyson points out, when people with wealth begin losing that wealth, they may want to pay attention.
Dr Tyson’s patient waiting is where I feel I am these days. I am tired of trying to communicate with people who just don’t want to listen, even if they know how.
And, then, there’s the vital, defiant spirit of Governor Jerry Brown, of California:
One of the vices in the spiritual life is called tepidity. We’ve had a lot of tepid climate fighters, people who are not really telling the full truth. But there is a paradoxical benefit when someone takes to an absurd length a completely erroneous position. That so unmasks the error that it allows everyone else to refute it.
I’m not discouraged. I can’t think of anywhere else to be than in climate science today. Fights are fun. And this fight is big. And it’s gonna be attractive, and it’s gonna take a lot of smart people.
(*) I took Dr Cook’s course. It was fine as far as it went. But its suggestions for moving forward were, politely put, anemic.
Just about a year ago, our home in Westwood began a march towards zero Carbon consumption, with heating, hot water heating, and even lawn mowing all converted to high efficiency electricity. As indicated at the time, our main automobile, a Toyota Prius, remained the major obstacle. We also have an old 2005 Toyota Corolla which is used, basically, to drive me 5 miles (tops) to the train station (and back 5 miles) for the 2-3 days I go into the office for work. (I work from home the other 2-3 days.)
Today, Claire took delivery of a leased 2017 Chevrolet Volt LT Hatchback, from Muzi Chevrolet in Needham Heights, with Rob Roderick providing the excellent introduction and service. Seth Fletcher of Scientific American calls the Volt “impressively unremarkable.”
The Volt is a circuit board with electric motors mounted atop, and a gasoline-powered electrical generator under its hood. It has all the pep you’d expect of electric drive, and the generator helps navigate a region still sparse in charging stations, given a limited 60-80 mile electric range. The Chevy Bolt promises to be better, if a bit more expensive.
Ours is a 39 month lease. We’ll see what time brings.
Thanks to Paul Lauenstein for the tip.
Owners manual for the curious.
When knowledge conquered fear …
And, what better way to celebrate than watching the National Geographic Cosmos episode, When knowledge conquered fear, hosted by the great Dr Neil deGrasse Tyson, Director of the Hayden Planetarium in New York City.
The National Center for Atmospheric Research (“NCAR”) reports on a newly substantiated teleconnection between positive sea surface temperature anomalies (“SSTA”) in the Pacific and the temperatures over the continental United States (“CONUS”) 50 days later. A teleconnection is:
A linkage between weather changes occurring in widely separated regions of the globe.
as defined by the American Meteorological Society (of which I am a member).
The basic evidence (but see the NCAR post) is in the following figure:
(Click on image to see larger figure, and use browser Back Button to return to blog.)
That’s modest warming. The teleconnection predicts that will influence temperatures in CONUS in the second week of February, 2017.
Those are hot. And you’ll note that large dark blue blob at the top right … a cold Arctic outflow. That impedes the Gulf Stream’s flow northwards, slows it down, and piles the waters up against the Northeast, contributing to sea level rise there.
And they who will not be ready, will suffer the economic consequences.
(That water is a foot deep, previously reported in a post here. Click on image to see larger picture, and use your browser Back Button to return to this blog.)
Both articles from the great Ars Technica.
More links to and comments on the same event:
- San Francisco Chronicle
- From the office of California’s governor
- From Climate Denial Crock of the Week
- On the Carbon Lobby and the Trump Gang
An interesting discussion of what a Trump administration could and could not do to repositories of climate data. The possibility of this kind of thing, and the need to retain control of data provenance and the sequence of transactions applied to data, is a reason why different flavors of trusted timestamping might be a good idea for all these sources, including blockchain techniques. It’s not clear the courts and legal systems are up to trusting them yet. They do trust public-private key cryptography, due to changes in law. But Science might trust them.
And I think the various kinds of manipulations to which geophysical and oceanographic data are subjected would pose a challenge to archivists and blockchain technologists, particularly when large sets are combined using a specific algorithm, embodied in code, to produce a result. It seems to me the code itself needs to be joined in the chain, too.
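The idea of chaining the processing code together with the data can be sketched quite simply. This is a minimal hash chain, not any particular blockchain, and the dataset, code, and result below are entirely hypothetical:

```python
# A minimal sketch of chaining data and the processing code together,
# so the provenance of a derived result covers both.
import hashlib

def link(prev_hash: str, payload: bytes, label: str) -> dict:
    """One entry in a hash chain: its digest commits to the previous
    entry, so tampering anywhere breaks every later link."""
    digest = hashlib.sha256(prev_hash.encode() + payload).hexdigest()
    return {"label": label, "prev": prev_hash, "hash": digest}

raw_data   = b"station,temp\nBOS,4.2\n"          # hypothetical dataset
code       = b"def adjust(t): return t - 0.1\n"  # hypothetical algorithm
genesis    = link("0" * 64, raw_data, "raw data")
code_entry = link(genesis["hash"], code, "processing code")
result     = link(code_entry["hash"], b"BOS,4.1\n", "derived result")
```

Anyone holding the chain can recompute each digest and detect whether the data, the code, or the result was altered after the fact.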
The quotation is portrayed at the very end of the 1970 film Tora! Tora! Tora! as:
I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve
Snapshots from the American Geophysical Union Fall Meeting.
Above, California Governor Brown gave a rousing and defiant talk that was well received.
Below, updated animation from the Arctic Report Card.
The year showed “a stronger, more pronounced signal of persistent warming than any other year in our observation record,” he said.
These changes have had considerable impacts…
Your CO2, my CO2 doesn’t remain with you or me, but mixes broadly and thoroughly over the planet at large.
So, we all share responsibility for the damage.
Credit: NASA And brought to you by OCO-2.
David Puttnam (yes, the producer-director) has a very moving appeal on climate:
Hat tip to Tamino.
President Lyndon Johnson was the first to receive a briefing regarding the looming crisis presented by abrupt climate change. That was in 1965. And we’ve been “waiting for more substantial evidence” ever since.
It’s pretty clear some people’s wishful thinking means they won’t change their minds until they witness a sufficiently severe and unprecedented natural demonstration, and that threshold will differ from person to person. Unfortunately, by the time this happens, or they succumb to acceptance, doing something about it will be horrifically expensive, and all the more expensive because wealth will be being destroyed by these changes in anything but a gradual manner.
My advice to kids? Get the most education, the best training you can, in the deep sciences, or engineering, or especially mathematics or medicine, and insulate yourself as much as you can.
To the aware us? We need to continue to fight for a human future.
And to the people who doubt, deny, delay, profit … you won’t have to answer to me: Nature will take your wealth, whether from you, or from your children. But you do have a lot of forgiveness to ask, of the millions around the world who suffer because of your pursuit of comfort.
Behold, this was the sin of your sister Sodom: she and her daughters had pride, surfeit of food, and prosperous ease, but did not aid the poor and the needy.
One nice thing about having a net metered solar PV array is that, with a little diligence, you can figure out how much electricity your household is consuming each day, or at finer resolution if you like (*). Below is the report of that from mid-May through early December 2016 for ours. Recall we have a zero Carbon home, but no EV (see also). We heat and cool the house with air source heat pumps (“ductless minisplits”), and heat hot water with an air source heat pump and sometimes a hybrid electric element (a 50 gallon GE GeoSpring hot water heater, the hybrid element being a more efficient way, at times, of using electric energy).
This is the consumption as calculated by the change in meter readings day-over-day, added to solar generation. The change in meter readings can be negative when we give back to the grid more than we use; that’s when we earn towards a snowy or rainy day. Not all days were recorded, but when they were not, the total energy consumed and generated in the gap was tallied, and an average energy per day used to impute the consumption.
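The bookkeeping just described can be sketched as follows. The numbers are made up for illustration; only the method (meter delta plus generation, with gaps imputed at the gap’s daily average) is from the post:

```python
# Sketch of the daily-consumption bookkeeping: consumption is the
# day-over-day change in the net meter (negative when we export more
# than we use) plus solar generation, with unrecorded gaps imputed
# at the gap's average energy per day.

def daily_consumption(meter_deltas, generation):
    """meter_deltas[i]: change in net meter reading on day i (kWh, may be
    negative); generation[i]: solar generation on day i (kWh)."""
    return [d + g for d, g in zip(meter_deltas, generation)]

def impute_gap(total_delta, total_generation, n_days):
    """Spread a multi-day unrecorded gap evenly across its days."""
    per_day = (total_delta + total_generation) / n_days
    return [per_day] * n_days

three_days = daily_consumption([-3.0, 5.0, 1.5], [20.0, 8.0, 12.0])
gap = impute_gap(total_delta=-4.0, total_generation=64.0, n_days=4)
```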
The Thanksgiving Day spike is pronounced. Not a surprise: Claire did a bunch of baking, and roasted a turkey breast for guests. (We’re vegetarians.) We have an electric induction stovetop, and a conventional big oven. We typically use a small convection oven and our microwave really gets used a lot.
We have a dishwasher which is run nearly every day, and we shower every other day or so, depending upon our athletic exercise schedules.
Our lights are now mostly LEDs with a couple of lamps still using fluorescents. We do put up a small number of holiday lights.
Our washer and dryer are electric, and most of the year Claire air dries clothes on our deck, using the dryer for fluffing.
I’ve encountered a number of blog posts this week which seem not to understand the Bias-Variance Tradeoff in regard to Mean-Squared-Error. These arose in connection with smoothing splines, which I was studying in connection with multivariate adaptive regression splines, which are actually something different from smoothing splines. (I will have a post here soon on multivariate adaptive regression splines, or the earth procedure, as it’s called.)
The general notion some people seem to have is that smoothing splines throw away information and introduce correlation where there isn’t any, and that this distorts scientific data. A particularly obnoxious example of this is at science denier William Briggs’ blog. Another, milder instance is at a blog post by a blogger called “Joseph” who specializes, he says, in “A closer look at scientific data and claims, with an emphasis on anthropogenic global warming.” I was going to put in a comment at the blog, but apparently comments there are closed, or at least no longer work. (Neither do some links to data from that post.) So, instead, I’m putting it here. I already answered a question at Stats StackExchange which invoked Briggs.
Smoothing is not about making a picture nicer, nor about losing information. It is about the bias-variance tradeoff. Given that minimizing mean squared error in fitting data with a non-parametric (or, for that matter, any) model is important, introducing a bias into a model, such as the smoothing in a spline, can reduce variability and, so, reduce the overall mean squared error of a fit.
The Wikipedia page shows the connection with bias and variance, and the proof of their relationship.
It was an important finding by Stein in 1955, which gave rise to deliberately introducing some bias via things like James-Stein estimators in order to improve overall performance. Prior to Stein’s insight, classical statistics only considered unbiased estimators, and that insight showed that procedures like maximum likelihood estimation were not optimal, even if they work well a lot of the time.
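Stein’s insight is easy to demonstrate by simulation. The setup below is mine, not from any of the posts discussed: shrinking the ordinary estimator toward zero introduces bias, but cuts variance enough to lower total mean squared error (recall MSE = bias² + variance) whenever the dimension is at least three:

```python
# A small simulation of Stein's insight: the positive-part James-Stein
# estimator deliberately biases the ordinary (maximum likelihood)
# estimator of a multivariate normal mean, yet achieves lower MSE.
import random

random.seed(1)
P, TRIALS = 10, 2000
THETA = [0.5] * P                      # true means, modest in size (assumed)

mse_mle = mse_js = 0.0
for _ in range(TRIALS):
    x = [t + random.gauss(0.0, 1.0) for t in THETA]   # X ~ N(theta, I)
    norm2 = sum(v * v for v in x)
    shrink = max(0.0, 1.0 - (P - 2) / norm2)          # positive-part James-Stein
    js = [shrink * v for v in x]
    mse_mle += sum((v - t) ** 2 for v, t in zip(x, THETA))
    mse_js  += sum((v - t) ** 2 for v, t in zip(js, THETA))

mse_mle /= TRIALS
mse_js  /= TRIALS
# mse_mle comes out near P = 10; mse_js comes out well below it.
```

The unbiased estimator’s MSE hovers near the dimension, 10, while the deliberately biased estimator does substantially better, which is exactly the point about smoothing splines.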
And, accordingly, “Joseph”‘s criticism of the Law Dome CO2 data is not well founded. I bring his and the reader’s attention to a paper co-authored by Etheridge, one of the co-authors of the Law Dome work, about why smoothing splines are used.
Note mean-squared-error is disguised in various powerful measures of model fit, like the Akaike Information Criterion.
Update, 2016-12-27: Smooth, yes, but don’t ever expect to see the smoothed curve realized
While the smoothed version of a series can and often does provide an estimate with the least mean-squared-error, if properly chosen, it is a different question whether the presentation of such a smoothed curve is the best to convey the series, especially if communicating with the statistically uninitiated. The smoothed version of a curve is an idealization, intended for purposes of forecasting, or prediction (they are not the same), and sometimes for helping to tease out physical mechanisms giving rise to the observed phenomenon.
For one thing, the smoothed or idealized curve has zero probability of actually being realized, even on the span of support for which it is calculated. Actual realizations of the observed series will have excursions from the smooth, guided by the distribution of its residuals, and seeing these excursions is entirely part of the series. Moreover, were it possible to draw another realization of the series, a different set of excursions would be applied.
For another, the general public does not seem to get the idea of a data series with random excursions atop a pattern, and appears to approach these matters as if they were entirely deterministic. That’s a very classical kind of notion: The Watchmaker’s Universe. In this view, the only reason why phenomena are not perfectly predicted is that we have but imperfect knowledge of the science involved, or of Nature, or something, and only a Deity knows these (notwithstanding the Deity knowing what all individuals will choose if Free Will is posited). A different, more modern view is that even a Deity cannot predict perfectly how another realization of these stochastic phenomena will play out.
So, the best way to communicate this variability to me is to present the observed data from the series, present the smoothed realization, and then present a cloud or ensemble of draws from the smoothed curve with excursions governed by residuals atop of it. For example,
(Click on image to see larger figure, and use browser Back Button to return to blog.)
By the way, the example above shows two competing models for the smooth to the data.
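The suggested presentation can be sketched as follows. The data are a toy series of my own, and an ordinary least squares line stands in for a proper smoother: fit the smooth, keep the residuals, then lay resampled residuals back atop the smooth to draw an ensemble of plausible realizations:

```python
# Sketch of the ensemble-of-draws presentation (toy data, simple linear
# smooth standing in for a smoothing spline).
import random

random.seed(7)
xs = [i / 10.0 for i in range(50)]
ys = [2.0 + 0.5 * x + random.gauss(0.0, 0.3) for x in xs]  # toy observed series

# Ordinary least squares line as a stand-in smoother.
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar
smooth = [intercept + slope * x for x in xs]
residuals = [y - s for y, s in zip(ys, smooth)]

# Ensemble: each draw is the smooth plus residuals resampled with replacement.
ensemble = [[s + random.choice(residuals) for s in smooth] for _ in range(20)]
```

Plotting the observed series, the smooth, and the twenty pseudo-realizations together conveys both the pattern and the variability around it.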
If dependent data are to be emphasized, then using ensembles of tracks, such as the reasonably famous hurricane track plots, is useful:
(Click on image to see larger figure, and use browser Back Button to return to blog.)
A retrospective, something we all now need. Remember:
Well, today, we reached a landmark with our 10 kW solar array. The numbers aren’t completely in yet (*), since we don’t have a total for electrical energy consumption, but 93% of the days this year were powered by the energy generated by our 10 kW rooftop solar installation consisting of SunPower panels installed by RevoluSun.
This, admittedly, involves using the facilities of Eversource as a big energy storage facility, via net metering. It’s possible that, in the future, that role will be provided by someone or something else.
In addition, our nearly 10 MWh of generation this year will produce nearly 10 SRECs, which earn, per the Renewable Portfolio Standard incentives, an additional $2600 of income.
And, note, we are heating, cooling, and heating hot water all with high efficiency zero Carbon energy.
The additional 25 days’ worth of energy overage we take comes from wind farms, by our choice, paying a premium above the base rate from Eversource.
Not too shabby. Not too shabby at all.
And, yeah, we’re crazy about doin’ this. It’s awesome.
Beat that, fossil fuels.
You don’t need a Carbon Tax to be incentivized to do this. This is profitable right here and now. And this is snowy, cold Massachusetts. Think of what an Oklahoma could do?
“The Army has determined that additional discussion and analysis are warranted in light of the history of the Great Sioux Nation’s dispossessions of lands, the importance of Lake Oahe to the Tribe, our government-to-government relationship, and the statute governing easements through government property.”
November 14, 2016 Moira Kelley (DOA) 703-614-3992, firstname.lastname@example.org
Jessica Kershaw (DOI), email@example.com
Washington, D.C. — Today, the Army informed the Standing Rock Sioux Tribe, Energy Transfer Partners, and Dakota Access, LLC, that it has completed the review that it launched on September 9, 2016. The Army has determined that additional discussion and analysis are warranted in light of the history of the Great Sioux Nation’s dispossessions of lands, the importance of Lake Oahe to the Tribe, our government-to-government relationship, and the statute governing easements through government property.
The Army invites the Standing Rock Sioux Tribe to engage in discussion regarding potential conditions on an easement for the pipeline crossing that would reduce the risk of a spill or rupture, hasten detection and response to any possible spill, or otherwise enhance the protection of Lake Oahe and the Tribe’s water supplies. The Army invites discussion of the risk of a spill in light of such conditions, and whether to grant an easement for the pipeline to cross Lake Oahe at the proposed location. The Army continues to welcome any input that the Tribe believes is relevant to the proposed pipeline crossing or the granting of an easement.
While these discussions are ongoing, construction on or under Corps land bordering Lake Oahe cannot occur because the Army has not made a final decision on whether to grant an easement. The Army will work with the Tribe on a timeline that allows for robust discussion and analysis to be completed expeditiously.
We fully support the rights of all Americans to assemble and speak freely, and urge everyone involved in protest or pipeline activities to adhere to the principles of nonviolence.
Army POC: Moira Kelley (703) 614-3992, firstname.lastname@example.org
The Department of the Army will not approve an easement that would allow the proposed Dakota Access Pipeline to cross under Lake Oahe in North Dakota, the Army’s Assistant Secretary for Civil Works announced today.
Jo-Ellen Darcy said she based her decision on a need to explore alternate routes for the Dakota Access Pipeline crossing. Her office had announced on November 14, 2016 that it was delaying the decision on the easement to allow for discussions with the Standing Rock Sioux Tribe, whose reservation lies 0.5 miles south of the proposed crossing. Tribal officials have expressed repeated concerns over the risk that a pipeline rupture or spill could pose to its water supply and treaty rights.
“Although we have had continuing discussion and exchanges of new information with the Standing Rock Sioux and Dakota Access, it’s clear that there’s more work to do,” Darcy said. “The best way to complete that work responsibly and expeditiously is to explore alternate routes for the pipeline crossing.”
Darcy said that the consideration of alternative routes would be best accomplished through an Environmental Impact Statement with full public input and analysis.
The Dakota Access Pipeline is an approximately 1,172-mile pipeline that would connect the Bakken and Three Forks oil production areas in North Dakota to an existing crude oil terminal near Patoka, Illinois. The pipeline is 30 inches in diameter and is projected to transport approximately 470,000 barrels of oil per day, with a capacity as high as 570,000 barrels. The current proposed pipeline route would cross Lake Oahe, an Army Corps of Engineers project on the Missouri River.
Statement from Attorney General Loretta Lynch:
A little history, from The Daily Show:
Energy Transfer Partners is the company that is building the Dakota Access Pipeline. Their reaction in part reads:
For Energy Transfer Partners, which says the 1,170-mile pipeline is 92 percent complete, the move smacked of politics. In a statement Sunday night, the company said the “further delay is just the latest in a series of overt and transparent political actions by an administration which has abandoned the rule of law in favor of currying favor with a narrow and extreme political constituency.”
The “rule of law.” What law? A law that has bought and sold jurists to decide against indigenous peoples of North America for two centuries? A law that has enabled stripping of forests and justified destruction of animals, of habitats, of systems upon which we, collectively and ultimately, depend? What constituency? One that is hurling itself headlong off a cliff, so it can get to work five minutes faster, to earn money to Buy More Junk in a season rationalized by appeal to a mere story about a deity who justifies that constituency’s mistreatment of other peoples and the land and of Nature because, well, the deity wouldn’t come here otherwise, would he?
Extreme? Energy companies who haphazardly subject their employees and their families to cancer? And to dangers of mine collapse? And risks of explosion? Who put delivery of energy above life? Energy forms we don’t even need?
Am I angry?
And I will delight in the day when the enablers, the stockholders of these companies, lose their shirts when their assets are stranded because they’ve been beaten by technology in the open market. That will happen, no matter what party is in control or who is in the White House. (It may not happen fast enough to save our grandchildren, but, hey, a big chunk of the American public has shown they don’t give a flea’s ass about that.) And the leaders of these companies? They’ll go off to some off-the-Florida-coast island and live out their days on the legal profits of their scam, at the stockholders’ expense. Do I care? No. The stockholders deserve every moment of fear and discomfort.
Re: Meredith Fowlie, “Climate change and the post-election blues”, from The Energy Institute, BerkeleyHAAS
My only comments regard Dr Fowlie’s LCoE analysis. While correct from its perspective, LCoE depends upon the vantage point from which cost efficiency is assessed. For example, because residential PV is generated close to the consumption point, it avoids Sankey inefficiencies from upstream, primarily conversion losses when stepping voltage up and down. So, from the perspective of cost of energy, there is a benefit to local generation. Note most wind generation does not have this efficiency.

The other efficiency which an “at delivery point” LCoE fails to see is use of capital. In particular, private capital is being deployed to construct residential PV and, to some extent, wind. Now, one can argue that capital costs of wind are recovered from ratepayers, but in the case of solar PV, unless some of those incentives, like the ITC, are factored into the CoE locally, seen as rewards for putting up capital, the price to the consumer using the PV is exaggerated. If they are not included, it seems the social benefit of not having to raise or bear the cost of capital for that portion of generation ought to be reflected as well.
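The vantage-point effect is easy to make concrete. All numbers below are hypothetical; the point is only that counting delivered rather than generated energy, i.e. applying upstream conversion and transmission losses, raises the effective cost of remote generation relative to rooftop PV at the load:

```python
# A bare-bones LCoE sketch (all numbers hypothetical).

def lcoe(capital, annual_cost, annual_kwh, years, discount):
    """Levelized cost of energy: discounted costs over discounted output."""
    costs = capital + sum(annual_cost / (1 + discount) ** t
                          for t in range(1, years + 1))
    energy = sum(annual_kwh / (1 + discount) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Same plant, two vantage points: at the busbar, and at the delivery point
# after an assumed 6% loss from stepping up, transmitting, and stepping down.
at_busbar   = lcoe(20000, 200, 12000, 25, 0.05)
at_delivery = lcoe(20000, 200, 12000 * 0.94, 25, 0.05)
# at_delivery exceeds at_busbar by the loss factor, about 6.4% here.
```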
I am living in a very blue state. The graph below charts Google searches for “stages of grief”. The spike in grief-stricken web/soul searching corresponds with, you guessed it, the 2016 election. The map shows where, in the days following the election, these searches were happening. Not surprisingly, post-election blues show up disproportionately in blue states.
Many of us who are feeling blue about what a Trump presidency could usher in (or throw…
I heard about this study earlier this year, and queued it up for a careful examination. I got to that today. The article is:
A R Brough, J E B Wilkie, J Ma, M S Isaac, D Gal, “Is Eco-Friendly unmanly? The Green-Feminine stereotype and its effect on sustainable consumption”, Journal of Consumer Research, December 2016, 43(4) 567-582.
It’s not a study I would draw deep conclusions from, and I find their generalizations unwarranted.
The scholars draw overly strong conclusions from the limited data in hand. For example, they write:
… [W]e provide the first experimental evidence of the implicit cognitive association between the concepts of greenness and femininity (study 1), and show that this association can affect both social judgments (study 2) and self-perception (study 3) among both men and women. Focusing on the downstream consequences of this green-feminine stereotype, studies 4-6 suggest that as a result of gender identity maintenance, gender cues (e.g., those that threaten or affirm a consumer’s gender identity or that influence a brand’s gender associations) are more likely to affect men’s (vs. women’s) preferences for green products and willingness to engage in green behaviors.
Further, they have the audacity to claim “More generally, our findings also add to a growing body of research pointing to a link between identity and consumers’ tendency to engage in sustainable behavior.” What “growing body of research?” A correlation-only-based study like
J A Lee, S J S Holden, “Understanding the determinants of environmentally conscious behavior,” Psychology and Marketing, 1999, 16(5), 373-92?
Apart from the sampling issues (Mechanical Turk? Really?), there is the wholesale neglect of corrections for repeated use of the same population across successive tests (even if this passes a `smell test’ in their field and, obviously, satisfied their peer reviewers), and there is the failure to estimate in-sample versus out-of-sample effects through some kind of bootstrap or cross-validation. For all we know, then, these conclusions are limited to the samples the scholars took. Since Turk was used in most of the tests, it would not have hurt to repeat each study with another draw from the general population, or to see how much the p-values varied across matched subsets of the samples they had. In the very first study, the scholars reported a p-value less than 0.001. That is extraordinary with a sample size of only 127.
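One cheap check the authors could have run: hold the design fixed, resample their 127 subjects with replacement, and watch how much the p-value moves. A minimal sketch, with simulated ratings standing in for the unavailable Study 1 data (the group sizes, means, and spreads here are my assumptions, not theirs), and a permutation test standing in for whatever test they actually used:

```python
import random
import statistics

random.seed(1)

# Hypothetical data standing in for the paper's Study 1 (n = 127):
# ratings under two conditions, with a modest true difference.
group_a = [random.gauss(5.0, 1.5) for _ in range(64)]
group_b = [random.gauss(4.3, 1.5) for _ in range(63)]

def perm_pvalue(a, b, n_perm=500):
    """Two-sided permutation test on the difference of means."""
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:len(a)]) -
                   statistics.mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one to avoid p = 0

# Baseline p-value on the sample in hand ...
p0 = perm_pvalue(group_a, group_b)

# ... and its spread when the same 127 subjects are bootstrapped.
pvals = []
for _ in range(20):
    ra = [random.choice(group_a) for _ in group_a]
    rb = [random.choice(group_b) for _ in group_b]
    pvals.append(perm_pvalue(ra, rb))

print(p0, min(pvals), max(pvals))
```

If the spread between `min(pvals)` and `max(pvals)` straddles the significance threshold, a single reported p < 0.001 on n = 127 deserves skepticism.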
(Major update of this piece included below.)
You can’t. It’ll cost much more than 23 times the annual Gross World Product to do it.
And, in any case, you need to go to where you need to be to avoid the problem in the first place.
But I get ahead of myself …
[A qualification: The techniques described here are limited to those where the technology is identified well enough to be able to assign a cost estimate per tonne of CO2 for removal. I did not treat any speculative, as yet undeveloped techniques.]
I won’t repeat here why CO2 cannot reasonably be scrubbed by natural processes on any time scale which matters to people, or why the only way to stop the increase in its concentration in atmosphere is to zero all CO2 emissions and related ones, like methane (CH4), which decompose into CO2. This is purely a look at the economic feasibility of doing something after the fact, should people decide, collectively, that emitting greenhouse gases at a rate faster than at any time in the last hundred million years or so was a bad idea. And I won’t address how long it would take Earth to get sane again once such a project succeeded. Needless to say, there are time lags involved, and anyone with experience shooting skeet knows well what happens if time lags are not considered during the exercise.
To begin with, the idea of clear air capture or direct capture of carbon dioxide is explained and argued by the great oceanographer and climate scientist, Professor Wally Broecker, in an article titled “Does air capture constitute a viable backstop against a bad CO2 trip?” Broecker concludes in that article
Because of this very wide range, it is widely believed that the cost would lie somewhere in the middle, leading to a consensus cost of about 600 dollars a ton of CO2 (American Physical Society, 2011). If this proves to be the price, then air capture of CO2 is unlikely to be viable.
Professor Broecker does go on to urge research and development in such a massive global apparatus, concluding
As much of the world’s GNP goes into producing CO2, reversing the trend by air capture will be a very expensive proposition. But looked at in a positive way, the capture and storage of CO2 would create an industry 10 to 20 percent the size of the energy industry (i.e., lots of jobs). Once implemented, it would raise the price of fossil fuel energy, supplying an additional edge for renewable sources.
But let’s see what’s meant here in terms of investment, using the American Physical Society price of US$600/tonne (2010 dollars) as a start, and how low the price per captured tonne of CO2 needs to be in order to be plausible.
We are currently at 404 ppm CO2:
(Click on image to see a larger figure, and use browser Back Button to return to blog.)
Depending upon success with curtailing emissions, represented by concentration pathways, these are the concentrations we might see:
What this means in terms of forcings is summarized at Wikipedia with the conservative values** presented in the table below:
(Details about RCPs are available here.)
To complete the picture, here are the latest forcing estimates, from Potsdam:
The sea level rise impacts are probably understated in the Wikipedia table, due to underestimates of ice sheet effects, and poor constraints on process.
The figures suggest that if RCP 8.5 (“business as usual”) is pursued, 1220 ppm CO2 by 2100 is completely within reach. But to show how expensive clear air capture is, I’ll use RCP 6.0, which by 2100 emits per year just half what RCP 8.5 does. Overall, RCP 6.0 ends up with 55% of the total cumulative CO2 emissions of RCP 8.5, and reaches 730 ppm at 2100. I’ll assume no negative emissions technology has been deployed before that point, and then assume it becomes instantaneously operational at 2100. I’ll further assume that the target of direct air capture is to reduce CO2 concentrations to the relatively benign, but still not completely safe, 350 ppm that the hard-working proponents of 350.org espouse. (If 350 ppm had never been exceeded, we’d still witness the eventual melt of a lot of ice sheets, although it would be slower.)
The first thing to realize is that direct air capture necessarily assumes emissions of CO2 have stopped, and the job is to draw down the concentrations already there. While there is a natural decline in concentration, about 200 ppm in 400 years and 250 ppm in 1000 years, it plateaus and decreases very slowly afterwards. The rule of thumb is that 40% of cumulative Carbon Dioxide emissions remain in atmosphere after 1000 years. If direct air capture were instead deployed while emissions continue, it would need to counter those ongoing emissions as well as work to draw down preexisting concentrations of CO2. Worse, to the degree that, for instance, fugitive CH4 and other species which decompose into CO2 are released, these would not be available for removal immediately, but would continue to contribute over their decay cycles.
Accordingly, deployment of direct air capture means that the entire economic cost of going to zero Carbon emissions is borne at the outset.
Then, assuming the climatic conditions associated with 730 ppm at 2100 for RCP 6.0 are intolerable, I assume the globe deploys direct air capture at US$600/tonne CO2. Note that such scrubbing of atmosphere will not reverse sea level rise, since heat in oceans (and, in general, in water) is released only on time scales of tens of thousands of years. Moreover, there is a slow outgassing of CO2 from oceans once atmospheric concentrations diminish, and this outgassing proceeds only at a natural rate, one which may not be consistent with engineering targets.
So, 730 ppm to 350 ppm involves direct capture and permanent sequestration of 380 ppm of CO2. Each 0.127 ppm corresponds to a billion tonnes (1 Gt) of CO2. Accordingly, 380 ppm ÷ 0.127 ppm/GtCO2 ≈ 2992 GtCO2 ≈ 3 trillion tonnes of CO2. At US$600 per tonne, that’s US$1800 trillion in 2010 dollars.
To give you an idea of the size of this number, the entire gross world product in 2014 was US$78 trillion. Accordingly, the cost of coming down 380 ppm after we zero CO2 emissions is about 23 times the gross world product in 2014. That’s simply not feasible in any scenario.
How much cheaper must direct air capture get in order for it to be feasible? Well, let’s take a megaproject, like the construction of the Chunnel across the English Channel. This cost about US$7 billion in 1994 dollars, or US$10.3 billion in 2010 dollars. So, suppose we are willing to spend the equivalent of 100 Chunnel projects to make civilization viable on Earth again. That’s about US$1 trillion in 2010 dollars. Assuming this measure of feasibility and plausibility, direct air capture of CO2 with sequestration needs to cost US$1 trillion ÷ 3 trillion tonnes, or about US$0.33 per tonne in 2010 dollars.
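The arithmetic of the last few paragraphs is simple enough to lay out in one place; a back-of-envelope sketch, using only the figures quoted above (all in 2010 dollars):

```python
# Back-of-envelope check of the drawdown cost figures in the text.
PPM_PER_GT_CO2 = 0.127      # each 0.127 ppm corresponds to 1 GtCO2
drawdown_ppm   = 730 - 350  # RCP 6.0 at 2100, down to 350 ppm
price_per_t    = 600.0      # APS (2011) consensus estimate, $/tonne CO2
gwp_2014       = 78e12      # gross world product, ~$78 trillion

tonnes = drawdown_ppm / PPM_PER_GT_CO2 * 1e9   # tonnes CO2 to capture
cost   = tonnes * price_per_t                  # total cost, dollars

budget = 100 * 10.3e9                          # 100 Chunnels at $10.3B each
feasible_price = budget / tonnes               # required $/tonne

print(round(tonnes / 1e12, 1))   # ~3.0 trillion tonnes
print(round(cost / 1e12))        # ~$1795 trillion ($1800T, rounded)
print(round(cost / gwp_2014))    # ~23 times gross world product
print(round(feasible_price, 2))  # ~$0.34/tonne ($0.33 if you round
                                 # the tonnage up to 3 trillion)
```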
I don’t care what technology you have in mind, that ain’t gonna happen.
Direct air capture of CO2 is tough because there are so few molecules per unit volume to catch.
One important aspect the above neglects is CO2 dissolved in the oceans and captured by the soils. The point is made most directly in a 2015 paper by Tokarska and Zickfeld and in its supplement. The same idea was discussed earlier, not in the context of geoengineering through CO2 capture, but in terms of the lifetime of atmospheric CO2 and its effects after human emissions were zeroed. See Archer, et al, 2009, and Solomon, et al. The implications for global containment policy were described in a 2012 paper by Matthews, Solomon and Pierrehumbert, where they argue (a) atmospheric concentrations and emissions intensity are, for physical reasons, not really useful gauges of progress in containing the effects of human-created climate change, and (b) there should be a renewed emphasis upon cumulative Carbon emissions. Their arguments seem to have been missed by many who think that if emission rates plateau we’ll see some useful response from the climate system.
In short, oceans and soils are, in the long run, in equilibrium with atmosphere with respect to any particular gaseous species like CO2. In the short run, they are not, because it takes time (decades) for oceans to take up their share of free CO2 because of complicated mixing processes. Similarly with soils, although I’ve never seen a time constant for that process. Not sure there is one. But in the end, oceans pick up 30% of human CO2 emissions. (Eli Rabett does a nice review of the chemistry here.) Soils and trees and things, primarily old growth forests and other terrestrial ecosystems, pick up another 25%. These figures are the result of careful Carbon accounting and measurements. (See also.) Assuming these continue picking excess CO2 up at these rates, should emissions stop, and then reverse with negative emissions technology, what would happen?
As concentrations of CO2 in atmosphere decrease, oceans and terrestrial ecosystems are out-of-equilibrium, so the process reverses: CO2 there eventually begins coming back out into the atmosphere. The net effect of this, and why my calculations here understate the cost of clear air capture, is that what needs to be removed is not just the concentration of CO2 in atmosphere, but essentially all that people have ever released to the climate system!
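A rough sense of how much this outgassing inflates the job can be had from the partition quoted above (oceans ~30%, terrestrial ecosystems ~25%, leaving ~45% of emissions airborne). A sketch, under the simplifying assumption that the partition just runs in reverse as concentrations fall, so only ~45% of any net removal “sticks” in the atmosphere:

```python
# Equilibrium bookkeeping, assuming the long-run partition quoted in
# the text (oceans ~30%, land ~25%) holds in reverse during drawdown.
airborne_fraction = 1.0 - 0.30 - 0.25   # ~0.45 stays in atmosphere

target_drawdown_ppm = 730 - 350          # what we want out of the air
total_removal_ppm = target_drawdown_ppm / airborne_fraction

# Only ~45% of what you capture shows up as an atmospheric decline;
# the rest is replaced by outgassing from oceans and soils.
print(round(total_removal_ppm))  # ~844 ppm-equivalent must be captured
```

In other words, under this crude assumption the capture job is more than double the naive 380 ppm figure, which is why the cost estimate above understates the problem.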
And if that isn’t bad enough, there is some evidence (see Section 2.7) that these sinks are slowing in their ability to temporarily hold CO2, meaning that a larger fraction will go into the atmosphere.
There’s an amazing group of people who keep track of Carbon accounts year over year. (Thanks to Glen Peters for pointing me to this.)
Now, it’s clear who is at fault and who, properly speaking, should bear the burden of doing these crazy things if they were needed and if they were feasible.
Pick on China all you want, but since radiative forcing of atmosphere and oceans is due to cumulative emissions, not instantaneous emissions, Europe and the United States carry the greatest responsibility, including contributing their share of deforestation. Moreover, if China were excessively penalized, effectively this would be a tariff on products manufactured there, and, in the end, the consumers of North America and Europe would end up paying.
Note that OCO-2, the satellite system*** which produced these figures, data products which support state efforts to manage their fossil fuel emissions, may be on President-elect Trump’s “hit list” of systems to be terminated because of his commitment to shut down “the politicized science” of climate. (I’ll have more to say about that soon.)
Key quote-within-a-quote from article at … And Then There’s Physics:
If the expected negative emissions cannot ultimately be achieved, the decades in which society had allowed itself a slower, softer transition would turn out to be a dangerous delay of much-needed rapid emission reductions. Saddled with a fossil fuel-dependent energy infrastructure, society would face a much more abrupt and disruptive transition than the one it had sought to avoid. Having exceeded its available carbon budget, and unable to compensate with negative emissions, it could also face more severe climate change impacts than it had prepared for.
I went to some Departmental talks recently and discovered that some of my colleagues are researching possible carbon sequestration technologies. This could be very important, but appealing to negative emission technologies is often quite strongly criticised. The basic argument (which has some merit) is that providing this as a possibility can provide policy makers with an argument for delaying action that might reduce emissions sooner.
Although I have some sympathy with these criticisms, I do have some issues with them. One is that it often involves criticising climate models that include negative emission pathways. The problem I have with this is that they seem to use “climate model” as a catch all for any kind of model associated with climate change. However, there are a large number of different models. Some are trying to understand how our climate responds to changes, and – typically – use concentration pathways. Others try…
No other perspective matters, however disenfranchised some may feel. For, ignoring that perspective, they will fight Something they cannot beat.
Remember, Nature Bats Last.
The seawater in that parking lot is a foot deep.
People can deny what’s happening in any of several varied ways. They can claim it does not affect them.
In the end, Nature will speak and, as Richard Feynman insisted, the “truth will out.”
(Photo credit and story to the Scituate Wicked Local of 15th November 2016.)
When running a corporation there are various kinds of productivity measures that can be used. There are bizarre ones, like return on controllable assets (ROCA), and typical ones, like overall revenue or overall profit. When judging the productivity of employees and divisions, measures like revenue per division or revenue per employee are used.
If the United States were a corporation, and we further imagine states are comparable to divisions, the most valuable states are those with a high gross state product per person. This means that, in terms of the overall United States domestic product, these individuals are contributing more to overall wealth than are citizens in other states. Gross state product itself is interesting, but it is a less fair comparison, since states with simply large numbers of people win over those with fewer.
What’s interesting is that, while there are exceptions, most of the high GSP-per-capita states voted Democratic in the recent 2016 Presidential election. In fact 70% did. These are the top 20, from the most productive per capita to the least:
It is interesting that D.C., while not a state, ranks above them all.
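The ranking itself is easy to reproduce from published gross-state-product and population figures; a minimal sketch, using hypothetical numbers for a handful of states (illustrative only, not the actual 2016 data):

```python
# Hypothetical GSP (in $billions) and population (in millions) for a
# few states -- placeholder figures, not the actual 2016 statistics.
states = {
    "Massachusetts": (485, 6.8),
    "New York":      (1430, 19.7),
    "Texas":         (1590, 27.5),
    "Mississippi":   (106, 3.0),
}

# GSP per capita, in dollars per person.
per_capita = {
    name: gsp * 1e9 / (pop * 1e6)
    for name, (gsp, pop) in states.items()
}

# Rank from most to least productive per person.
ranked = sorted(per_capita, key=per_capita.get, reverse=True)
print(ranked)
```

Note that on this measure a populous state like Texas can rank below a much smaller one, which is exactly the point of normalizing by population.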
So, in all the explaining that Trump won because people feel left out of the discourse, there are two points to be made.
First, it’s possible that those who have influence do so because they are, in fact, more productive. There is an argument, from a capitalist mindset, that those who contribute the most to the common enterprise should indeed have the most say.
And, second, people who feel left out could choose to move to a state where they contribute and, on average, earn more. That they don’t is actually their own choice.
So, ironically, it is not that states which vote Democratic have a lock on politics at the expense of those who do not. They also contribute to the common wealth the most. Shareholders in corporations with more shares have more say. Oughtn’t voters in such states do the same?
(Figure deleted, 4 December 2016.)
I added something related to this in a comment at FT today:
I also think that the “elitism” of which some Trump supporters complain is itself a myth and a mirage. A possible response, noting that a substantial chunk of the GDP of the USA comes from those states accused of being elitist, is to, for a change, actually exert their power and influence as elites rather than trying to get along with everyone else.
If some people are going to claim they don’t want to have certain kinds of people in their states and their communities, some of those same people should understand very very clearly that their behaviors, attitudes, and expressed opinions will not be tolerated in communities and states which try to be open to all nationalities and outlooks from everywhere, whether Muslim, gay, Jewish, atheist, trans, bisexual, queer, or otherwise.
And, should there be an attempt at some kind of punitive response from the new Presidency and Congress, they need us more than we need them. We can find other customers. And they cannot replace the skills and capabilities of these “elitist” regions without, ironically, buying those skills from outside the United States. They can develop them elsewhere in the country, but that would take years and years, and could not, without diversity in origin and outlook, be successful. Just look at the makeup of any major university: MIT, Harvard, UC Berkeley, Duke, CalTech, University of Massachusetts, University of Texas, University of Chicago, and so on.
Moreover, corporations affected by such measures can work against them.
Hopefully it won’t come to that, but, as the reason for my FT comment shows, the Trump entourage has a really thin skin, and puts “winning” and its own image ahead of sense. They could really misfire in a big way, say, in cybersecurity.
From The New York Times, 3rd December 2016.