Since my holiday is now over, I thought I might briefly comment on a recent paper by Cheng et al., called Observed and simulated full-depth ocean heat-content changes for 1970–2005. John Abraham, o…
Source: Full-depth OHC
By Richard Somerville, emeritus professor of Oceanography at Scripps Institution of Oceanography. See the site he helps build and run on climate communication.
Amber Lin at The Bulletin of the Atomic Scientists describes the two-headed character of natural gas plants needed to implement “natural gas as a bridge fuel”, and sketches the stark reality proponents of that argument are embracing if they are serious about using natural gas, whether for electricity or heating, to reduce greenhouse gas emissions.
The basic fact is that in order to serve as a proper “bridge”, natural gas infrastructure would need to be decommissioned by 2050, including ceasing flows of the gas through the elaborate pipelines which criss-cross the United States. That’s because emission limits for CO2 dictated by Nature cannot be met otherwise: 450 ppm of CO2, just 40 ppm higher than where we are now, corresponds to the widely accepted +2°C warming limit. And, since meeting that limit is unlikely, if we want to hold warming to +3°C, 650 ppm is the overall ceiling, and +3°C brings us into a highly uncertain, dangerous, and eventually ice-free world. In particular, we might lose control of a portion of the warming process, since large natural stores of CO2 are quite likely to be breached and begin leaking at those temperatures.
The Presidential commission on the matter also sketched the key problem with using a “bridge fuel” mechanism to reach targets like this, namely, “A slow start leads to a crash finish”: the later the abandonment of fossil fuels and their infrastructure begins, the more abruptly it must be pursued to hit the same targets. I daresay none of the proposals for new natural gas generation have incorporated operating lifetimes which abruptly end in 2050, or depreciation schedules which reflect that. In fact, the new Massachusetts Salem Harbor gas-powered electricity generator has a planned lifetime through 2080.
Amber Lin tells how there are really two incompatible kinds of natural gas plants for electricity generation:
When constructing a new natural gas power plant, there are two options: a combined cycle or an open cycle. A combined-cycle power plant produces electricity with relatively high efficiency and low carbon emissions: When the gas burns, it heats and compresses air to spin a turbine and power a generator. A heat recovery system captures waste heat, which is routed to a nearby steam turbine to generate even more power. Combined-cycle plants have low operating costs, but because high capital costs must be offset, these plants are built to produce baseload power—available 24 hours a day. Open-cycle gas turbine plants lack the steam cycle, so their thermal efficiency is much lower, and their carbon emissions per unit of electricity generated are slightly higher. Their running costs are much higher than a combined-cycle plant, but they have a much lower start-up cost, so they are often built as “peakers,” plants that run only to support other power infrastructure during hours of high demand or when solar or wind isn’t available.
Considering the two choices in the larger context of natural gas as a “transition fuel,” a dilemma appears: To build the bridge, combined-cycle is what is needed—a consistent, efficient, power source that can effectively replace coal. But for a combined-cycle natural gas plant to be economically feasible, it would typically need 15 to 20 years to make up for start-up costs, and even longer to become profitable. This means that a combined-cycle plant built in 2016 would break even no sooner than 2031, and would have to run for several more decades to be a worthwhile investment. Levi’s 2030 limit for peak emissions, and roughly 2050 limit for zero emissions, translate to major fossil fuel reductions after 2030. Owners and backers, however, will not want to shut down gas plants that are just beginning to generate a profit. Thus, building combined-cycle plants in 2016 without an explicit understanding of their necessarily temporary nature—and with no financial incentives for early closures in the future—defeats the purpose of natural gas as a “transition fuel.”
Why not focus on open-cycle plants instead? While “peakers” make sense as backups for future renewable energy sources, they don’t make sense right now. In the current infrastructure, they can only run for a couple hundred hours a year before they cost more than they can earn; this is not nearly enough to displace coal. Closed-cycle plants can help build the bridge but cannot close it, and open-cycle plants can help close the bridge but cannot build it. Neither type of plant is both economically feasible in the long run, and powerful enough to meet today’s demand while cutting emissions in time to mitigate climate change. However, when natural gas is branded as a “transition fuel” in politics and in popular media, this crucial detail is rarely mentioned.
(Emphasis added by blog author.)
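The break-even arithmetic in the excerpt is worth making concrete. Below is a minimal sketch in Python with entirely hypothetical round numbers for capital cost and annual operating margin (the excerpt only gives the 15-to-20-year payback range), showing why a combined-cycle plant built in 2016 collides with a 2050 shutdown:

```python
import math

# Toy break-even model for a combined-cycle gas plant.
# Capital cost and annual margin below are made-up illustrative figures,
# chosen only to land inside the 15-to-20-year payback range quoted above.

def breakeven_year(build_year, capital_cost, annual_margin):
    """Year in which cumulative operating margin first covers capital cost."""
    return build_year + math.ceil(capital_cost / annual_margin)

build = 2016
capital = 900e6   # hypothetical: $900M capital cost
margin = 60e6     # hypothetical: $60M/year net operating margin

be = breakeven_year(build, capital, margin)
print("breaks even in", be)                          # 2031 with these numbers
print("profitable years before a 2050 shutdown:", 2050 - be)
```

With these made-up figures the plant breaks even in 2031, leaving only 19 years of profit before a 2050 closure; thin the margin a little and the profitable window disappears entirely, which is precisely the owners’ incentive to resist early closure.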
So natural gas plants are the Zaphod Beeblebrox of electricity generation: they are duplicitous, and their purpose is to distract from the true goal of natural gas infrastructure expansion, which is to prolong the day when fossil fuel assets are stranded, whether because of government action to mitigate climate change or, as is increasingly plausible, because their Carbon is taxed.
Luckily Arthur’s Betelgeusean friend, Ford Prefect, a roving researcher for that illustrious interstellar travel almanac The Hitchhikers Guide to the Galaxy, was more of an optimist. Ford saw silver linings where Arthur saw only clouds and so between them they made one prudent space traveller, unless their travels led them to the planet Junipella where the clouds actually did have silver linings. Arthur would have doubtless steered the ship straight into the nearest cloud of gloom and Ford would have almost certainly attempted to steal the silver, which would have resulted in the catastrophic combustion of the natural gas inside the lining. The explosion would have been pretty, but as a heroic ending it would lack a certain something, i.e. a hero in one piece.
Someone blatantly misrepresented the U.S. Presidential election betting markets in a Google+ comment thread tonight, and I wanted to bring the actual figures forward here.
No doubt some supporters of Trump will argue “God is on our side, and so these heathen markets cannot be correct”.
Current odds on Betfair.
Massachusetts is supposed to be a Blue State.
Massachusetts is supposed to be concerned about the environment, full of tree-hugging eco-weenies (like myself!), and sprouting solar panels from every other rooftop.
Massachusetts is supposed to have aggressive support for zero Carbon energy, including incentives, SRECs, and so on.
But facts are different.
59% of Massachusetts electricity comes from explosive methane (“natural gas” to those of you who prefer industry adverts). This is a potent greenhouse gas which, in 20 year timeframes, is 90x worse than CO2 for climate disrupting radiative forcing. (See https://667-per-cm.net/about if you have doubts.) Natural gas ain’t granola. And the calculations which suggest it is better for the environment are, in my opinion, whacked and bupkis. Set aside upstream impacts from fracking. Not all methane is burnt when it goes up your chimney for heating, nor in generating plants. There are big leaks throughout the Boston metropolitan area which the utilities will fix “if they are dangerous”, but they don’t consider greenhouse gas emissions dangerous. And we all know we have to transition off of fossil fuels, for our own good, coastal state that we are, and for the moral good of the planet and the people on it, not to mention the recently affirmed requirements of the Global Warming Solutions Act (“GWSA”). The Union of Concerned Scientists says we are getting overdependent upon natural gas. And the comparison with coal as a benefit is the logical fallacy of “the worst negates the bad”.
That’s quite different from Texas. Yes, Texas. Home of cowboy boots, and guns, and Spectra Energy.
Without going really big on offshore wind and solar, Massachusetts could just be a bunch of chumps. And I often wonder if Spectra Energy isn’t trying to dump their explosive methane here because people back home know better. Or it could be that Massachusetts citizens are hypocrites, claiming to be for something until it affects their own back yards. Or it could be that Massachusetts leadership is having $100,000 spent on them, just in 2016.
In this political season, it’s useful to brush up on rhetorical skills, particularly ones involving numbers and statistics, or what John Allen Paulos called numeracy. Professor David Spiegelhalter has written a guide to some of these tricks. Read the whole thing. Highlights, though, of devices used to produce statistics which aren’t-quite-right (that is, wrong):
David Spiegelhalter is the Winton Professor of the Public Understanding of Risk at the University of Cambridge and president elect of the Royal Statistical Society. Among many other things, he’s an advocate for expressing life risks as micromorts.
I made a comment on Google+ pertaining to a report of a recent NOAA finding.
But remember that COP21 boundary is equivalent to 450 ppm CO2.
It’s one thing to oppose pipelines and continued use of fossil fuels, but there is little as effective as a boycott of the key product. This is certainly not a new idea. (I don’t do Facebook. See this 2001 article as well.) So if you want to nudge in the direction of renewables, please consider boycotting natural gas. If you want to save money in the long term, please consider leaving natural gas. Natural gas and other fossil fuel prices are inherently volatile. Complaints of their being too high at times are really complaints about this volatility. Renewable energy produces electricity at the same price decade after decade.
Natural gas ain’t granola. Despite company advertisements to the contrary, drilling and fracking natural gas wells and associated infrastructure, including pipelines, compressor stations, and piping and metering stations, are invasive, disruptive, expensive, and harmful to people, the environment, and the climate. Methane, the chief component of natural gas, is many times more powerful as a greenhouse gas than is CO2, and even at the burning end, combustion of natural gas is not complete, so there is leakage, even if the raw chemistry of the components that are burnt is much cleaner than coal’s. Moreover, gas leaks from pipelines at nearly every step along the way, and especially in distribution networks near homes. And don’t think that because you hear reassuring things from utilities and gas companies and engineers that there’s safety there. The system may be out of sight, but the political process and the Natural Gas Act of 1938 rig the federal system against all opponents of natural gas, from cities and towns down to localities and homeowners and farmers.
Co-constituents of natural gas alongside methane are carcinogenic and powerfully harmful to human airways and lungs. Even the odorants which are added to facilitate detection of leaks are themselves harmful.
Natural gas … WE DON’T WANT YOUR PIPELINE We don’t want your damn gas.
“We don’t want a Minsky moment about climate.”
Interesting that Carney talks about “stabilizing at a temperature” when emissions are stabilized using a Carbon tax. He agrees with a Carbon tax, but he seems to have his science wrong. I did not get the impression he understands that to stabilize at any temperature, Carbon emissions need to go to zero. In his world, I wonder, does that mean that a price on Carbon needs to go to infinity? From my perspective, there is an implicit ceiling on Carbon price, and that is the realistic price per tonne to extract a unit of Carbon from the atmosphere. Perhaps it would be somewhat more, since it’s not just about extracting this tonne of Carbon but this one, and another, and more. But, still, there is a kind of ceiling.
The Journal of the American Statistical Association (“JASA”) has announced in this month’s Amstat News that, effective 1st September 2016, it “… will require code and data as a minimum standard for reproducibility of statistical scientific research.” Trends were heading this way, but it is excellent to see a major journal insisting upon it as standard practice.
There appear to be some weasel words allowing publications having “proprietary data” to move forward, insisting upon code nevertheless. I can only imagine that publications opting for that path will be seen as less established, solid, or compelling.
It’s the thing. And it addresses how media and people forget about the actual statistics, and focus on the White Hot Bright Light.
What’s striking is that people have apparently forgotten that Statistics, as a field and a profession, was created principally as a vehicle for the betterment of society. Not only was this true of creative originators like Florence Nightingale; it was also behind the very idea of creating mortality tables for mutual insurance companies.
With regard to my comment at hypergeometric | July 13, 2016 at 3:50 pm on Tamino’s blog, someone challenged me on my assertion “Believe me, the +3C-+4C worlds are not places we want to go!” there. I have replied at Tamino’s blog, and I recommend reading the excellent comments there for context, but I thought it worthwhile to state my position here as well. Below is the quote. And we are apparently heading to the +3C to +4C region, even if COP21 is fully implemented. (See NCAR’s full report.)
The basis for my assertion is that we definitely do not want to approach the region [Professor] Ray [Pierrehumbert] writes “Here there (may) be dragons.”
While I am not a climate scientist, I am enough of a dynamicist to look at the rate with which we are introducing greenhouse gases compared to natural processes we can read in the paleorecord (with the possible exception of the Permian extinction event) to wonder whether or not we are actively exploring the climate state space for dynamical bifurcations. People who have examined the question suggest we would probably never know if we were approaching one. While there’s little that can be done except to press on for rapid reductions in CO2 emissions, I can only be honest and say these realizations make me very nervous.
In addition to solar PV, wind energy of all forms (especially underutilized local wind turbines), and energy storage, Kann is right on, in my opinion, emphasizing the great potential of blockchain technology. See here for a primer.
Also, the supposed need for base load is a chimera, and just as mythical as one, and concerns about duck curves are misplaced. The grid is a network, and like any network, including the communications network known as the Internet, load sometimes needs to be shaped. That’s part of what demand response is about, but what some fail to see is that, on this point, a grid is better off having a large number of spatially separated, small generators than a few large generators, even if the large generators are all zero Carbon. This is particularly true if some of the generators have their own energy storage or are entirely energy storage centers.
And, finally, whatever the road chosen on the grid, as I’ve emphasized here repeatedly, even the conservative (*) and Carbon worshipping U.S. Energy Information Administration is now projecting a great role for zero Carbon energy by 2030. And the news from REN continues to be excellent.
Some progressives lament the loss of Bernie Sanders’ run for President, arguing “we need to get our democracy back.” A necessary step in order to get your democracy back is to take back control of your energy supply. Centralized energy means centralized political power. Residential solar PV power, possibly with energy storage, in individual homes or in local communities, is a political force for good, not only because it is an element of a plan to mitigate greenhouse gas emissions. The energy supply needs to be decentralized. The late Hermann Scheer understood this perfectly, as captured in his talk above, and spells it out in his book. The late Buckminster Fuller alludes to it in his Operating Manual for Spaceship Earth.
Not only does absolute power corrupt absolutely; centralized power corrupts too, and gives some members of the polity and some politicians undue influence, and not only because of the associated monies.
(At the site of the West Roxbury, MA, Spectra/Algonquin explosive pipeline)
Cry out! Cry out! Wail in lamentation for all that climate change has wrought … And will wreak upon the children and grandchildren, upon the poor and the disenfranchised, upon the dreams of families who wanted coastal homes to pass to future generations, upon communities, once thriving, decimated because Nature took their land back, according to Her inexorable Law.
The most foolish response to the fact of climate disruption is continuing to expand fossil fuel infrastructure. It is imperative, morally imperative, that this be stopped.
(In the pictures below, simply clicking upon them will produce a larger image. Return to the presentation by using your browser Back Button.)
A moral group, a brave group, chose to say No: “You … shall not … pass!” And, on 29th June 2016, a group, including my beloved Claire, were the latest of hundreds to risk arrest, to use arrest to stop a malevolent intrusion upon a peaceful community, one completely opposed to the act.
In the morning, an attempt to stop construction was met with a wall of police presence. This could not be challenged, since touching a police officer is a very serious offense.
Accordingly, the group retreated to a nearby church to regroup, and to plan.
And in the afternoon …
My beautiful wife, Claire, stopping construction, ready to get arrested (in red shirt, left):
Claire’s Support, Andrea, and Claire’s arresting officer:
In the holding cell:
And upon bail payment and release, later in the day, the brave group:
Update, 2016-06-30, 21:04 EDT
An excellent Vimeo video of the entire day:
I neglected to mention yesterday that Karenna Gore, daughter of former Vice President Al Gore, whose movie “An Inconvenient Truth” was for many people the first popular introduction to the climate crisis, took part and was arrested at this action along with 22 others.
Update, 2016-07-01, 17:47 EDT and 23:40 EDT
Sweeet! News coverage:
Ms. Jehlen, Messrs. Keenan, Montigny, Timilty and Joyce, Ms. Creem, Mr. Brady, Ms. L’Italien, Ms. Gobi, Ms. O’Connor Ives, Ms. Chang-Diaz, Messrs. Lewis, Pacheco, Moore and Ross moved that the bill be amended by inserting the following section:-
SECTION X. Section 94A of chapter 164 of the general laws, as appearing in the 2014 official edition, is hereby amended by adding the following paragraph:-
Nothing in this section shall be construed to authorize the department to review and approve contracts for natural gas pipeline capacity filed by electric companies.
This Amendment was adopted by the Massachusetts Senate, today, unanimously.
Dr James Hansen on The Open Mind.
“Signatures won’t save the climate”, writes Danielle Ola at PVTech.
The 2⁰C scenario would require much more money. On top of the $7.8 trillion, the world would need to invest another $5.3 trillion in zero-carbon power by 2040 to prevent CO2 in the atmosphere rising above the Intergovernmental Panel on Climate Change’s ‘safe’ limit of 450 parts per million.
“Investment in renewables required to achieve global climate goals is ‘entirely possible’”. That’s from IRENA (International Renewable Energy Agency).
Story here. Graphic:
(Click on image to see a larger figure, and use browser Back Button to return to blog.)
Bloomberg: “Solar Power to Grow Sixfold as Sun Becoming Cheapest Resource”. Excerpt:
The amount of electricity generated using solar panels stands to expand as much as sixfold by 2030 as the cost of production falls below competing natural gas and coal-fired plants, according to the International Renewable Energy Agency.
Solar plants using photovoltaic technology could account for 8 percent to 13 percent of global electricity produced in 2030, compared with 1.2 percent at the end of last year, the Abu Dhabi-based industry group said in a report Wednesday. The average cost of electricity from a photovoltaic system is forecast to plunge as much as 59 percent by 2025, making solar the cheapest form of power generation “in an increasing number of cases,” it said.
Renewables are replacing nuclear energy and curbing electricity production from gas and coal in developed areas such as Europe and the U.S., according to Bloomberg New Energy Finance. California’s PG&E Corp. is proposing to close two nuclear reactors as wind and solar costs decline. Even as supply gluts depress coal and gas prices, solar and wind technologies will be the cheapest ways to produce electricity in most parts of the world in the 2030s, New Energy Finance said in a report this month.
Past references on this blog to the same subject matter:
Now, more than ever.
(The above was published in September 2015.)
Best of luck to pilot Bertrand Piccard and the entire SolarIMPULSE team!
Update, 1253 EDT, 20th June 2016
(Click on image for a larger figure, and use browser Back Button to return to blog.)
Update, 1407 EDT, 20th June 2016
In addition to winds, some difficulties with Oxygen flow, although Bertrand Piccard is using a backup, and he reports there are clouds to the southeast.
Update, 1748 EDT, 20th June 2016
The weather pattern over the Atlantic has changed, and there is a low pressure “nor’easter” moving up the coast. I have not followed the flight consistently, but it looks like SolarIMPULSE is apparently being diverted to a landing in Nova Scotia. It’s possible that they may try to skirt the weather and push on. It is not clear at the moment what they intend. They are probably avoiding a premature commitment to an end: they can retract their request for landing clearance to Moncton Center later.
SolarIMPULSE has cancelled a request for approach and is pushing on into the night, continuing its crossing of the Atlantic.
And SolarIMPULSE is presently flying in clouds.
Update, 2016-06-20, 20:16 EDT
Update, 2016-06-21, 11:06 EDT
Update, 2016-06-23, 01:38 EDT
In the Spring 2016 edition of Catalyst, a periodical of the Union of Concerned Scientists, Nobel laureate and former U.S. DOE head Professor Stephen Chu offers a suggestion on what the world should do after the COP21 meeting in Paris. Below is an excerpt, with emphasis added, and slightly edited for completeness where noted:
More than half of [the allotment of global Carbon emissions needed to have a 50% chance of remaining below the 2°C target] has already been used up since the beginning of the Industrial Revolution and, at our current emissions rate, the remainder will be gone in about 30 years. Clearly, the remaining carbon budget is a precious resource, but cap-and-trade allocations start from existing levels of emissions. It is prima facie unfair to allow developed countries to pollute more because they were historically the biggest polluters.
A global carbon tax avoids the intractable problem of how to allocate carbon emissions credits between developed and developing countries, and levies the highest taxes on the biggest emitters. If countries are unwilling to levy a cost on carbon, the playing field can be leveled with suitable border tariffs on goods imported into participating countries. In addition, the wealthiest countries still need to help less developed countries in this transition.
Figure courtesy of the International Energy Agency.
R provides a helpful data structure called the “data frame” that gives the user an intuitive way to organize, view, and access data. Many of the functions that you would us…
Source: Intro to The data.table Package
Cédric Villani does Mathematics.
“Problems worthy of attack, prove their worth by hitting back.” — Piet Hein
One of the things I find surprising, if not astonishing, is that in the rush to embrace Big Data, a lot of learning and statistical technique has apparently been discarded along the way. I’m hardly the first to point this out. Moreover, there are remedies available. Still, there are books on predictive analytics published which, while they collect a set of interesting ad hoc techniques for rapid inference, leave out a lot of traditional techniques and wise concerns. Many of the major software frameworks, like Weka, Spark MLlib, or the map-reduce framework with its strong algorithmic constraints, facilitate the organization of large datasets and their retrieval, but the standard practices omit things like subsampling, and assessing what your real sample size is. To be crude, 100 tonnes of crap is still crap, and a billion replicas of exactly the same record don’t give you any more information than what’s in the single record.
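The point about replicas and real sample size can be illustrated in a few lines of Python (the data here are made up). Naive formulas applied to duplicated records report a tiny standard error, while the honest uncertainty is governed by the handful of real observations:

```python
import statistics

# Made-up sample of 5 real observations, then duplicated 1000 times
# to mimic "a billion replicas of exactly the same record".
sample = [2.1, 3.4, 1.9, 4.0, 2.8]
dup = sample * 1000          # 5000 records, but only 5 distinct observations

honest_se = statistics.stdev(sample) / len(sample) ** 0.5
naive_se_dup = statistics.stdev(dup) / len(dup) ** 0.5

# The naive standard error computed on the duplicated data is tiny,
# but no new information has arrived: the effective sample size is still 5.
print("honest SE:", round(honest_se, 4))
print("naive SE on duplicates:", round(naive_se_dup, 4))
```

The second number is smaller by roughly a factor of sqrt(1000), which is exactly the spurious precision that counting replicas as data buys you.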
Fundamentally there seems to be this idea that traditional statistical methods are too slow for the world of Big Data. I think that’s meant in two different ways. The first, which almost everyone addresses when the matter is discussed, is that the data set sizes or the rates of streaming are so large it’s not possible to apply batch-oriented or heavy computation to them. The second, which seldom gets mentioned in my experience, is that there’s an emphasis by organizations on rapidly producing apparent results, and, so, it’s perceived that deep thinking or care in sampling is not consistent with the business mission (*). I say “apparent results” because often there are only poor ways to tell whether results are adequate. Itemset methods, sometimes called association rules, which I have used in an application, offer a number of frequentist statistics for diagnosis. One, for instance, is called Confidence, and it is a very poor man’s estimate of a conditional probability. It’s recognized that it has limitations, but to treat the finding of those limitations as if it were a research result is, in my opinion, to feign ignorance. So, to me, the rapid production of apparent results is simply looking like keeping busy, without a quantitative way of knowing. And “the customers seem to continue to be happy” or “sales keep going up”, while important of course, don’t necessarily have any causal connection to what’s being done.
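For concreteness, here is a minimal Python sketch of the Support and Confidence diagnostics for itemset methods, on made-up transactions. Note that Confidence is just a plug-in ratio of relative frequencies, with no accompanying measure of uncertainty, which is the sense in which it is a “very poor man’s estimate” of a conditional probability:

```python
# Made-up market-basket transactions, each a set of items.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Plug-in estimate of P(consequent | antecedent): a bare ratio of
    frequencies, with no standard error and no adjustment for how
    rare the antecedent is."""
    return support(antecedent | consequent) / support(antecedent)

print(confidence({"bread"}, {"milk"}))   # 2/3 with these transactions
```

Two of the three “bread” baskets also contain “milk”, so the rule bread → milk gets Confidence 2/3; with only three supporting baskets, a statistician would want an interval around that number, which the diagnostic does not supply.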
The fact is, of course, that there are lots of ways traditional statistics can inform efforts involving large datasets. Moreover, there are, indeed, techniques, known since the 1960s, for keeping up with the onslaught of a large data stream; these have been greatly improved, and they are very much in use by people who know how to use them.
And I suspect that’s another thing: most of the traditional techniques for doing prediction and inference on streams, namely dynamic linear models, state-space methods, and dynamic generalized linear models, use more mathematics, specifically numerical linear algebra and basic multivariable calculus, to do what they do. And the population of developers and managers and, even, engineers eschews these methods whenever it can, because they are perceived to be hard. Instead, things like inference based upon ad hoc locality-sensitive hashing are used because they are “standard practice”, as are generalizations of clustering methods, without even inquiring whether the topology of the problem admits their use. Sure, I can see these methods have their place, but it’s not as if they are the only way things can be done.
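To show the machinery is not so fearsome, here is a sketch of the simplest dynamic linear model, the local level model, filtered one observation at a time: exactly the one-in, one-out pattern a stream requires. The noise variances q and r and the test series are assumed for illustration:

```python
# One-dimensional Kalman filter for the local level model:
#   state:       x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t = x_t + v_t,      v_t ~ N(0, r)
# Each observation is absorbed in O(1) work, so the filter keeps up
# with a stream regardless of how long the stream runs.

def local_level_filter(ys, q=0.1, r=1.0, m0=0.0, c0=10.0):
    """Return the filtered mean after each observation in ys."""
    m, c = m0, c0
    out = []
    for y in ys:
        c_pred = c + q               # predict: variance grows by q
        k = c_pred / (c_pred + r)    # Kalman gain in [0, 1]
        m = m + k * (y - m)          # update mean toward the observation
        c = (1 - k) * c_pred         # posterior variance shrinks
        out.append(m)
    return out

# A short made-up stream with one outlier at position 5:
estimates = local_level_filter([1.0, 1.2, 0.9, 1.1, 5.0, 1.0])
print(estimates)
```

The outlier pulls the estimate up only partially, by the gain k, and subsequent observations pull it back; that graceful behavior comes from the model, not from any ad hoc thresholding.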
Rather than some kind of leap of faith that more and more data can make up for things like poor statistical power, I think smart organizations realize that sampling matters, and traditional critiques of business processes are important. Accordingly, I’m not interested in Big Data, I’m interested in Smart Data, no matter what its size. I think anyone who cares about what their results mean should be interested in Smart Data, too.
In addition to simply representing good practice, Smart Data techniques do things easier (in the big picture sense) than do ad hoc collections of ad hoc techniques, no matter how many times they are cross-validated. For instance, predicting consumer behavior on hypothetical products, or products they have never experienced, or products no one has ever experienced, is not something which extrapolations of existing evidence compendia can ever dream of doing. There needs to be a well-wrought model in order to do that. And the model needs to be evidence-based, too. In other words, if there’s no training data, or truth data to score it, simply observations, many of the present Big Data methods are hopeless.
(*) This is exemplified by the many competitions or hackathons where tough problems are expected to be solved in a short time by adversarial teams. Sure, speed is sometimes necessary, but does anyone seriously expect every business can be run that way and last? “Internet time” is and always was ridiculous. At the least, companies run that way will lose their people. They could go bankrupt after making a big mistake that was not noticed due to the rush. Flexibility, yes. Careful competence akin to a skunkworks, definitely. But death marches can’t work, nor can projects which feature any two or more of these characteristics: wishful thinking, escalation of commitment, optimism bias, and the planning fallacy.
One of the best presentations on what can happen if someone takes a naive approach to network data. It also highlights what is, to my mind, the greatly underappreciated t-distribution, which is typically used only in connection with frequentist Student t-tests, but which sits someplace between the Gaussian and the crazy Cauchy distribution. Also relevant is the Lévy flight, which has significance in biology. (See also.)
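The t-distribution’s role as an interpolant is easy to verify numerically: with one degree of freedom its density is exactly the Cauchy density, and as the degrees of freedom grow it approaches the Gaussian. A quick check in Python:

```python
import math

def t_pdf(x, df):
    """Student t density with df degrees of freedom."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def cauchy_pdf(x):
    return 1 / (math.pi * (1 + x * x))

def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# df = 1 reproduces the Cauchy exactly; large df approaches the Gaussian.
print(t_pdf(2.0, 1), cauchy_pdf(2.0))      # identical
print(t_pdf(2.0, 200), normal_pdf(2.0))    # agree to about 3 decimal places
```

The heavy tails at low degrees of freedom are what make the t useful for network data: it accommodates the occasional wild observation without pretending, as the Gaussian does, that such observations essentially never happen.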
The message is that a combination of multiple paths, sampling rate changes, and a glitch on one of the paths can make an event appear to occur where there is none.