“If you’re in a hole, stop digging.”

## Uniform sampling of a disk, and implications for sampling the Internet

Suppose you want to uniformly sample from the interior of a circle of unit radius, in other words, from a unit disk. The “gut feel” way is to pick a random angle, $\theta$, in radians uniformly from 0 to $2\pi$, and then a random radius, $r$, uniformly from 0 to 1. Do this a bunch of times, and plot the result:

Oops. Something’s gone wrong! The density of points in the center of the disk is higher than at the edges and, in fact, the density falls off as the edge is approached.

Now, there are workarounds involving more computation. Rejection sampling is one that comes to mind. In that case, instead of drawing values for the parameters of a polar representation, the idea is to generate points from a 2-by-2 square centered on the origin, and then reject instances outside of a circle with unit radius, also centered at the origin. But this is wasteful and really not necessary. The alternative is simple.
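For concreteness, rejection sampling can be sketched in a few lines of Python (the function name is mine):

```python
import random

def disk_point_rejection():
    """Draw a point uniformly from the unit disk by rejection sampling:
    draw from the enclosing 2-by-2 square centered on the origin, and
    retry until the point lands inside the circle."""
    while True:
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:  # keep only points inside the unit circle
            return (x, y)
```

On average only $\pi/4 \approx 79\%$ of the draws are accepted, which is the waste referred to above.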

The key observation is that for any randomly chosen radius $r$, the number of points on that radius ought to be proportional to $r$ if, in fact, the disk is going to be uniformly dense in points. In other terms, the radius probability density function of points, $f(r)$ ought to be proportional to $r$, or, formally, $f(r) = k r$ for some positive constant $k$. Since $1 = \int_{0}^{1} f(r)\,\mathrm{d}r$ by definition of a probability density function, where the upper limit is the radius of the disk, we have:

$1 = \int_{0}^{1} f(r)\,\mathrm{d}r = \int_{0}^{1} k r\,\mathrm{d}r = \left[\frac{k}{2} r^2\right]_{0}^{1} = \frac{k}{2}$, so $k = 2$. Integrating the density also gives the cumulative distribution function, $F(r) = r^2$, and that’s really what we want for a particular simulated choice of $r$: draw a uniform value $u$ and apply the inverse cumulative distribution function. Specifically, $u = r^2$, so $r = \sqrt{u}$. And when that’s done we get:
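Both samplers, the biased “gut feel” one and the corrected one using $r = \sqrt{u}$, can be sketched in Python (function names are mine):

```python
import math
import random

def disk_point_naive():
    """Biased: a uniform radius piles points up near the center."""
    theta = random.uniform(0.0, 2.0 * math.pi)
    r = random.uniform(0.0, 1.0)
    return (r * math.cos(theta), r * math.sin(theta))

def disk_point_uniform():
    """Uniform on the disk: invert the CDF F(r) = r^2, so r = sqrt(u)."""
    theta = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(random.uniform(0.0, 1.0))
    return (r * math.cos(theta), r * math.sin(theta))
```

One quick check: under the corrected density $f(r) = 2r$, the mean radius is $\int_0^1 r \cdot 2r\,\mathrm{d}r = 2/3$, versus $1/2$ for the naive sampler.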

Okay, so what does this have to do with the Internet?

A lot of present-day assessment of the Internet is done using two basic tools, ping and traceroute. Both have as a key element the idea that a packet is sent towards some target. An engineering feature of such packets on the Internet is that they contain a piece of control information called the time to live, or TTL. This is woven into the fundamental fabric of the Internet so packets don’t just flood it and make it useless. The basic idea is that when a node on the Internet receives a packet not intended for it, it decrements the TTL by one, overwriting that field with the decremented value, and then sends the revised packet on its merry way towards the target. Should the decrement result in a TTL of zero, however, rather than forwarding the packet, the node crafts a letter to the original sender saying in effect “You did not affix sufficient postage”. That letter is called a TTL-exceeded ICMP message, and it contains, among other things, the address of the node at which the TTL-exceeded event occurred. That’s good, because it tells the sender (us!) how far the packet went, and that’s exactly how ping and traceroute are used to explore the Internet. Senders don’t know these addresses in advance, so to discover what addresses are out there and who’s connected to whom, traceroute sends packets with successively greater TTLs out towards the target, until it is reached.

The transition of a packet from one node to another along its way to a target is called a hop. Traceroutes are devices for elucidating the hops taken to arrive at a target.

Now, a ping is like a traceroute except that it sends a packet to a specific recipient, which responds with the time the packet was received. This lets engineers do things like measure latency. But, as you might imagine, ping packets also have TTLs, and in fact you can think of the interior of a traceroute’s loop as doing a ping.

If an engineer wants to explore the structure of the Internet, traceroutes are good for getting basic structure. (There are offline sources as well, not important for this post.) But suppose the engineer wants a representative set of addresses across the Internet for a study, say, a representative sample of all addresses within some number of hops of an origin or vantage point. The argument about the disk above says that tabulating all the addresses seen within that number of hops, together with their frequencies, yields a biased representation. If all the addresses at a TTL value of $j$ are collected, and all the addresses at $j+1$, and so on, this amounts to uniform sampling in TTL. But TTL is just a kind of distance, so, given what was shown about disks above, if this is the sampling of TTLs done or kept after a campaign of traceroutes or pings, the nodes closer to the vantage point are overrepresented in comparison with ones farther away. Accordingly, whatever statistics are collected are heavily biased by the choice of vantage point, more than they would be if visibility were the only concern.

So, the argument above suggests a remedy. Rather than doing or keeping addresses associated with every TTL, addresses should be kept for TTLs up to some maximum (say 40), retaining a TTL only when it equals the rounded value of $40 \sqrt{u}$ for a uniform random draw $u \sim \mathcal{U}(0,1)$. Otherwise, the annular bias shown in the first figure will afflict the measurements taken. This can be done by postprocessing, or it can be incorporated into the sampling plan.
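As a sketch, and assuming a maximum probing horizon of 40 hops, the TTL draw reuses exactly the inverse-CDF trick from the disk (names here are mine):

```python
import math
import random

MAX_TTL = 40  # assumed probing horizon, in hops

def sample_ttl(max_ttl=MAX_TTL):
    """Draw a TTL with probability growing roughly linearly in distance:
    round(max_ttl * sqrt(u)), the same inverse-CDF trick as for the disk,
    which offsets the overrepresentation of nodes near the vantage point."""
    u = random.random()
    return round(max_ttl * math.sqrt(u))
```

A postprocessing keep/drop rule is then: retain a recorded address at TTL $t$ only when a fresh draw of `sample_ttl()` equals $t$.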

In fact, there are advantages to incorporating this into the sampling plan. Nodes and paths leading to them excessively close to the vantage point are oversampled in the original plan, and unnecessarily so. This costs in unwarranted network load, and in complaints of abuse by network nearest neighbors. If the sampling plan can incorporate the square root factor, then the effort and load applied to that section of the network can be reallocated more usefully, at addresses farther away, with no additional cost.

See? Math rules.

## Generation: Westwood Studios, September 2016

(Click on image to see a larger figure, and use browser Back Button to return to blog.)


As mentioned before, you can watch the generation yourself.

## Who paved the roads?

Professor Tony Seba of Stanford University is a great leader, visionary, speaker, and business expert. He often starts his talks with two successive public domain images to illustrate technological and business disruption. These are shown below.

One is a photograph of Fifth Avenue in New York City on Easter morning in 1900. The second is a photograph from almost the same place on Easter morning in 1913. Professor Seba’s point, and in part mine, is that in the first, transportation by the relatively wealthy is dominated by horse-and-buggy. In the second, a mere 13 years later, it is dominated by the automobile.

My point and question relate to the complaint of some who, apparently ignoring or disregarding the tremendous subsidies provided to fossil fuels via tax incentives, direct subsidies, permission to drill on public lands, and the grant of eminent domain power to their distribution networks, argue that zero-Carbon energy, principally wind and solar, is an unfair competitor because it is heavily subsidized by local, state, and federal governments.

To that point and complaint, I refer to these pictures and note that the road in 1913 is paved, in contrast with the dirt road of 1900. My question is Who built and paid for the paved road?

The people who owned the cars were relatively wealthy, and were not in the majority. They did not pay for the paved roads out of their own pockets. The paved roads were key to the spread of the automobile, because the rough, bumpy roads literally shook early models apart. So, in order for automobiles to spread, something had to be done about roads, and that was expensive. Facts are, governments did something about it. In this case, it was New York City.

But note this was done in the same span of time that the automobile was adopted, obsolescing the horse-and-buggy, and changing forever the way that a City, like New York, would think about transport.

And, to my mind, there is no difference between that and the subsidies given to wind, solar energy, and energy storage.

## Republican Governor Charles D. Baker, The Commonwealth of Massachusetts: On CLIMATE

An Executive Order, No. 569

ESTABLISHING

AN INTEGRATED

CLIMATE CHANGE STRATEGY

FOR THE COMMONWEALTH

WHEREAS, climate change presents a serious threat to the environment and the Commonwealth’s residents, communities, and economy;

WHEREAS, extreme weather events associated with climate change present a serious threat to public safety, and the lives and property of our residents;

WHEREAS, the Global Warming Solutions Act (the “GWSA”) directs the Secretary of Energy and Environmental Affairs and the Department of Environmental Protection to take certain steps to reduce greenhouse gas emissions and prepare for the impacts of climate change, including setting statewide greenhouse gas emissions limits for 2020, 2030, 2040 and 2050;

WHEREAS, the statewide greenhouse gas emissions limit for 2020 is 25% below the 1990 level of emissions and the corresponding limit for 2050 is 80% below the 1990 level of emissions, but no interim limits have yet been set for 2030 or 2040;

WHEREAS, the Commonwealth can provide leadership by reducing its own emissions from state operations, planning and preparing for impending climate change, and enhancing the resilience of government investments;

WHEREAS, the transportation sector continues to be a significant contributor to greenhouse gas emissions in the Commonwealth, and is the only sector identified through the GWSA with a volumetric increase in greenhouse gas emissions;

WHEREAS, the generation and consumption of energy continues to be a significant contributor to greenhouse gas emissions in the Commonwealth, and there is significant potential for reducing emissions through continued diversification of our energy portfolio and adoption of a comprehensive energy plan;

WHEREAS, on May 17, 2016, the Supreme Judicial Court ruled that the steps mandated by the GWSA include promulgation of regulations by the Department of Environmental Protection “that establish volumetric limits on multiple greenhouse gas emissions sources, expressed in carbon dioxide equivalents,” and that such limits must decline on an annual basis;

WHEREAS, while the ambitious goals for greenhouse gas emissions established by the GWSA will help to mitigate future climate change, strong and prompt action beyond emission reductions is required to meet the serious threats presented by climate change and associated extreme weather events;

WHEREAS, our state agencies and authorities, as well as our cities and towns, must prepare for the impacts of climate change by assessing vulnerability and adopting strategies to increase the adaptive capacity and resiliency of infrastructure and other assets;

WHEREAS, the Executive Office of Public Safety and Security and its constituent agencies, including the Massachusetts Emergency Management Agency, have deep institutional expertise in preparing for, responding to, and mitigating damage from natural hazards; and

WHEREAS, only through an integrated strategy bringing together all parts of state and local government will we be able to address these threats effectively;

NOW, THEREFORE, I, CHARLES D. BAKER, Governor of the Commonwealth of Massachusetts, by virtue of the authority vested in me by the Constitution, Part 2, c. 2, Section 1, Art. 1, do hereby order as follows:

• Section 1. The Secretary of Energy and Environmental Affairs shall coordinate and make consistent new and existing efforts to mitigate and reduce greenhouse gas emissions and to build resilience and adapt to the impacts of climate change. To achieve these objectives the Secretary shall lead the efforts set out in this Executive Order, and shall:

• a. continue to consult the GWSA Implementation Advisory Committee for advice on greenhouse gas emission reduction measures, including recommendations on establishing statewide greenhouse gas emissions limits for 2030, and 2040 pursuant to Section 3(b) of Chapter 21N of
the General Laws by December 31, 2020 and December 31, 2030, respectively;
• b. expand upon existing strategies for the Commonwealth to lead by example in making new, additional reductions in greenhouse gas emissions from Government operations;
• c. work, in consultation with the Secretary of Transportation, with New England and Northeastern state transportation, environment and energy agencies to develop regional policies to reduce greenhouse gas emissions from the transportation sector consistent with meeting the GWSA’s 2050 and interim emissions limits;
• d. continue to lead on reform of regional wholesale electric energy and capacity markets to ensure that state mandates for clean energy are achieved in the most cost-effective manner;
• e. publish, within two years of this Order, and update every five years thereafter, a comprehensive energy plan which shall include and be based upon reasonable projections of the Commonwealth’s energy demands for electricity, transportation, and thermal conditioning, and include strategies for meeting these demands in a regional context, prioritizing meeting energy demand through conservation, energy efficiency, and other demand-reduction resources in a manner that contributes to the Commonwealth meeting each of these limits; and
• f. ensure that efforts to meet greenhouse gas emissions limits are consistent with and supportive of efforts to prepare for and adapt to the impacts of climate change and extreme weather events as detailed in Section 3 of this order.

• Section 2. The Department of Environmental Protection shall promulgate final regulations that satisfy the mandate of Section 3(d) of Chapter 21N of the General Laws by August 11, 2017, having designed such regulations to ensure that the Commonwealth meets the 2020 statewide emissions limit mandated by the GWSA. In order to ensure that the Department’s regulations meet this requirement on this schedule, the Department of Environmental Protection shall:

• a. establish an internet portal through which interested parties, including affected businesses and members of the public, may propose
regulatory approaches for the Department’s consideration;
• b. revise the Global Warming Solutions Act requirements for the Massachusetts Department of Transportation set forth in 310 C.M.R. 60.05 to establish declining annual aggregate emissions limits;
• c. consider limits on emissions from, among other sources or categories of sources, the following:
• (i) leaks from the natural gas distribution system;
• (ii) new, expanded, or renewed emissions permits or approvals;
• (iii) the transportation sector or subsets of the transportation sector, including the Commonwealth’s vehicle fleet; and
• (iv) gas insulated switchgear;

• d. publish, no later than December 16, 2016, the notice associated with these regulations as required by Section 5 of Chapter 30A of the General Laws; and
• e. hold, no later than February 24, 2017, a public hearing associated with these regulations as required by Section 5 of Chapter 30A of the General Laws.

• Section 3. The Secretary of Energy and Environmental Affairs and the Secretary of Public Safety shall coordinate efforts across the Commonwealth to strengthen the resilience of our communities, prepare for the impacts of climate change, and to prepare for and mitigate damage from extreme weather events. In order to facilitate this coordination, the Secretaries shall:

• a. within two years of this Order, publish a Climate Adaptation Plan that includes a statewide adaptation strategy incorporating:
• (i) observed and projected climate trends based on the best available data, including but not limited to, extreme weather events, drought, coastal and inland flooding, sea level rise and increased storm surge, wildfire, and extreme temperatures;
• (ii) guidance and strategies for state agencies and authorities, municipalities and regional planning agencies to proactively address these impacts through adaptation and resiliency measures, including guidance regarding changes to plans, by-laws, regulations, and policies;
• (iii) clear goals, expected outcomes, and a path to achieving results;
• (iv) approaches for the Commonwealth to lead by example to increase the resiliency of Government operations;
• (v) policies and strategies for ensuring that adaptation and resiliency efforts complement efforts to reduce greenhouse gas emissions and
contribute towards the Commonwealth meeting the statewide emission limits established pursuant to the GWSA; and
• (vi) strategies that conserve and sustainably employ the natural resources of the Commonwealth to enhance climate adaptation, build resilience and mitigate climate change;

• b. within one year of this Order, establish a framework for each Executive Office to assess its and its agencies’ vulnerability to climate change and extreme weather events, and to identify adaptation options for its and its agencies’ assets;
• c. within one year of this Order, establish a framework for each City and Town in the Commonwealth to assess its vulnerability to climate change and extreme weather events, and to identify adaptation options for its assets;
• d. provide technical assistance to Cities and Towns to complete vulnerability assessments, identify adaptation strategies, and begin implementation of these strategies;
• e. implement the Climate Adaptation Plan upon its completion; and
• f. update the Climate Adaptation Plan at least every five years, incorporating information learned from implementing the Plan and the
experiences of agencies, and Cities and Towns in assessing and responding to climate change vulnerability.
• Section 4. The Secretary of each Executive Office shall designate an existing employee to serve as the Secretariat’s Climate Change Coordinator. Each Climate Change Coordinator shall:

• a. serve as the Secretariat’s point person regarding climate change mitigation, adaptation and resiliency efforts;
• b. meet under the leadership of personnel from the Executive Office of Energy and Environmental Affairs and the Executive Office of Public Safety and Security to assist in the development and implementation of the Climate Adaptation Plan;
• c. within two years of this Order, assess the vulnerability to climate change and extreme weather events for the Coordinator’s Executive Office and for each agency within the Coordinator’s Executive Office and identify adaptation options for the assets of such Executive Office and agencies; and
• d. incorporate results from vulnerability assessments into existing policies and plans for the Executive Office and its agencies.

• Section 5. This Executive Order shall be reviewed no later than December 31, 2019, and every five years thereafter.

Given at the Executive Chamber in Boston this 16th day of September in the year of our Lord two thousand sixteen and of the Independence of the United States of America two hundred forty-one.

CHARLES D. BAKER

GOVERNOR

Commonwealth of Massachusetts

## “Predicting annual temperatures a year ahead” (Dr Gavin Schmidt at REALCLIMATE)

Dr Schmidt is essentially betting that the trend, seen as a random variable, will regress towards the smooth mean.

I have a post at Nate Silver’s 538 site on how we can predict annual surface temperature anomalies based on El Niño and persistence – including a (by now unsurprising) prediction for a new record in 2016 and a slightly cooler, but still very warm, 2017. The key results are summarized in the figures that show how residual variations in the global temperatures (after detrending) related to the ENSO phase at the beginning …

## XKCD tells it all

Alerted to the existence of the image by Tamino. The figure is due to the irrepressible Randall Munroe.

## Bastardi’s Bust

Famous climate denialist Joe Bastardi of WeatherBELL Analytics LLC, formerly of Accuweather.com, made a prediction on Arctic ice recovery back in 2010 (when at AccuWeather), and observations have since made his “studies” laughable.

I have heard his colleague, Joseph D’Aleo, speak at the Southern New England Meteorology Conference in 2015. Notably, he is also associated with the Heartland Institute, where he was/is a “resident expert”.

## ‘A Time To Choose’

Charles Ferguson and “A Time To Choose”.

(Much larger image available by clicking on photo. Use browser Back Button to return to blog.)

Trailer:

## Hermine Unique Among Storms

Post-tropical storm Hermine is the story of the emergence of weather chimeras.

Simple. The forecasting precedents have changed. We cannot look to the past to anticipate the future any longer. We’re playing by different rules. And we don’t know what their implications are, because the experiment has never been run before. We’re running it. And we have no idea what will happen.

But we’re continuing it nonetheless. This hasn’t been anticipated. This hasn’t been war-gamed.

See also Eric Holthaus’ opinion, and another view he has, about risk.

Welcome to the Hyper Anthropocene.

Hermine still developing. Predictions are for it to hold in place off the East Coast for several days, due to a blocking pattern known as a “Rex Block”.  This and many gems from Eric Holthaus’ update, excerpted here.

Unusually placed, for a weather/climate piece, at election/polling guru Nate Silver’s FiveThirtyEight.com.

Eric Holthaus at FiveThirtyEight:

Based on the current forecasts, Post-Tropical Cyclone Hermine is a storm without a good historical comparison. Hermine was once a tropical cyclone that made landfall in Florida, but that seems like ages ago. It has now transitioned to its post-tropical stage after moving northeast across land, off the coast of North Carolina, where it’s partially drawing energy from the jet stream. Hermine is forecast to affect the Mid-Atlantic over the next several days as a hurricane-strength storm, with a potentially historic coastal flood.

Of the 10 or so meteorologists I’ve talked to in the last…

View original post 1,024 more words

## Once more, with feeling: Responding to Kostrzewa in The Providence Journal

It’s making the rounds. Today it’s John Kostrzewa, Assistant Managing Editor of The Providence Journal, arguing the necessity of natural gas and its pipelines with his “Why R.I.’s economy needs a natural-gas pipeline”. And my response, below, which allowed me to dig a little deeper into these matters than I had time to do yesterday with the same kind of response for Massachusetts. The thing about the response for Rhode Island was that the character count for a response is severely constrained, meaning I could not document as many of my assertions as I would have liked. I have included them in the post below.

Mr Kostrzewa piles on to the usual arguments supporting the expansion of natural gas in New England, this time focussing upon Rhode Island. Prices are high because there’s insufficient energy. Electricity prices are high, especially in winter because there’s insufficient natural gas. Businesses need energy for growth, and most importantly for creating jobs. Natural gas produces jobs.

All these are myths.

The fundamental fact about prices for a kilowatt-hour (“KWh”) of electricity in New England is that, per person, we use less electricity. The expenses of the inefficient and old grids dating from the 20th century are spread over fewer KWh, so cost per KWh is higher. If total cost of electricity paid per month is compared with other states, efficient New England and Rhode Island ride pretty low. The cost for Rhode Island is $107/month, the 11th cheapest in the entire country, compared with D.C. and New Mexico, which pay $82/month and $88/month, respectively, and South Carolina and Hawaii, which are tied for a whopping $177/month. Massachusetts pays $115/month and is the 16th cheapest. (These figures are available here.) Natural gas friendly Wyoming also pays $107/month for electricity, and that’s not because electricity per KWh is expensive. It’s not. The U.S. Energy Information Administration (“EIA”) gives it at $0.115 per KWh, compared with Rhode Island’s $0.18 per KWh. Wyoming simply uses more per person. (Massachusetts electricity costs $0.1906 per KWh.)

One might as well argue that natural gas is responsible for the high electric rates, since it provided 94% of Rhode Island’s electricity in 2014 and 95% in 2015. Oil actually increased its share from 2014 to 2015, from 1.4% to 1.5%. Waste-to-energy facilities produce 3%, and renewables a mere 0.4%. How much more gas can Rhode Island use? At most 6%. (See EIA data for all these.) Think building pipelines in Rhode Island is meant to help Rhode Island? No. This is Spectra/Algonquin madly trying to make up for the setbacks they’ve received on their Access Northeast pipeline project, before FERC shuts them down.

## Gustin and companies lack technological and business imagination

Carl Gustin, a consultant to the New England Coalition for Affordable Energy, which “includes many of New England’s major business and industry organizations and labor representatives”, wrote an op-ed in favor of additional natural gas and pipelines for Massachusetts in Commonwealth’s online magazine today. I posted a detailed rebuttal and, after an hour or so online, it was removed. I am reproducing it below. And here, on my blog, I can say more of what I think.

Mr Gustin and his New England Coalition are shills for fossil fuel energy companies that find themselves threatened with the surge in renewable energy. Clearly, like some presidential candidates, they have really thin skins.

Mr Gustin’s depiction of the electricity generation situation in Massachusetts and Texas is misleading at best, and, judging by the actual numbers at the U.S. Energy Information Administration, consists of cherry-picking from sources and years which make his case. Of course, that does not depict reality.

Let’s take Texas, for example. Its electrical energy needs in 2015 were 14 times larger than Massachusetts’s. It got 10% of its energy from wind, and a minuscule amount from solar (0.09%). It got 53% of its energy from natural gas and 9% from nuclear. Yet despite the large contribution from wind and large commitment to natural gas, over 2015 there was little correlation between use of natural gas and availability of wind (-0.3). That means, no, there is no offsetting of energy with natural gas when winds don’t blow. Clearly the Wall Street Journal article was incorrect in its 16% of energy claim. And while Texas might be expecting a “huge surge in solar capacity”, that’s because it presently has essentially none, so of course if you start from a baseline of near zero, any growth seems like a lot. And one wonders what that false emphasis means about the perspective and motivations of the author.

In fact, in 2015, and in contrast and despite the difference in latitude and weather, Massachusetts got a full 2% of its energy from solar. Indeed, in 2015 it got three times as much energy from solar as from wind, despite all the ballyhooing about offshore wind and onshore turbines. Since 2014, the amount of electrical energy Massachusetts generated from solar has doubled. In the first 6 months of 2016, not the sunniest season, it STILL got 2% of its electricity from solar. Wind generation, in contrast, remained flat from 2014 to 2015, at 0.7%. Solar could keep doubling or, at least, there’s no technical reason it could not. Any of the fears about grid instability don’t arise until solar is about 14% of total generation, and if it doubles each year, that’s 3-5 years away, probably 5 years, since Beacon Hill has it in a stranglehold.

And what of the offsetting of nuclear and coal with renewable energy in Massachusetts? Compared to Texas, we don’t have any to speak of. Surely, there is no evidence that even the doubling of solar caused the decrease of Massachusetts electrical generation by nuclear from 18% to 15% and by coal from 9% to 7%. In fact it is mathematically impossible that renewables affected nuclear and coal generation AT ALL. In contrast, electrical generation using natural gas increased from 58% in 2014 to 64% in 2015. There’s your nuclear- and coal-killer.

And it is no wonder that GE is not a leader in renewable energy generation, at least not any longer. That field is dominated by Vestas, Siemens, Alstom Wind, Hitachi, and the surging Chinese producers CNR, CSIC, and Ming Yang. Placing bets on coal, against the outlooks of financial analysts like Bloomberg, is foolhardy, especially if, as Mark Carney, the Bank of England governor, suggests, these assets are likely to be stranded by regulation and insurance costs. If the op-ed is correct (and there’s plenty of evidence in its sloppy use of numbers elsewhere that it is not), GE may be turning to “clean coal” because it does not know how to compete in any other market.

The citizens of the Commonwealth ought not to be fooled. Companies deeply invested in fossil fuels are terrified that their markets and industries will encounter “Minsky moments”, causing their assets and prices to suddenly plummet. Why else, despite natural gas’s stranglehold on a full 64% of electrical generation in Massachusetts, are the utilities and those companies crying “the sky is falling”, as Mr Gustin is? Building generation and pipelines is for them an existential struggle, and they are trying to force governments to do “sunk cost buy-ins” of their assets so that, no matter what disasters unfold, those governments will be stuck with their long-term investments. They are, despite their pleas, no friends to renewable energy; they are no “bridges to the future”. Their business plans do not have any renewables-accelerated depreciation schedules or phase-out timetables. They want continued revenues.

In any case, as IBM, Kodak, and Barnes & Noble have painfully learned, if technology is on your competitors’ side, there is no winning against it, even if you, as a company, own the government. And it’s silly for citizens to bet on the wrong side, even if some will.

Late breaking: Why renewables are good for business, despite some claims otherwise.

Update, 2016-09-01, 22:21 EDT

I had a look at GE’s 10-K for 2015. After Mr Gustin and, presumably, his source, Rakesh Sharma at Investopedia, quoting the Wall Street Journal, gushed about GE’s pursuit of coal, both of them neglected to read that 10-K which clearly states what GE wants from Alstom:

A new segment named Renewable Energy was created that includes GE’s legacy onshore wind business and the wind and hydro businesses acquired from Alstom.

GE Renewable Energy makes renewable power sources affordable, accessible, and reliable for the benefit of people everywhere. With one of the broadest technology portfolios in the industry, Renewable Energy creates value for customers with solutions from onshore and offshore wind, hydro, and emerging low carbon technologies. With operations in 40+ countries around the world, Renewable Energy can deliver solutions to where its customers need them most.

• Onshore Wind – provides technology and services for the onshore wind power industry by providing wind turbine platforms and hardware and software to optimize wind resources. Wind services help customers improve availability and value of their assets over the lifetime of the fleet.
• Digital Wind Farm is a site level solution, creating a dynamic, connected and adaptable ecosystem that improves our customers’ fleet operations.
• Offshore Wind – offers its high-yield offshore wind turbine, Haliade 150-6MW, which is compatible with bottom-fixed and floating foundations. It uses the innovative pure torque design and the Advanced High Density direct-drive Permanent Magnet Generator. Wind services support customers over the lifetime of their fleet.
• Hydro – provides full range of solutions, products and services to serve the hydropower industry from initial design to final commissioning, from Low Head / Medium / High Head hydropower plants to pumped storage hydropower plants, small hydropower plants, concentrated solar power plants, geothermal power plants and biomass power plants.

Renewable energy is now mainstream, able to compete with conventional options on an unsubsidized basis in many locations today. New innovations such as the digitization of renewable energy will continue to drive down costs. Worldwide competition for power generation products and services is intense. Demand for power generation is global and, as a result, is sensitive to the economic and political environments of each country in which we do business. Our Wind business is subject to certain global policies and regulation including the U.S. Production Tax Credit and incentive structures in China and various European countries. Changes in such policies may create unknown impacts or opportunities for the business.

Yep, cherry-pickin’ at its worst.

About the only business GE is not in is solar PV.

What a shame. It was a good company, that GE, in its day:

## NextGen VOICES: ‘On data’, ‘On setbacks’, and ‘On discovery’

Science Magazine has a periodic column called Science in brief, and occasionally that column features a set of what they call “NextGen VOICES”, meaning young scientists. They gather the responses using Twitter (of course) via the hashtag #NextGenSci. For the week of 1st July 2016, Science wrote:

In April, we asked young scientists to use exactly six words to create a story about the life of a scientist in your field. We received almost 400 responses, some frustrated, some inspiring, some humorous, and all describing a life unique to a scientist. We have printed some of the most interesting responses here.

Here are some excerpts, from topics of interest to me.

On data

Big data! Clean: No statistical power.
Abhishek Noroula, Bioinformatics, Sweden

Data overload: Juggling balls, many fall.
Noa Sher, Cell Therapy, Israel

P equals 0.051? Repeat? Abandon? Bayes?
Rosa Li, Psychology and Neurosciences, USA

On setbacks

Mice eaten by cats, graduation delayed.
Chenggang Yan, Intelligent Information Processing, China

Exciting new result! No … coding mistake.
Frank X. Vasquez, Chemistry, USA

Results were promising, until they weren’t.
David Edward Gilbert, Energy and Environmental Genomics, USA

On discovery

Scientist, looking closely, mistakenly finds truth.
Joshua Isaac James, Digital Forensic Science, South Korea

## ‘The Future of Energy’

Writing in Newsweek, Kevin Maney talks about the Future of Energy and Elon Musk’s long-term plan to kill Big Oil.
Hat tip to Peter Sinclair at Climate Denial Crock of the Week, where I first found mention of the article, and to Rawstory, which reprinted Maney’s article from Newsweek with permission.

I am not posting an excerpt because Newsweek wants their income, but it’s good to read this, since I have learned and argued similar things elsewhere on this blog.

The largest rooftop PV installation in the Hannover region, and one of the largest installations in all of Niedersachsen at the time of its completion.

## “Sharon’s Water Problem” (by Paul Lauenstein)

(Click on image to see a bigger version of this figure. Use your browser Back Button to return to this blog.)

The town of Sharon, MA, has a water problem. Click on the link and see Paul’s presentation about it.

Many places have water problems, but Sharon’s is severe, and emblematic of the poor planning and mismanagement of natural resources which characterizes local, regional, state, and federal governance.

As climate changes, it will get worse.

After all, there isn’t that much water to be had.

## “Getting our heads out of the sand: The facts about sea level rise” (Robert Young)

If current luck holds, North Carolina may well escape the 2013 hurricane season without the widespread damage that has so frequently plagued the fragile coastal region in recent years. Unfortunately, this brief respite is almost certainly only that — a temporary breather.

Experts assure us that the impacts of climate change (including rising oceans and frequent, damaging storms) are sure to remake the coast in myriad ways over the decades to come and will, quite likely, permanently submerge large tracts of real estate.

So, what does our best science predict? And what can and should we do — especially in a state in which policymakers have actually passed a law denying that sea level rise is even occurring?

Dr. Robert Young of Western Carolina University, professor of geology, an accomplished author and a nationally recognized expert on the future of our developed shorelines, explores answers to these and related questions.

NC Policy Watch presents — a Crucial Conversation Featuring Dr. Robert S. Young, professor of geology and Director of the Program for the Study of Developed Shorelines at Western Carolina University.

See their Storm Surge Viewer, especially if you are interested in buying or developing shoreline property.

## Time to turn page on natural gas – CommonWealth Magazine

Related headlines from CommonWealth’s “Plugging In” (Energy and the Environment) section:

• “Gas pipeline firm says it’s full-speed ahead”, by John Flynn, Lee Olivier and Bill Yardley
• “SJC nixes ‘pipeline tax’”, by Bruce Mohl
• “Energy bill a solid step(…)”

Also see this, and this.

## Eversource withdraws from the Spectra-Algonquin “Access Northeast” pipeline project

(Click on image to see a bigger copy. Use browser Back Button to return to blog.)

Yes!

Now let’s hope the remaining customers for Spectra’s Access Northeast pull out, and FERC denies permission to proceed. Their next meeting is 22nd September 2016.

Update, 2016-08-24

National Grid and the rest of the utilities have pulled out of Spectra-Algonquin’s Access Northeast.

More.

## “Naïve empiricism and what theory suggests about errors in observed global warming”

A post from one of my favorite statistics-oriented bloggers, Variable Variability, dealing with a subject too casually passed over.

## “Understanding Climate Change with Bill Nye”, on Dr Neil deGrasse Tyson’s “Star Talk”

Bill Nye hosts Dr Neil deGrasse Tyson‘s Star Talk Radio, featuring climate change and NASA’s Dr Gavin Schmidt. (See also RealClimate.)

## ECS2x, land, sea, and all that

from http://dx.doi.org/10.1126/science.1203513

P.S. I wrote more here. Reproduced below …

Practical likelihood functions are very flat-topped, so the idea that a maximum likelihood estimate (MLE) can be confined to a point is a theoretical mirage. See Chapter 3 of S. Konishi, G. Kitagawa, Information Criteria and Statistical Modeling, Springer, 2008. Even if you want to set aside Bayesian considerations, whose priors tend to sharpen the posteriors, the best you can do is expected likelihoods, because likelihoods in practice, just like p-values, are random variables. Accordingly, the MLE is a neighborhood, because a point has probability mass zero.

Besides, … the question of multimodality [wasn’t addressed]. Actual Expected Climate Sensitivity is a combination of the densities over oceans and land, each of which has different distributions and modes. (See https://goo.gl/pB7H24 which is from http://dx.doi.org/10.1126/science.1203513.) Accordingly, their combination is (at least) bimodal. Ocean ECS has 4 modes. Land ECS has 2 modes, one slightly higher than the other, the higher being at +3.4°C and the second at about +3°C. Worse, the variance of land ECS is over twice that of the oceans.
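To make the multimodality point concrete, here is a minimal numerical sketch in Python. The mixture weights, mode locations, and spreads below are hypothetical, chosen only to illustrate the effect (they are not the fitted densities from the cited paper): combining a bimodal land density with an ocean density yields a combined density that is itself multimodal, so a single-point MLE summary hides structure.

```python
import numpy as np

# Hypothetical densities: the parameters below are illustrative only,
# NOT the fitted land/ocean ECS densities from the cited paper.
def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def land_density(x):
    # Two modes, the higher near +3.4C and a second near +3.0C
    return 0.55 * normal_pdf(x, 3.4, 0.15) + 0.45 * normal_pdf(x, 3.0, 0.15)

def ocean_density(x):
    # One broad mode near +2.0C, for illustration
    return normal_pdf(x, 2.0, 0.5)

def combined_density(x, w_land=0.3):
    # Area-weighted combination of the land and ocean densities
    return w_land * land_density(x) + (1.0 - w_land) * ocean_density(x)

x = np.linspace(0.0, 6.0, 2001)
y = combined_density(x)

# Interior local maxima of the combined density: there are several,
# so reporting one MLE point misrepresents the distribution.
modes = [x[i] for i in range(1, len(x) - 1) if y[i - 1] < y[i] > y[i + 1]]
print([round(m, 2) for m in modes])
```

The combined density has a mode near the ocean peak plus both land modes, even though each piece looks simple on its own.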

Finally, what you should be looking at is the ECS2x over land, not combined. Even granting that you want to go with the location of the highest mode, that’s +3.4°C.

## A litany of climate depression

Bill Nye talks about the show, here.

## “Disrupt climate disruption”

From Science Music Videos

And if you have the time, a 52 minute movie …

Power concedes nothing without a demand.

“No leader is coming to save us [from climate disruption].”

## Carbon Sinks in Crisis — It Looks Like the World’s Largest Rainforest is Starting to Bleed Greenhouse Gasses

This is the kind of thing that’s expected of a +3°C world, although the idea of it being a threshold phenomenon is a bit unrealistic. Rather, it’s expected that these sinks might, overall, start releasing their sequestered Carbon, one here one year, another there another year. But if the biggest sinks start releasing theirs first, well, this is one of the Climate Surprises the IPCC and the U.S. National Climate Assessment talk about. And they are not at all good.

Back in 2005, and again in 2010, the vast Amazon rainforest, which has been aptly described as the world’s lungs, briefly lost its ability to take in atmospheric carbon dioxide. Its drought-stressed trees were not growing and respiring enough to, on balance, draw carbon out of the air. Fires roared through the forest, transforming trees into kindling and releasing the carbon stored in their wood back into the air.

These episodes were the first times that the Amazon was documented to have lost its ability to take in atmospheric carbon on a net basis. The rainforest had become what’s called carbon-neutral. In other words, it released as much carbon as it took in. Scientists saw this as kind of a big deal.

This summer, a similar switch-off appears to be happening again in the Amazon. A severe drought is again stressing trees even as it is fanning…

View original post 1,144 more words

## An Energy Revolution

Professor Mara Prentiss speaks at Harvard on the possibility of an “energy revolution”:

Update, 2016-08-16

Although I am not a PhD professor like Professor Prentiss, nor am I associated with an institution as esteemed as Harvard University, I disagree with her point regarding the need for natural gas, based upon my studies of the solar energy business during the last two years. I also assuredly hope that is not the only way to transition to zero Carbon energy, because from what I know of the climate science, we could be in very, very serious trouble if we need to go through an intermediate CO2-spewing step which will persist for another few decades.

## Can the City of Boston adapt to and help mitigate climate disruption?

(See the major update at the bottom of this post as well.)

(On “Less Science and More Social Science” at And Then There’s Physics)

And Then There’s Physics is one of my favorite blogs discussing climate disruption and related policy (in my climate blogroll, for instance). There was a recent post regarding another post by a science blogger called Stoat (one William Connolley) on the limitations of science for dictating mitigation and adaptation policy. Read there for context.

But along the way a commenter, mt (13th August 2016 at 6:25 p.m.), citing the 1979 ‘Charney Commission’ report, suggested science can and has done little more, even with the IPCC. I composed a Comment which suggested at least one city, Boston, was trying to enlist science in its detailed response and planning.

That Comment apparently did not make it through moderation at ATTP, or got lost through a technical glitch, as sometimes happens. I worked on it a bit, so am reproducing it here instead. (As can be seen by the Comment below, apparently there was a technical glitch, and that Comment has now been posted.)

@mt,

Well, the City of Boston is engaged in a pretty deliberate process to ascertain climate impacts, what City policy and planning should dictate, especially with respect to sea level rise and storm surge, and needed investments. It is informed by science and by climate projections for Boston, followed by three additional reports: an Integrated Vulnerability Assessment, a detailing of Resilience Strategies, and a Final Report and Implementation Roadmap. The last three appear to be late, but there is a hard stop of sorts in the form of a Climate Vulnerabilities & Solutions Symposium on the 15th of September, which I am attending.

Attendees will include representatives from local financial firms, banks, insurers and re-insurers, as well as businesses, utilities, real estate people, government people, NGOs and attorneys. There already was a presentation of the Climate Projections Consensus at which there were many representatives of these stakeholders.

Come September, it will be interesting to see how these groups think about the problem, and where they are landing in terms of a mix of the three basic choices,

1. wait-and-see, with willingness to take on and deal with damage as it comes,
2. make some preparations, but basically remain-in-place, or
3. prepare to abandon the present location of the City, and begin preparations to assess where to go.

This is in part because:

I like Bank of England head Mark Carney’s description that “Climate change is an economic problem.” He wants to avoid a Minsky moment.

The economy is a wholly owned subsidiary of the environment, not the other way around.

Gaylord Nelson

It should be noted as well that the City of Boston has volunteered itself to host a United States-China Climate Summit in 2017. Whether that applies additional pressure, as I suspect, or gives the City cover for delay and greenwashing is anyone’s guess. We’ll see in September.

Update, 2016-08-28

It might be slightly premature, but it seems, as of today, that the answer to the rhetorical question posed in the headline of this post is “No”: the City of Boston does not know how, or does not want, to adapt to climate change, including sea level rise.

The basis for my conclusions is the recent history of Climate Ready Boston reports:

1. Spring 2016, “Climate Projections Consensus” (in hand and available)
2. Coming, Summer 2016, “Integrated Vulnerability Assessment: Assessing the potential impacts of climate change on Boston’s buildings, infrastructure, environmental systems, and communities” (missing in action)
3. Coming, Summer 2016, “Resilience Strategies: Developing preliminary ideas for projects, policies, and programs to help Boston’s neighborhoods and infrastructure respond to climate change and become more resilient” (missing in action)
4. Coming, Summer 2016, “Final Report And Implementation Roadmap: Pulling the findings and initiatives together with a roadmap to address major vulnerabilities” (missing in action)

There is a 5.5 hour “symposium” scheduled for 15th September 2016, titled “Boston’s Climate Vulnerabilities & Solutions Symposium”, long planned. Presumably these reports were to be completed in order to inform this symposium. Instead, the agenda for the symposium consists of:

1. Opening remarks
2. An overview of Climate Ready Boston
3. Resilience Interventions In Boston: Existing Buildings, New Construction & District Solutions. This includes the following speakers and panel members:
• John Cleveland, Director, Boston Green Ribbon Commission
• John Messervy, Director of Capital & Facility Planning, Partners Healthcare
• Ben Myers, Sustainability Manager, Boston Properties
• Jeff Wechsler, Marketing Director-Acquisitions, Tishman Speyer
4. Refreshments & Vendor Expo
5. Financing And Policy Solutions For Resilience. This includes the following speakers and panel members:
• Michael E. Mooney, Chairman, Nutter McClennen & Fish LLP
• Rebecca Davis, Deputy Director, MAPC
• John Markowitz, Vice President – Infrastructure Finance, MassDevelopment
• Sara Myerson, Director of Planning, BRA
6. Closing Remarks, by Austin Blackmon, Chief of Environment, Energy & Open Space, City of Boston

Note:

• No major political figures
• No representatives from large financial firms, or insurance firms. The financial district is located a block or two from the Atlantic Avenue Wharf, and is more or less downhill from it
• No local property owners from Atlantic Avenue, people who were in attendance at a HUCE presentation and panel discussion on getting Boston ready for climate change. That meeting included discussions of planning to move the City.

I can only speculate why this process is deflating. Even at the time of the HUCE meeting, it was clear Boston was not taking measures for preparedness as much as, say, the City of Cambridge is. Of the three possible responses to sea-level rise, in the absence of any other statement, Boston has committed to wait-and-see and remain-in-place. Unfortunately, this also means that commercial and other development along the Boston waterfront will continue as if nothing is going to happen.

So, in retrospect, the commenter @mt was correct and I was wrong. And Boston is taking the path of the lottery player.

I have cancelled my registration for the symposium and will, instead, be attending the 2016 Cleantech Energy Storage Finance Forum that evening.

## Repaired R code for Markov spatial simulation of hurricane tracks from historical trajectories

I’m currently studying random walk and diffusion processes and their connections with random fields. I’m interested in this because at the core of dynamic linear models, Kalman filters, and state-space methods there is a random walk in a parameter space. The near-term goal is to understand how to apply these techniques to series of categorical data, and understand the relationship between, say,
W. Li, “Markov chain random fields for estimation of categorical variables”, Mathematical Geology (2007) 39: 321–335, DOI 10.1007/s11004-007-9081-0,
and
K. V. Mardia, C. Goodall, E. J. Redfern, F. J. Alonso, “The kriged Kalman filter”, Test, December 1998, 7(2), 217–282, DOI 10.1007/BF02565111.
Along the way, I came across a mention of work done by Christophe Denuse-Baillon in his actuarial thesis which appears to study and describe a method of simulating North Atlantic hurricane tracks using a Markov spatial process calibrated with historical tracks. The idea, essentially, is to make a random walk from a suitably chosen starting point, taking steps conditional upon relative frequencies of directions taken from the current step. In other words, the Markovian assumption of conditional independence is made so the next step depends only upon the probabilities of taking a step in directions centered on the current one.
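The stepping rule just described can be sketched briefly. This is a minimal illustration in Python (Professor Charpentier’s code, and the code offered with this post, are in R; the sketch is only to convey the idea), using a hypothetical turn distribution rather than relative frequencies estimated from historical tracks:

```python
import math
import random

# Hypothetical turn distribution: given the current heading, the next step
# turns left, continues straight, or turns right. In the actual method these
# relative frequencies are estimated from historical hurricane tracks.
TURN_CHOICES = [-45.0, 0.0, 45.0]   # degrees, relative to current heading
TURN_WEIGHTS = [0.25, 0.50, 0.25]   # made-up relative frequencies

def simulate_track(start_lon, start_lat, heading_deg, n_steps, step_len=1.0, seed=None):
    """Markovian track: each step depends only on the current state."""
    rng = random.Random(seed)
    lon, lat, heading = start_lon, start_lat, heading_deg
    track = [(lon, lat)]
    for _ in range(n_steps):
        heading += rng.choices(TURN_CHOICES, weights=TURN_WEIGHTS)[0]
        lon += step_len * math.cos(math.radians(heading))
        lat += step_len * math.sin(math.radians(heading))
        track.append((lon, lat))
    return track

# One simulated track from a made-up Atlantic starting point
track = simulate_track(-60.0, 15.0, heading_deg=135.0, n_steps=25, seed=1)
print(len(track), track[-1])
```

The Markovian assumption is visible in the loop body: the distribution of the next step is conditioned only on the current heading, with no memory of earlier positions.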

I would not know the details, since Denuse-Baillon’s thesis is in French, but Professor Arthur Charpentier introduced the idea and provided some R code for implementing Denuse-Baillon’s method on actual hurricane tracks. The thing is, the datasets he drew upon have naming mistakes in them which need to be repaired in order to use them, and his blog post did not identify these, although he did note they existed. More seriously, he did not provide a complete, runnable set of R code to reproduce his results, something which would be quite useful to students of the technique and the problem.

This blog posting remedies that, providing runnable R code for replicating his results (a tarball), and fixing the mistakes in the datasets on the fly. Note that the R code relies upon the XML, maps, ks, and RColorBrewer packages, which also need to be installed.

The general technique is interesting in that it offers a non-parametric, indeed model-free, way of forecasting hurricane tracks.

I have reproduced the figures Professor Charpentier showed below:

Larger versions of these figures can be seen by clicking on any one of them, and then using your browser Back Button to return to the blog.

The R code has been cleaned up some, and made a bit more robust.

Robert Grant also remarked upon Professor Charpentier’s work.

Note that it is still possible that when trajectories are generated, if a poor starting point is chosen, the generation step can take a while.

I also want to make note of the spMC package by Luca Sartore, available via CRAN, which he describes here. Abstract:

Currently, a part of the R statistical software is developed in order to deal with spatial models. More specifically, some available packages allow the user to analyse categorical spatial random patterns. However, only the spMC package considers a viewpoint based on transition probabilities between locations. Through the use of this package it is possible to analyse the spatial variability of data, make inference, predict and simulate the categorical classes in unobserved sites. An example is presented by analysing the well-known Swiss Jura data set.
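The “transition probabilities between locations” viewpoint the abstract describes can be shown with a toy example. This is a minimal Python sketch (spMC itself is an R package), using a made-up categorical sequence, of estimating a one-step transition matrix by row-normalizing observed transition counts:

```python
from collections import Counter

# Made-up categorical sequence (think: lithology classes along a transect)
seq = list("AABBBCABBCCCAABBC")
states = sorted(set(seq))

# Count observed one-step transitions (pairs of adjacent classes)
pair_counts = Counter(zip(seq, seq[1:]))

# Row-normalize the counts into estimated transition probabilities
P = {
    a: {b: pair_counts[(a, b)] / sum(pair_counts[(a, c)] for c in states)
        for b in states}
    for a in states
}
for a in states:
    print(a, {b: round(p, 2) for b, p in P[a].items()})
```

Each row of `P` is an estimated conditional distribution over the next class, the same object a Markov chain random field generalizes to multiple spatial directions.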

## Living deliberately in Washington, D.C. (courtesy of The Atlantic magazine)

The adventures of Keya Chatterjee and her family living free of Pepco. Courtesy of The Atlantic magazine.

## Energy Democracy

I’ve actually written about this before, but John Farrell of the ILSR (the “Institute for Local Self-Reliance”; “Self-Reliance” is a famous Emerson essay, by the way) presents an up-to-date synthesis of developments, incorporating policy as well as Tony Seba-like, Hermann Scheer-like, and Michael Osborne-like insights.

By the way, the dreaded duck curve, which utility wonks and even some IEEE engineers from the PES I’ve listened to have fretted about (more details here), does not seem to be materializing in two of the strongest renewables markets in the United States.

Update, 2016-08-21

From CleanTechnica:

Distributed Power

It’s a matter of political and philosophical debate, but I agree with the idea that society is generally better off when socioeconomic and political power are distributed. (Granted, a benevolent dictator can be a wonderful gift for a society, but most dictatorships don’t tend to be very benevolent from what I’ve seen.)

While we do live in a somewhat democratic society (in the US, Canada, UK, Europe, Australia, India, Korea, Japan, or wherever you are probably reading from), there’s no doubt that money = power, and people with more money or representing more money have more power in politics and society.

With regard to this matter, we often think of powerful people and companies in the telecommunications, media, banking, and real estate industries. Clearly, though, these aren’t the only ones trying to steer more cash to their executives than to society as a whole.

Of course, with many utilities being regulated monopolies, these are powerful giants as well (no pun intended). Despite the fact that they are regulated, the vast majority of us can’t name the people who regulate them, and there is rampant corruption in the sector. Largely, we don’t even know what they’re doing. I think we typically take utilities for granted and leave their work almost invisible — they’re there, we have to pay them to keep the lights and computers on, someone is watching over them to make sure they don’t fleece us (too much), etc.

A more obvious “enemy of the societal good” is the fossil energy industry. Burning coal and natural gas kills millions of people prematurely every year. We somehow accept burning these fossils as a necessity of modern life (though, given the state of clean technologies like solar and EVs, we no longer should), but we also know that these industries work hard to not clean up their processes and emit less pollution. They lobby government and fight huge wars against regulation with millions and millions of dollars that could have just gone toward protecting more lives from pollution. But hey, what can we do?

To find out, read their article.

Update: 2016-08-23

From ILSR, again:

I wouldn’t presume to define energy democracy for all those using the term, but I think those of us that use it share these common principles:

• Energy democracy means both the sources (e.g. solar panels) and ownership of energy generation are distributed widely.
• Energy democracy means that the management of the energy system be governed by democratic principles (e.g. by a public, transparent, accountable authority) that allows ordinary citizens to have a say. This means that communities that wish greater control over their energy system (via municipalization of utilities, for example) should have minimal barriers to doing so.
• Energy democracy means that the wide distribution of power generation and ownership, and access to governance of the energy system be equitable by race and socioeconomic status.