“Build way more wind and solar ‘than needed’”

Many people familiar with traditional energy networks, including utility electrical grids, bring strong preconceptions to zero-carbon energy sources. This is particularly true of experts in traditional energy, including engineers. They focus upon the intermittency of such renewable sources. But intermittency is not really what they mean, since all power sources can go offline; they mean that operators cannot flip a switch and deliver energy, or back up a faulted source, on demand. It is a mindset.

Some traditional energy sources, particularly large nuclear power plants, can go offline on short notice, and, because of their size, the rest of the electrical grid must jump through hoops to respond and maintain load.

The preconceptions extend to what the consumer electronics business calls a closed ecosystem (e.g., Apple products), where hardware and software options meet standards set by a central authority. With the advent of distributed energy, notably solar PV on home and commercial rooftops, and of demand response options, end users are becoming generators of electricity and, by withholding demand, can help grids respond to loads. This changes the business relationship between “the edge” of electrical networks and their operators, something utilities and RTOs/ISOs have only come to appreciate over several years, some more slowly than others.

Part of the problem with these attitudes is that they leak out to a mostly uneducated public and even to policymakers. The former head of EOER in Massachusetts, Matthew Beaton, when directly questioned, refused to address why Massachusetts couldn’t be more like Texas, appealing instead to the silly aphorism that renewables can’t be relied upon when “the sun don’t shine and the wind don’t blow”. But that is a gross oversimplification, grounded in expectations set by years of operating conventional fossil fuel plants in the late 20th century.

The key fact about renewables is that they have zero marginal cost: once constructed, they deliver energy for two to three decades with few additional financial inputs, governed solely by the circumstances of sun and wind. They are also much less expensive to build than traditional fossil fuel plants. They do require more space, and they work best if scattered over a wide area, although exploiting rich, high-wind areas like the offshore New England coast offers advantages of its own.
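A rough back-of-envelope calculation shows what zero marginal cost implies: the lifetime cost of each kilowatt-hour is dominated by the upfront capital, not by operation. All the figures below are my own illustrative assumptions, not data from any particular project.

```python
# Illustrative sketch: with near-zero marginal cost, a solar array's
# lifetime cost per kWh is set almost entirely by upfront capital.
# Every number here is an assumption for illustration only.

capex_per_kw = 1000.0    # assumed installed cost, $ per kW of nameplate
opex_per_kw_yr = 15.0    # assumed fixed operations & maintenance, $/kW/yr
capacity_factor = 0.20   # assumed average output as fraction of nameplate
lifetime_years = 25      # assumed operating life

# Lifetime energy delivered per kW of nameplate (8760 hours per year).
kwh_per_kw = capacity_factor * 8760 * lifetime_years

# Undiscounted lifetime cost per kW, then per kWh.
total_cost_per_kw = capex_per_kw + opex_per_kw_yr * lifetime_years
cost_per_kwh = total_cost_per_kw / kwh_per_kw

print(f"Lifetime energy per kW: {kwh_per_kw:,.0f} kWh")
print(f"Undiscounted cost per kWh: ${cost_per_kwh:.3f}")
```

Under these assumed numbers the undiscounted cost works out to about three cents per kilowatt-hour, with the fuel term absent entirely; a proper levelized cost would add discounting, but the structure of the result is the same.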

There is an idea that a region, a state, a town, a home, or a business has a certain amount of energy “it needs”. That’s true as far as it goes, but to the degree demand can be adjusted or shaped, in time and by choice, those needs are not inflexible. It is also true that not all consumers of electrical energy need the same quality of electrical power, although, as far as I know, there are presently no plans to relax that standard, possibly because the economy it would offer trades against a greater headache in managing different mixes of electrical power quality.

One way to deal with capacity factors, where a resource isn’t always available, is to massively overbuild the resource: building 3x to 8x the capacity typically expected based upon such “needs”. This is possible only because renewables are inexpensive to build, and can be built rapidly. These costs, as I’ve recently observed, and as others like Professor Tony Seba and Haegel, Atwater, and colleagues have documented, are dramatically decreasing.
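A toy model shows what overbuild buys. Here I invent an availability profile, a bare daily sine cycle clipped at zero overnight, which is not real wind or solar data, and count the hours in a year when a multiplied fleet still falls short of a flat demand:

```python
import math

HOURS = 8760   # hours in a (non-leap) year
demand = 1.0   # flat demand, arbitrary units

def availability(h):
    # Assumed output profile in [0, 1]: a simple daily sine cycle,
    # clipped at zero overnight. Invented purely for illustration.
    return max(0.0, math.sin(2 * math.pi * h / 24))

short_hours = {}
for overbuild in (1, 2, 4, 8):
    # Count hours where even the overbuilt fleet misses the flat demand.
    short_hours[overbuild] = sum(
        1 for h in range(HOURS) if overbuild * availability(h) < demand
    )
    print(f"{overbuild}x overbuild: short {short_hours[overbuild]} h "
          f"({100 * short_hours[overbuild] / HOURS:.0f}% of the year)")
```

Even in this crude sketch, going from 1x to 4x collapses the shortfall dramatically, and the hours that remain are the zero-output hours which no amount of overbuild at a single site can fix. That is precisely why the other ingredients above, scattering resources over a wide area and shaping demand, matter alongside overbuild.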

But Professors Richard Perez and Karl Rabago have underscored the idea of overbuild as a new principle of operation. The idea itself is not new; it is engineering common sense. It is part of the mix which Professor Mark Jacobson and colleagues proposed, and overprovisioning has long been a tactic used in the design of supercomputers.

In fact, when I worked with installers to buy a PV system for our home, I ran into this kind of counterproductive thinking. Installers wanted to maximize utilization of the PV panels across a year and, when picking the number of panels, chose the number which maximized that measure. I, on the other hand, wanted to generate enough Watt-hours across the year to offset our entire electricity demand. Our site is partly shaded by trees, so the panels were never going to be uniformly illuminated. I finally found a terrific installer, RevoluSun, who understood and wanted to do exactly what I had in mind.

I ran into the same kind of thinking when I worked at IBM Federal Systems in upstate New York. Managers had a hard time understanding that if the objective was to finish a calculation as quickly as possible, there were times when only a portion of the computers in a multicomputer could be used, even though all of them were needed for other parts of the calculation. They wanted to pick a number which maximized utilization, at the price of slower computation. These were otherwise sharp people, but they were trapped in a certain way of thinking.
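The two sizing objectives in the PV anecdote can be made concrete. The per-panel yields and the 90%-of-best rule below are invented stand-ins for the installers’ utilization objective, not anyone’s actual figures:

```python
# Invented numbers: annual kWh from each successive panel location,
# sunniest spots first; later panels sit in progressively deeper shade.
yields = [400, 390, 380, 350, 300, 250, 220, 200, 180, 160, 150, 140]
annual_demand_kwh = 3000  # assumed household usage to offset

# Sizing rule 1 (the installers', as I understood it): use only spots
# yielding at least 90% of the best one, maximizing per-panel utilization.
n_util = sum(1 for y in yields if y >= 0.9 * max(yields))

# Sizing rule 2 (mine): keep adding panels, shaded or not, until the
# cumulative annual yield covers the whole demand.
cum, n_demand = 0.0, 0
for i, y in enumerate(yields, start=1):
    cum += y
    if cum >= annual_demand_kwh:
        n_demand = i
        break

print(f"Utilization-maximizing array: {n_util} panels, "
      f"{sum(yields[:n_util])} kWh/yr")
print(f"Demand-offsetting array: {n_demand} panels, "
      f"{sum(yields[:n_demand])} kWh/yr")
```

With these made-up numbers the utilization rule stops at a handful of panels that cover barely a third of the demand, while the offset rule fills the shadier spots too. Per-panel utilization falls, but the actual objective, zeroing out the annual bill, is met.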

About ecoquant

See https://wordpress.com/view/667-per-cm.net/ Retired data scientist and statistician. Now working projects in quantitative ecology and, specifically, phenology of Bryophyta and technical methods for their study.
