Covid Impacts on PJM Demand

Like everyone else, PJM has had to react to the impacts of Covid-19 on its operations. Its Systems Operations Subcommittee has created an Operations Pandemic Coordination Team to coordinate the pandemic response, and every Friday PJM provides status updates for each state and for other members/stakeholders.

Estimated Impact on Daily Peak and Energy (source: PJM Planning Committee)

The PJM Planning Committee has also been publishing weekly updates on Covid-19 impacts to load. The graph above is taken from the most recent presentation (April 14, 2020). Here are the findings:

  • On weekdays last week (week of 4/6), peak came in on average 8-9% lower (~7500 MW) than anticipated.

  • Largest impacts so far were around 10-11% (~9500 MW) on 3/26.

  • Energy has been less affected: the average weekday reduction since mid-March is about 7%.

  • Weekends have been impacted less (~2-4%); a quick sanity check of these numbers follows below.
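
To put those percentages and megawatts in context, here is a rough back-of-envelope sketch (Python) that backs out the implied forecast baseline from the figures above. The baseline itself isn't published in the update, so treat this only as a consistency check, not PJM's actual forecast:

```python
# Back-of-envelope check of the PJM figures above: an ~8-9% weekday peak
# reduction that equals ~7,500 MW implies a forecast baseline peak in the
# low-80s to mid-90s GW range.

reduction_mw = 7_500          # reported average weekday peak reduction, MW
reduction_pct = (0.08, 0.09)  # reported 8-9% range

implied_baseline_mw = [reduction_mw / p for p in reduction_pct]
print(f"Implied forecast peak: {implied_baseline_mw[1]:,.0f} - {implied_baseline_mw[0]:,.0f} MW")
# -> roughly 83,000 - 94,000 MW
```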

Take-aways: Obviously the change in work patterns driven by stay-at-home/shelter-in-place orders has impacted demand, but how exactly? What about the huge increase in unemployment? If the 8% reduction in peak is coincident load, does that mean the energy footprint of our office employees while at work is higher than when they work from home? And the weekend load: is that all retail closures? With so many variables, it is difficult to say at this point what the root cause is.

Is Weather Becoming More Extreme? Let's look at the data...

There’s been a lot of talk on various media outlets about “extreme” weather events, their ferocity and frequency, and how this is the “new normal”. And of course these days you can’t talk about weather without also talking about climate change. Regardless of whether you’re a believer or a skeptic, I wanted to see what the data has to say about extreme weather: on average, is our (national) weather becoming more extreme, less extreme, or staying about the same?

Fortunately, our government has a comprehensive website detailing extreme weather events dating back to 1910. The National Oceanic and Atmospheric Administration (NOAA) publishes all sorts of great data on weather events via the National Centers for Environmental Information (formerly the National Climatic Data Center).

Courtesy of NOAA

In the graph above, the data compiled for each year (red columns) is “based on an aggregate set of conventional climate extreme indicators which include monthly maximum and minimum temperature, daily precipitation, monthly Palmer Drought Severity Index (PDSI), and landfalling tropical storm and hurricane wind velocity.” Additional background information on their methodology and data can be found here.

If we follow the 9-pt binomial filter (a recognized statistical smoothing technique), it’s apparent that extreme weather events have increased steadily since 1970 and peaked within the last 10 years. We also notice that extreme weather events between 1910 and 1970 gradually decreased by 5-8 percentage points. For this post we’re not examining the root cause, only the resulting data, so how or why the events trend downward and then shoot up is a topic for another time. In any case, there is a clear spike in events starting in the mid-1990s.
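
For the curious, here is a minimal sketch of what a 9-point binomial filter actually does: the weights come from the eighth row of Pascal's triangle, normalized to sum to one, and are convolved with the annual series. The annual values below are placeholder numbers, not NOAA's actual index data:

```python
import numpy as np

# 9-point binomial weights: row 8 of Pascal's triangle, normalized (sum = 256)
weights = np.array([1, 8, 28, 56, 70, 56, 28, 8, 1], dtype=float)
weights /= weights.sum()

# Placeholder annual extreme-weather index values (percent), 1910 onward
rng = np.random.default_rng(0)
annual_index = rng.uniform(10, 40, size=110)

# Smooth the series; "valid" mode drops 4 points at each end of the record
smoothed = np.convolve(annual_index, weights, mode="valid")
print(len(annual_index), len(smoothed))  # 110 102
```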

So why does this matter? Increasing extreme weather events matter for various reasons, the most obvious of which is the cost to rebuild/reconstruct, a majority of which is paid by taxpayers and via insurance premiums (the greater the risk, the more we all pay). Other consequences are the loss of economic activity when priorities are shifted to reconstruction, the costs to reinforce existing infrastructure in preparation for future events, and an increase in uncertainty for strategic planners who rely on steady data to manage risk.

Oh, and if you were curious how much these extreme events are costing us, NOAA was kind enough to graph that as well. 2011, 2017, and 2016 were the most expensive years ever recorded, and 2018 has already exceeded the fourth most expensive year with three months left to go…

Courtesy of NOAA

Distributed Energy Resource (DER) Siting

There's an interesting article in the new SolarPro about how states (California in particular) are pushing utilities to use their capacity and capital planning information to optimize the siting of PV and storage projects. As many of us know, the interconnection process for large commercial and utility projects can be a game of chance. What the line capacity is (and therefore how big the system can be) and whether upgrades are required are determined only after a lengthy review. The initial responses generally include hefty cost estimates to proceed with what amounts to a nameplate size significantly smaller than what was proposed.

LNBA Demo B heat map (CPUC)

The CPUC (California Public Utilities Commission) created two working groups to address the issue: the Integration Capacity Analysis (ICA) and the Locational Net Benefits Analysis (LNBA).  Above is the heat map from the LNBA demo.  Per the CPUC, "the goal is to ensure DERs are deployed at optimal locations, times, and quantities so that their benefits to the grid are maximized and utility customer costs are reduced."

Why is this important?  Consider the $2.6 billion in planned transmission project upgrades in California that were recently revised down to account for higher forecasts of PV and energy efficiency projects.  And besides the avoided costs, don't forget all the grid upgrades DER developers are paying for that benefit all consumers.  This is a key factor in the debate over whether PV owners pay their fair share, or rather whether non-DER owners are subsidizing DER projects.  With the federal investment tax credit stepping down to 26% in 2020, 22% in 2021, and 10% in 2022, the models generated in the CPUC exercise can be used to reduce development costs, defer distribution and transmission capital improvements, and lay the groundwork for incentivizing grid-beneficial siting.
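
To make the idea concrete, here is a hypothetical sketch (not the CPUC's actual ICA/LNBA tooling or data) of how a developer might combine hosting-capacity and locational-benefit information to rank candidate interconnection points. All node names, capacities, and dollar figures below are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class CandidateNode:
    name: str
    hosting_capacity_mw: float   # ICA-style limit: max DER the feeder can host without upgrades
    net_benefit_per_mw: float    # LNBA-style locational avoided cost, $/MW-yr (illustrative)

def rank_sites(nodes, proposed_mw):
    """Keep nodes that can host the full proposed size, then sort by locational net benefit."""
    feasible = [n for n in nodes if n.hosting_capacity_mw >= proposed_mw]
    return sorted(feasible, key=lambda n: n.net_benefit_per_mw, reverse=True)

candidates = [
    CandidateNode("Feeder A", hosting_capacity_mw=4.0, net_benefit_per_mw=18_000),
    CandidateNode("Feeder B", hosting_capacity_mw=9.5, net_benefit_per_mw=32_000),
    CandidateNode("Feeder C", hosting_capacity_mw=7.0, net_benefit_per_mw=25_000),
]

# A 5 MW project would be steered away from Feeder A (too small) and toward
# Feeder B, where the modeled grid benefit is highest.
for node in rank_sites(candidates, proposed_mw=5.0):
    print(node.name, node.hosting_capacity_mw, node.net_benefit_per_mw)
```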