Last year around this time, in my article California Wildfires: Undefeated Mother Nature vs. Human (and artificial) Intelligence, I wrote about the challenges of using current technology for wildfire prevention and discussed the knowledge gap between electric utilities and wildfire mitigation technology.
Since then, California has experienced its largest-ever wildfire complex (caused by lightning), along with other large wildfires allegedly caused by various utilities around the state.
In other words, just another year in California.
The State of California has created the Wildfire Safety Division and the Wildfire Safety Advisory Board to oversee California utilities’ efforts to reduce the risk of wildfire ignition in the state. Furthermore, EPIC (Electric Program Investment Charge), the research, development, and deployment arm of the California Public Utilities Commission (CPUC), has focused more of its efforts on technology geared toward helping utilities mitigate wildfire ignitions.
While some promising technology has been evaluated through EPIC, I still see two major trends happening:
- Most technology companies are still offering incomplete preventative solutions.
- Utilities are so inundated with technology offerings that progress is being slowed by their own internal challenges in quickly evaluating and deploying new technology.
If we believe that wildfire ignitions can be reduced or prevented using technology, we must do something to make technology solutions more complete AND utilities must deploy these solutions more quickly.
Obviously, for a regulated utility, this is easier said than done. For tech companies, however, the opportunity is wide open. Unfortunately, I see many tech companies waiting on client data to make their solutions better. This is not going to work when it comes to wildfires because, aside from wildfire modeling, utilities do not have much of the data that technology vendors need. There are, in my opinion, some backdoors to a better solution for any tech company willing to invest the time and money.
Proactive Data Collection
While data collection is slowly becoming more programmatic among California utilities, it is mostly relegated to weather and environmental condition data (e.g., fuel moisture content). But weather and fuel are only two components of the wildfire issue, and focusing data collection on them alone is not enough to solve the problem. Utility tech vendors, on the other hand, are very reluctant to proactively collect data and instead want the utility to pay for the data collection.
Based on my experience working at a utility, tech vendors NEED to start investing in collecting their own data and bring more complete solutions to the client. The autonomous vehicle industry proactively collects data so that it can figure out how to teach cars to drive unassisted. Utility tech vendors should take the same approach and proactively collect data so that a solution to the wildfire problem might be discovered.
Understanding the Problem Better: A Case for Proactive Data Collection
There are two main protagonists in California’s wildfire story: Mother Nature and utilities. The public, and the State of California, blame utilities for causing some wildfires, while the utilities blame climate change. However, focusing on who or what to blame does not help us understand WHAT is happening and WHY.
With utility-related wildfire ignitions, one of the alleged causes is vegetation contacting a power line. Trees contact power lines every year…this is nothing new. What is new, however, is that a large fire now seems more likely than ever to result when a tree contacts a power line. Why is this? What is happening now that is different from the past?
To offer one scenario explaining part of the “why”, let us look at the past two decades of California rainfall, or rather, the lack thereof.
Figure 1 below shows that since 2000, California has experienced three major drought episodes (we are likely experiencing a fourth currently). While droughts are common, the past two decades show that much of California has been in some drought condition more often than not.
Figure 1. Percentage of Area in California Experiencing Drought Conditions 2000-2020
From 2012 to 2016, California purportedly experienced the worst drought in 1,200 years. If we look at the three years from Jan. 2013 to Dec. 2015 in Figure 2, almost all of California was in Severe, Extreme, or Exceptional drought conditions during that time.
Figure 2. Percentage of Area in California Experiencing Drought Conditions 2013-2015
Given the recent drought trends, trees have been increasingly stressed since 2000. To make matters worse, the proverbial nail in the coffin for many trees was the unprecedented drought from 2012-2016.
An estimated 130 million trees (and counting) have died in California forests in the past decade…this is well documented (see Figure 3 below). In fact, California utilities are still using the state’s special accounting mechanism, the Catastrophic Event Memorandum Account (CEMA), to recover the incremental cost of mitigating the extraordinary influx of dead and dying trees that pose a risk to power line infrastructure. If we look at California’s Tree Mortality Viewer, we can see how tree mortality increased throughout the state from 2012 to 2018.
Figure 3. Tree Mortality per Acre
For an electric utility, such macro-level information is useful for reacting to a problem like tree mortality, but it does not provide much help in proactively mitigating it. To approach the problem proactively, more granular data is needed.
Publicly available tree failure data reveals some interesting patterns and helps make the case for a technology company or utility to proactively collect data.
For example, Figure 4 shows that from 2004 to 2017, Species A was clearly the most problematic, failing at twice the rate of Species C.
Figure 4. Tree Failures between 2004-2017
However, Figure 4 does not tell the whole story. It only tells us that Species A has the highest number of failures. If we double-click on the types of failures, we can see that Species A is more prone to branch failure than to trunk or root failure.
Figure 5. Failure Type of Species A
If we look at the data a different way, Figure 6 shows that the trunk diameter of failed trees (which can be a proxy for age) provides some insight into the size of trees that are most problematic.
Figure 6. Annual Tree Failure by Trunk Diameter
And if we isolate Species A and filter the data by failure type, we can see there is a slight difference in the size of trees that shed branches versus those that fail at the trunk or roots.
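To make this kind of slicing concrete, below is a minimal sketch of how such an analysis might look in Python with pandas. It assumes a hypothetical CSV of individual failure records with species, failure_type, trunk_diameter_in, and year columns; the file name and column names are illustrative, not the actual dataset behind Figures 4-6.

```python
# Minimal sketch of the tree-failure slicing described above.
# Assumes a hypothetical CSV of failure records with columns:
#   species, failure_type ("branch", "trunk", "root"), trunk_diameter_in, year
# Illustrative only; not the actual dataset behind Figures 4-6.
import pandas as pd

failures = pd.read_csv("tree_failures_2004_2017.csv")  # hypothetical file

# Figure 4 analogue: total failures per species
by_species = failures.groupby("species").size().sort_values(ascending=False)

# Figure 5 analogue: failure-type breakdown for the most problematic species
species_a = by_species.index[0]
type_breakdown = (
    failures[failures["species"] == species_a]
    .groupby("failure_type")
    .size()
)

# Figure 6 analogue: failures binned by trunk diameter (a rough proxy for age)
diameter_bins = pd.cut(failures["trunk_diameter_in"], bins=[0, 12, 24, 36, 48, 100])
by_diameter = failures.groupby(diameter_bins, observed=True).size()

print(by_species, type_breakdown, by_diameter, sep="\n\n")
```

The point is not the code itself but that this level of granularity (species, failure mode, size) is only possible when someone has proactively collected record-level data in the first place.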
Putting it All Together
While not overly compelling, the information above provides examples of some of the insights that can be gleaned when one has the appropriate data. More importantly, granular information of this type could help technology vendors build out more useful solutions and, thus, help utilities immediately take a more proactive approach to vegetation risk reduction.
That said, data collection alone is not enough. Technology companies need to understand how to create a useful solution for a utility, and to accomplish that, they need utility subject matter experts who understand both the technology and the utility industry. And remember: do not hire these experts for a sales role. They are experts. Hire them to help build an expert product.
One superpower of humans is our ability to unite around a common problem. Wildfire is our generation's problem. Will utilities and technology vendors find a better way to unite to combat wildfires? Or will we continue on the current path of minimal progress?