

Starting small can lead to big wins for utility data analytics, utility panel advises

Image credit: Wikipedia
DW Keefer, Independent Journalist and Analyst

DW Keefer is a Denver-based energy journalist who writes extensively for national and international publications on all forms of electric power generation, utility regulation, business models...

Oct 21, 2020

Faced with greater risk from events such as fires, floods and intense storms, electric power utilities are making greater use of analytical tools to enhance reliability and keep the lights on for customers.

A key component is a digital twin: a digital representation of a utility’s physical assets. It enables utilities to leverage their enormous data resources to streamline customer requests, analyze capital investment options and break down the silos that traditionally have isolated functional units such as planning, customer service, generation, and transmission.

Digital twins and the growing use of technology tools were the subject of a media day hosted by Bentley Systems. A panel of experts took part in a session entitled, “Digital Twins: Why Every Utility Should Take Note.”

During the session, Stephen Cooper, CMRP and EAM Implementation head at Gainesville Regional Utilities in Florida, explained how operations professionals compare vegetation maps with outage data to pinpoint potential problem spots. Then, by dispatching aerial drones, maintenance teams can rapidly inspect distribution and transmission lines for possible vegetation encroachment. Not only is regular tree-trimming better focused as a result, but, in the event of a severe storm or hurricane, repair teams can be pre-positioned to restore service quickly.
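The overlay approach Cooper describes can be sketched in a few lines: combine a vegetation measure with historical outage counts per line segment and rank the results for inspection. This is a minimal, hypothetical illustration; the segment IDs, field names, and scoring weights are invented for the example, not Gainesville Regional Utilities' actual model.

```python
# Hypothetical sketch: ranking line segments for drone inspection by
# overlaying vegetation density with historical outage counts.

def rank_hotspots(segments, veg_weight=0.6, outage_weight=0.4):
    """Return segments sorted by a combined vegetation/outage risk score."""
    def score(seg):
        return (veg_weight * seg["veg_density"]        # 0..1 canopy encroachment
                + outage_weight * seg["outages"] / 10)  # roughly normalized count
    return sorted(segments, key=score, reverse=True)

segments = [
    {"id": "T-101", "veg_density": 0.9, "outages": 4},
    {"id": "D-220", "veg_density": 0.2, "outages": 1},
    {"id": "D-317", "veg_density": 0.8, "outages": 7},
]

for seg in rank_hotspots(segments):
    print(seg["id"])  # inspection priority order: D-317, T-101, D-220
```

A real deployment would draw the vegetation measure from LiDAR or imagery layers in the GIS and the outage counts from the outage management system, but the ranking idea is the same.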

“Utilities like customer and load stability,” Cooper said. Digital tools deployed by the utility in recent years better enable teams to analyze data related to operations, asset health and—with the growth in customer-owned distributed generating resources—power flows across the system.

At the same time, long-standing data silos within the utility are being broken down. This is enabling the organization to move toward the goal of a utility-wide data network, he said. The analytical intelligence enables the utility to provide better customer service by offering alerts before a possible service disruption occurs.

Scott Alford, Predictive Maintenance Supervisor and CRL Trainer at Arizona Public Service, outlined how digital tools have changed the way that the utility manages its transmission and distribution inspection program. When he started more than two decades ago, Alford said that the inspection team amounted to himself, a digital camera and a Chevy Blazer. Today, the team has expanded to some three dozen members who use drones to patrol distribution lines and robots to inspect confined and hazardous spaces.

With wildfires a concern across the West, Alford said that Arizona Public Service is increasing its use of drones and artificial intelligence to inspect its transmission network. By early next year, the utility expects to have received permission to operate the drones beyond visual line of sight, enabling autonomous drone inspections.

“Fire mitigation is a big deal,” Alford said.

It’s also one of at least four dimensions where utilities can apply analytics to improve operations, commented Martin Runge, who heads Innovation, Portfolio and Partnerships at Siemens Digital Grid. During the roundtable session he listed those dimensions as planning, building, operations and maintenance. As analytical tools are deployed across each dimension, workflows can be streamlined and coordination across the enterprise can be enhanced.

But because utilities have so much data, it can be easy to move too fast as analytical tools gain wider adoption, said Vonnie Smith, vice president of Energy Infrastructure at Bentley Systems. Data needs to be maintained, too, if it is to remain up-to-date and accurate.

“Utilities have massive amounts of data,” some of which may be incomplete or misaligned, she said during the Media Day briefing. A helpful strategy is to focus on achieving “small wins” so that work and pertinent data can be adequately supported.

As one illustration, Alford said a recent train derailment in Tempe, Arizona, raised fears that hazardous material had leaked into an underground vault containing a 230 kV line. With human access to the area restricted during the cleanup, the utility sent its robot into the vault to collect data to help response teams assess the possible threat to the line.

The deployment represented a simple, specific application of technology combined with data analytics that provided decision-making insights that otherwise might not have been possible.

“The robot’s ROI is already on its way,” Alford said.

Smith emphasized that in addition to identifying small wins, a successful digital twin deployment also depends on an “open, scalable cloud platform” that is able to support data acquisition from the multiple technology vendors that may be found at a typical utility. Ideas such as open software tools and a design that enables both artificial intelligence and machine learning are key to deploying a digital twin platform that is able to meet the demands of an evolving electric utility.


These speakers appeared in the 'TwinTalks: Energy Utilities' session at the 2020 Bentley Year in Infrastructure Conference on October 20, 2020. The event featured an interview with Scott Alford of Arizona Public Service and a panel with Martin Runge of Siemens Digital Grid, Stephen Cooper of Gainesville Regional Utilities, and Vonnie Smith of Bentley Systems. More information on the session and the Bentley event as a whole can be found here.

Mark Damm on Oct 23, 2020

Thanks David,


I recently shared some insights about ‘starting small’ for utilities working towards advanced analytics. The train derailment example is a great example of using data to assist decision making.


Regardless of where they are on their digital journey, most utilities will have a repository of assets, and some field or sensor data. The first step I recommend to utilities embarking on a digital / analytics journey is to augment their existing EAM system with a data mart. This includes ‘cloudify-ing’ the work order system and making multiple data sources available to it, such as mapping or performance information. This is a small win that can set a utility on a path towards more advanced performance analysis.
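Mark's suggestion amounts to putting work orders, asset locations, and sensor data behind one queryable store so they can be joined in a single query. Here is a minimal sketch of that idea using an in-memory SQLite database; the table names, columns, and sample values are all hypothetical, standing in for the EAM, GIS, and sensor feeds he mentions.

```python
# Illustrative sketch of the 'data mart' idea: EAM work orders joined with
# asset location and performance data in one queryable store. All schema
# and sample values are invented for the example.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE work_orders(asset_id TEXT, status TEXT);
CREATE TABLE assets(asset_id TEXT, feeder TEXT, lat REAL, lon REAL);
CREATE TABLE sensor_readings(asset_id TEXT, load_pct REAL);
INSERT INTO work_orders VALUES ('XFMR-12', 'open');
INSERT INTO assets VALUES ('XFMR-12', 'FDR-7', 33.42, -111.94);
INSERT INTO sensor_readings VALUES ('XFMR-12', 92.5);
""")

# One joined view: open work orders enriched with location and loading.
rows = con.execute("""
    SELECT w.asset_id, a.feeder, s.load_pct
    FROM work_orders w
    JOIN assets a ON a.asset_id = w.asset_id
    JOIN sensor_readings s ON s.asset_id = w.asset_id
    WHERE w.status = 'open'
""").fetchall()
print(rows)  # [('XFMR-12', 'FDR-7', 92.5)]
```

In practice the store would be a cloud data mart fed by the work order system and other sources, but the payoff is the same: one query instead of three disconnected systems.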


Curious if there were any other small steps to share?

Matt Chester on Oct 23, 2020

Here's Mark's recent post on those starting small steps towards advanced analytics, for anyone curious to read further:
