The mission of this group is to bring together utility professionals in the power industry who are in the thick of the digital utility transformation. 


Kx Insights: Benefits of Utility Predictive Maintenance Analytics

Przemek Tomczak, SVP IoT and Utilities, Kx

I am a CPA, CA with over 20 years of business and technology experience in software, consulting, IT management, and infrastructure with leaders such as Kx Systems, First Derivatives (FD), Ontario’s...

Aug 31, 2018

Utilities are going through significant modernization, with the adoption of smart grid technologies such as advanced metering, advanced distribution management, outage management, customer engagement and analytics. This modernization is creating a wealth of diverse data about assets, operations, and customers.

At the same time, the job of utilities is becoming more challenging, with pressure to reduce costs, competition from new technologies and energy providers, and the need to integrate renewable energy resources. These challenges and pressures are driving innovation and transformation in the utilities industry.

Navigant has estimated that cumulative utility spending on asset management and condition monitoring systems for the power grid will total $49.2 billion during the ten-year period ending in 2023. These investments in preventative repairs are required to forestall the higher costs associated with letting assets run to failure.

How Utilities Benefit

Recognizing the need for industry-wide standards for asset management programs, the international standard ISO 55000 for asset management and the Publicly Available Specification (PAS) 55 were developed to guide the optimal management of physical assets. These are valuable resources for establishing a data-driven asset management program: they promote generally agreed-upon best practices for a data- and risk-based approach across the industry.

For example, one energy provider was able to predict failures of equipment weeks in advance with over 98% confidence by analyzing sensor data from equipment including temperature, vibration and sound, together with maintenance records and equipment manufacturer specifications. These types of data and analytics programs enable utilities to have better visibility of their entire system’s assets, and to incorporate this information into their predictive maintenance program and operations going forward.
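A toy sketch of that idea is shown below. The sensor names, spec limits, and scoring rule are all assumptions for illustration; the actual model behind the 98% figure is not described in the article and would be trained on far richer data, including maintenance records.

```python
from statistics import mean

# Hypothetical manufacturer specification limits (assumed values).
SPEC_LIMITS = {"temperature_c": 80.0, "vibration_mm_s": 7.1}

def failure_risk(readings, limits=SPEC_LIMITS, window=5):
    """Score 0..1: fraction of sensors whose recent average nears its spec limit."""
    flags = 0
    for sensor, limit in limits.items():
        recent = readings[sensor][-window:]
        if mean(recent) > 0.9 * limit:  # within 10% of the spec limit
            flags += 1
    return flags / len(limits)

healthy = {"temperature_c": [60, 61, 59, 60, 62],
           "vibration_mm_s": [3.0, 3.2, 3.1, 3.0, 3.3]}
degrading = {"temperature_c": [75, 78, 80, 82, 85],
             "vibration_mm_s": [6.5, 6.8, 7.0, 7.2, 7.4]}

print(failure_risk(healthy))    # → 0.0
print(failure_risk(degrading))  # → 1.0
```

A real predictive model would replace the fixed 10% margin with probabilities learned from historical failures and maintenance outcomes.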

Some examples of the use of data and analytics for improving operations and assets include:

  • Assessing the probability and consequences of asset failures by collating asset health and network model data with information on outages and previous failures.
  • Improving the accuracy of short- to long-term forecasts of demand by developing customer specific load profiles which incorporate disaggregated consumption information, weather, demographic and firmographic data.
  • Identifying risks to safety, such as energized wires, by correlating data from different sources in and around an outage.
  • Identifying anomalies and predicting events, failures or failure modes, to achieve proactive and prioritized maintenance and repair activity.
  • Identifying the root cause of failures or issues on systems and then undertaking targeted repairs.
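The first bullet, weighing failure probability against consequence, can be sketched with hypothetical asset records. The field names, weights, and scores here are illustrative assumptions, not a standard formula:

```python
# Hypothetical asset records: health index (1.0 = new), failure history,
# and customers served as a simple proxy for failure consequence.
assets = [
    {"id": "TX-101", "health": 0.35, "past_failures": 2, "customers": 1200},
    {"id": "TX-102", "health": 0.90, "past_failures": 0, "customers": 4000},
    {"id": "TX-103", "health": 0.55, "past_failures": 1, "customers": 300},
]

def risk_score(a):
    # Lower health and more past failures raise the failure-probability proxy;
    # multiplying by customers served weights in the consequence of failure.
    prob = (1 - a["health"]) * (1 + 0.5 * a["past_failures"])
    return prob * a["customers"]

ranked = sorted(assets, key=risk_score, reverse=True)
print([a["id"] for a in ranked])  # → ['TX-101', 'TX-102', 'TX-103']
```

The point of the ranking is prioritization: the heavily degraded transformer serving many customers rises to the top of the maintenance queue even though a healthier asset serves more customers.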

Realizing these benefits depends on deciding which assets are important, what needs to be optimized, what needs to be measured, and which analyses to perform. All of these, in turn, depend on having the appropriate data.

Fortunately, utilities have many data sources: SCADA, power-line sensors, GIS, outage management systems, and smart meters. When the relevant data is integrated and collated into meaningful formats and time intervals, operators and asset managers can use it to synchronize predictive maintenance planning across many systems in near real time.
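Collating sources onto a common time interval can be illustrated with a small sketch. The source names and sampling rates below are hypothetical; the idea is simply to bucket readings arriving at different frequencies onto shared intervals before joining them:

```python
from collections import defaultdict

# Hypothetical feeds: SCADA values every 10 s, smart meter values every 60 s.
scada = [(t, 100 + t % 7) for t in range(0, 180, 10)]   # (seconds, value)
meters = [(t, 5.0 + t / 120) for t in range(0, 180, 60)]

def bucket(readings, interval=60):
    """Average each feed's readings over common fixed-width time buckets."""
    buckets = defaultdict(list)
    for t, v in readings:
        buckets[t - t % interval].append(v)
    return {t: sum(vs) / len(vs) for t, vs in buckets.items()}

s, m = bucket(scada), bucket(meters)
aligned = {t: (s[t], m[t]) for t in m}   # joined on the shared interval
print(sorted(aligned))  # → [0, 60, 120]
```

Time-series databases do this alignment natively (e.g. as-of or interval joins), but the bucketing step is the essence of making mixed-rate feeds comparable.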

Utilities Adapting to Big Data

As utilities integrate new data streams, increased data volumes may strain existing operational and IT systems that were not designed for high volumes and frequencies of data. For example, systems set up for smart meter readings taken once every few minutes to once per hour will need to accommodate synchrophasors and power-line sensors that generate measurements many times per second.
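The scale difference is easy to quantify. Assuming, for illustration, a meter read once per hour and a synchrophasor (PMU) reporting 30 times per second:

```python
# Back-of-envelope data-volume comparison (assumed rates).
meter_per_day = 24                    # one reading per hour
pmu_per_day = 30 * 60 * 60 * 24       # 30 samples per second
print(pmu_per_day)                    # → 2592000
print(pmu_per_day // meter_per_day)   # → 108000
```

A single PMU under these assumptions produces on the order of a hundred thousand times more readings per day than an hourly meter, which is why systems sized for metering data struggle with synchrophasor streams.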

The combined streaming, real-time, and historical data analytics capabilities of technologies like Kx are helping companies accelerate their use of diagnostic and predictive analytics on extremely large datasets. They do this by supporting the increasing velocity and volume of data and providing a consistent, integrated view of operations, while also supporting real-time anomaly detection and decision making.
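As a minimal sketch of real-time anomaly detection over a stream (not Kx's actual implementation), a rolling z-score flags readings that deviate sharply from recent history:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, threshold=3.0):
    """Return indices of readings more than `threshold` stdevs from the
    rolling mean of the previous `window` readings."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(stream):
        if len(recent) == window:
            mu, sd = mean(recent), stdev(recent)
            if sd > 0 and abs(x - mu) / sd > threshold:
                anomalies.append(i)
        recent.append(x)
    return anomalies

# Steady signal with one injected fault.
stream = [10.0 + 0.1 * (i % 5) for i in range(50)]
stream[40] = 25.0
print(detect_anomalies(stream))  # → [40]
```

A production system would run this logic continuously over in-memory tables at far higher volume, but the principle of comparing each new reading against a rolling baseline is the same.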

Kx augments existing data collection, data historian, and asset management systems. Kx also helps companies travel backwards in time to replay and investigate historical events and conditions, and it enables machine learning algorithms to make better predictions of events and system conditions.
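The replay idea can be illustrated with a hypothetical in-memory event log; a real historian would query time windows over persisted tick data, but the core operation is slicing the window around an incident and re-emitting events in order:

```python
from bisect import bisect_left, bisect_right

# Hypothetical event log: (timestamp_seconds, payload), kept sorted by time.
events = [(t, f"reading-{t}") for t in range(0, 100, 5)]
times = [t for t, _ in events]

def replay(start, end):
    """Yield events with start <= timestamp <= end, in time order."""
    lo, hi = bisect_left(times, start), bisect_right(times, end)
    for t, payload in events[lo:hi]:
        yield t, payload

window = list(replay(30, 50))
print([t for t, _ in window])  # → [30, 35, 40, 45, 50]
```

Binary search on the sorted timestamps keeps the window lookup fast even for very large logs, which is what makes interactive "travel back in time" investigation practical.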

The utility business model is being challenged and transformed by new sources of data. With the cost of sensors dropping and the availability and adoption of Big Data technologies increasing, critical assets can now largely be monitored remotely, with Big Data predictive analytics guiding maintenance. The benefits for utilities are enhanced reliability and uptime, cost reduction, and improved safety.

Co-authored with Kate Jory, a Business Development Executive for Kx currently based in Seoul, South Korea. Her academic and business experience is in physics, marketing and sales.

Discussions
Bob Meinetz on Sep 3, 2018

Przemek and Kate, the latest focus on grid modernization and security is long overdue. Though some of it would be necessary anyway, that distributed generation is responsible for a significant part of added costs is undeniable.

Utility generation remains the most efficient way to make electricity available to everyone in society, and increasingly, a flat grid connection fee forces all grid customers to accept responsibility for grid modernization and security. Whether distributed generation customers are paying their fair share for maintaining the grid is debatable. Less-affluent customers, without their own solar arrays and storage, argue self-generators overestimate their environmental contribution. They argue they shouldn’t be penalized by high grid connection fees if they’re using less electricity, and they have a point.

Soon (hopefully) Big Data technologies will be able to track the other side of the equation - emissions - that increase as a result of the shift to distributed generation. If we’re freely assuming “distributed generation” and “cleaner generation” are synonymous, we shouldn’t be. It’s easy for society to clean up utility electricity, and track its emissions, as technology improves. It’s not so easy to track emissions of those who rely on tanks of propane to back up their solar arrays, or even generate all their electricity with natural gas generators.
