Expert Interview with Kevin Maroney of Keyrus on How a Flexible Data Solution Drives a Utility’s Modernization [an Energy Central Power Perspectives™ Interview]

Posted to Energy Central in the Utility Management Group
image credit: Energy Central
Staff Contributor, Energy Central

  • Member since 2020
  • 1,766 items added with 722,571 views
  • Mar 2, 2022

The fact that utilities must embrace a data-centric future is no longer a surprise to those in the industry, but that doesn’t mean the process of modernizing operations and analytics practices has gotten any easier. For utilities earlier in their data and digitalization journeys, the sheer enormity of the task can create inertia that keeps them from ever really diving in.

However, the power sector is evolving regardless of whether an individual utility is keeping pace. Customers have become more agile and digital in their demands, regulators are pushing for more grid modernization and clean energy technologies, and utility leaders must keep up with this changing landscape. One of the key ways they can do so is by embracing the new solutions becoming available every day, especially in the collection, analysis, and use of data.


Kevin Maroney is a Director at Keyrus, where he’s helped numerous utilities grow toward this data-filled future. Energy Central was fortunate to sit down with Kevin as part of our Power Perspectives interview series to pick his brain about this key area of rapid development:

Energy Central: Thanks for chatting with Energy Central today. Kevin, let’s start with the basics—as a Director at Keyrus, can you explain what role you’re playing in the modernization of utilities? Are you assisting the utilities directly to meet their goals?

Kevin Maroney: I work directly with our utility clients to build modern platforms and solutions that fulfill their current business needs and set them up for success in the future.

Utilities often have a lot of data across several different systems, which can complicate reporting processes and make it difficult to ensure that they have clean data. Our job is to deliver platforms that connect those different systems, such as SCADA, PI, Salesforce, Weather, and Oracle, and centralize that data into one system. Then, we craft and model that information together to ensure good data quality and easy-to-use analytics tools. Essentially, our aim is to empower the business to comfortably and reliably use this integrated data in their day-to-day operations.
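To make the centralization idea concrete, here is a minimal sketch of the pattern Kevin describes, using an in-memory SQLite database as a stand-in for a central data platform. The table names and columns are invented for illustration and are not actual SCADA or CIS schemas:

```python
import sqlite3

# Toy stand-ins for extracts pulled from two separate source systems,
# keyed on a shared meter identifier.
scada_rows = [("M1", 12.4), ("M2", 8.1), ("M3", 15.0)]
crm_rows = [("M1", "Alice"), ("M2", "Bob"), ("M3", "Cara")]

con = sqlite3.connect(":memory:")  # the "one central system"
con.execute("CREATE TABLE scada (meter_id TEXT, last_reading_kwh REAL)")
con.execute("CREATE TABLE crm (meter_id TEXT, customer TEXT)")
con.executemany("INSERT INTO scada VALUES (?, ?)", scada_rows)
con.executemany("INSERT INTO crm VALUES (?, ?)", crm_rows)

# Once the extracts land in one place, a single query answers what
# previously required logging into two systems.
central = con.execute(
    "SELECT s.meter_id, c.customer, s.last_reading_kwh "
    "FROM scada s JOIN crm c USING (meter_id) "
    "ORDER BY s.meter_id"
).fetchall()
print(central)
```

The real work in a utility deployment is in the connectors and data-quality modeling, but the end state is the same: joined, queryable data in one place.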

What we’ve found is that many utilities face very similar modernization challenges, and after working through these with multiple clients, we’ve developed a framework to help expedite the development of these solutions. Our framework greatly accelerates the time it takes to stand up a modernized data platform while still providing flexibility so that a utility client can attain their specific goals.


EC: So, when you get brought into a new project with a utility partner, what range of starting points do you see? Do you find some of the utilities need significantly more work to get to a modernized solution than others? And if so, what are the early indicators that that may be the case?

KM: There is a wide spectrum of technical readiness that we’ve seen across our clients, but there are a few common themes:

  • Drowning in Excel files: Excel can be a really valuable tool in some cases, but when we talk about scaling data visualization and analytics processes, static tools like Excel can be a major barrier to modernization. You might have separate Excel extracts downloaded from five different sources, another two workbooks riddled with VLOOKUPs and macros, and a final output sheet that is 30 columns wide and 100,000 rows tall. These types of solutions offer countless opportunities for human error, and take significant amounts of employee time just to prepare.
  • Data overload: There is so much data within the utility space. Operational teams get slammed with requests from across the business to answer questions and it takes time to extract, compile, synthesize, and distribute information to the right people. The amount of time it takes to sift through massive amounts of information creates a real bottleneck that can impact the entire organization.
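As a small illustration of the first point, the scripted equivalent of a VLOOKUP chain is a few lines of repeatable, auditable code. The CSV contents below are invented sample data standing in for two downloaded Excel extracts:

```python
import csv
import io

# Two "Excel extracts" represented as CSV text (illustrative data).
usage_csv = "meter_id,kwh\nM1,120\nM2,95\n"
rate_csv = "meter_id,rate_class\nM1,residential\nM2,commercial\n"

# Build the lookup table once, then join row by row -- the scripted
# equivalent of a VLOOKUP, minus the copy/paste and macro fragility.
rates = {r["meter_id"]: r["rate_class"]
         for r in csv.DictReader(io.StringIO(rate_csv))}
report = [
    {**row, "rate_class": rates.get(row["meter_id"], "unknown")}
    for row in csv.DictReader(io.StringIO(usage_csv))
]
print(report)
```

Because the logic lives in code rather than in a sheet, rerunning it next month takes seconds instead of hours of manual preparation.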

The organizations that fare better are those that have a manager or internal advocate who is willing to step back and explore new and better ways to do their jobs.



EC: Specific to utility data solutions, what are the common practices you see? What are the specific shortcomings they need assistance overcoming?

KM: We’ve found a handful of systems really dominate the utilities space when it comes to generating and organizing data. Some of these products include their own out-of-the-box operational analytics. While a one-stop shop for data entry, collection, and analytics sounds like a dream, we’ve found that these solutions ultimately limit your ability to get at the information that you really need. If you use any additional products, your out-of-the-box reporting solution won’t be able to pull in that outside data. As soon as you begin to explore those crucial questions and use cases that will add value to the organization, finding answers across platforms takes ten times longer than it should, and processes are not easily repeatable when a similar question arises a week later.

To provide a more tangible example, we have a municipal utility that wants to track meter health. At a high level, the AMI system can tell you all the meters you have and have not gotten readings from. But that’s only the beginning of the journey - there are a lot of business and operational questions that need to be answered. How long has it been since I’ve heard from this meter? What is the service point and customer information for this meter? Have we alerted the customer? How is this impacting their billing? Do we have an engineer in the field actively assigned a ticket to work on this meter? And the list goes on. Without a tailored data solution, this becomes a wild goose chase from system to system to get a full picture of what’s going on.
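The "how long since I've heard from this meter?" question above is the kind of check a tailored solution automates. Here is a minimal sketch of that triage step, with meter records, customer names, and the silence threshold all invented for illustration:

```python
from datetime import datetime, timedelta

# Illustrative AMI records: the last time each meter reported in,
# already joined to customer information in the central platform.
now = datetime(2022, 3, 1, 12, 0)
meters = {
    "M1": {"last_heard": now - timedelta(hours=2), "customer": "Alice"},
    "M2": {"last_heard": now - timedelta(days=3), "customer": "Bob"},
}

# One pass answers "how long since we heard from this meter?" and
# "whose meter is it?" -- questions that otherwise span systems.
SILENCE_THRESHOLD = timedelta(hours=24)
alerts = [
    (mid, m["customer"], now - m["last_heard"])
    for mid, m in meters.items()
    if now - m["last_heard"] > SILENCE_THRESHOLD
]
print(alerts)  # only M2 has been silent past the threshold
```

In production the same logic would feed the downstream questions too: ticket status, billing impact, and customer notification, all from the one joined dataset.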


EC: Most utilities will recognize the need for tapping into the power of data, but there are a myriad of reasons standing in the way—from costs to lack of experience to simply not knowing where to start. What do you find to be the most common sources of resistance?

KM: A lot of times, people doubt that they can build something more efficient - they think it’s just a pipe dream. But the biggest resistance I see to these types of solutions is getting business and end user buy-in. If IT is the sole driver of these solutions, it rarely hits the mark in terms of the value it provides and, because of this, doesn’t get adopted by the larger organization.

It’s important to get buy-in not only from the folks that are going to be using this platform operationally, but from their managers too. The manager should communicate how they expect this to change the employees’ day-to-day. The messaging and marketing of how this solution will provide value to the business is exponentially more significant when it comes from business leaders as opposed to IT (or from a consultant like myself).

While we often work with the business as a group, it’s also important to take time to communicate how this solution will provide value on an individual level as well. It’s critical to provide the answer to “What’s in it for me?” Otherwise, individuals typically won’t change.


EC: Tapping into that data can be a bit of a Pandora’s Box for utilities—how do you recommend they handle the volume and velocity of that data while still managing costs?

KM: Historically, a data platform was on-prem and used traditional means of capturing and storing data, which meant utility clients had to decide on the necessary capacity from day one. As if estimating platform needs up front weren’t challenging enough, many clients also struggled to determine what made the most sense financially, because vendors would offer additional incentives for pre-purchasing capacity in multi-year packages.

So, what’s changed? Cloud computing providers like AWS have created a much lower barrier to entry for building a data platform. Managed services like AWS Glue, DMS, Redshift, and others remove the extensive maintenance and administrative tasks that were historically needed. Native AWS services address many of the advanced needs of utilities, like IoT device management (AWS IoT) and streaming (AWS Kinesis).

From a cost perspective, the ability to “rent” not only the services but the infrastructure that supports them allows you to prove concepts out without any substantial long-term investment. Also, storage has become incredibly cheap using cloud-based object storage (AWS S3). For example, you can store a TB of data in S3 for roughly $23 per month. Modern cloud-based data platforms like Snowflake provide up to three times compression on that data. For context, a TB of data in Snowflake can store a year of 15-minute interval data across 200,000+ AMI meters for roughly $300 of storage costs.
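A quick back-of-the-envelope calculation shows these figures hang together. The per-row size, the annual billing interpretation of the $300, and the exact compression ratio are assumptions for the sake of the arithmetic:

```python
# Sanity-check the storage claim: ~$23/TB-month object storage,
# ~3x columnar compression, and an assumed ~400 bytes per raw reading.
METERS = 200_000
INTERVALS_PER_YEAR = 365 * 24 * 4            # 15-minute readings
readings = METERS * INTERVALS_PER_YEAR       # ~7.0 billion rows/year

TB = 10 ** 12
RAW_BYTES_PER_READING = 400                  # assumed, incl. metadata
compressed = readings * RAW_BYTES_PER_READING / 3  # ~3x compression

PRICE_PER_TB_MONTH = 23
annual_cost = compressed / TB * PRICE_PER_TB_MONTH * 12
print(f"{compressed / TB:.2f} TB compressed, ~${annual_cost:.0f}/year")
```

Under those assumptions the year of interval data lands just under 1 TB compressed at roughly $250–300 per year, consistent with the figures Kevin cites.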

All of this allows us to process immense volumes of data by scaling our resources out while paying only for what is needed.


EC: Let’s say utilities overcome the resistance and the challenges, what new opportunities are unlocked for them? What types of business problems can they newly solve with a data platform framework?

KM: In my experience working with operational and engineering teams, there is never an end to responding to requests from other departments. Teams can become more proactive and less reactive, and when they need to react to requests they can do so quickly because the information is already at their fingertips. We want to work with our business end users and help provide the answers before the user knows they are going to ask the question.

For example, we worked with a municipal utility on overall grid health: we provided access to historical energy consumption across 400,000 meters, with the capability to drill down into the service life of a single meter. We then delivered advanced analysis of consumption patterns across service points to support things like theft detection, water-leak detection, meter-to-transformer mapping, and other patterns that informed operational decisions.

We worked with large cooperative power suppliers to build a framework for assessing generator health. The framework made the data accessible faster, which helped them avoid costly malfunctions, and it made the data flexible enough that they could tweak the weights and scores of the health model and run what-if analyses to optimize that equation.
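The "tweak the weights and run what-if analyses" idea can be sketched in a few lines. The metric names, their normalized 0–1 values, and both weighting schemes below are invented for illustration, not the cooperative's actual model:

```python
# A minimal sketch of a tunable health score: a weighted average of
# normalized 0-1 metrics, where the weights are the knobs analysts turn.
def health_score(metrics, weights):
    """Weighted average of the metrics named in `weights`."""
    total = sum(weights.values())
    return sum(metrics[k] * w for k, w in weights.items()) / total

metrics = {"vibration": 0.9, "oil_temp": 0.6, "runtime_hours": 0.8}

baseline = {"vibration": 2, "oil_temp": 1, "runtime_hours": 1}
what_if = {"vibration": 1, "oil_temp": 3, "runtime_hours": 1}

# Re-weighting is a one-line change, so a what-if run shows how
# sensitive the overall score is to each input.
print(health_score(metrics, baseline), health_score(metrics, what_if))
```

Keeping the weights as data rather than hard-coding them is what makes the what-if loop cheap: the same scoring function serves every scenario.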

We have seen regional transmission organizations consolidate their tech stack and enable modern data sharing with all of their constituents. A main offering of this RTO was providing energy pricing across its grid for purchase. This work also included building better monitoring and controls across the energy market, to ensure no single account was overly exposed or at risk of defaulting on its positions.


EC: If there’s a utility decision-maker who is reading this interview and is on the fence about starting, what one piece of advice or word of wisdom would you offer to help them make their decision?

KM: I think the most important thing is to understand that this type of initiative is a joint effort between business and IT. Speak with your different business units. Get them involved. More engagement will create more buy-in, ultimately increasing the chances for success when you take off on your data analytics journey.

I know you said one piece of advice, but I couldn’t leave without saying that when the platform goes live, the work is not over. An extremely well-built technical solution can be a complete failure if no one uses it. Create a strategy for user adoption, track usage, interview users for feedback, and create KPIs to track the success of the platform.




Thanks to Kevin Maroney for sharing his insights with the Energy Central Community in this Power Perspectives Interview. You can trust that Kevin will be available for you to reach out and connect, ask questions, and more as an Energy Central member, so be sure to make him feel welcome when you see him across the platform. If you have any specific questions for him based on what we discussed above, be sure to leave a comment below!
