
The “Early & Often” Approach to Analytics

After more than a century of stability and incremental change, the utility industry is undergoing a massive transformation. Distributed energy resources (DERs) are changing how the grid operates and how customers engage with their local utility. Electric vehicle (EV) adoption is accelerating. Generation is no longer exclusively centralized; it is increasingly decentralized and localized.

One common thread across these and other changes in the utility industry is the role of data and analytics, which present their own challenges and benefits for utility leaders navigating this evolving landscape. As the industry’s posture toward data and analytics has evolved from simply managing masses of data to leveraging that data for predictive and prescriptive insights, utility leaders can see that analytics is no longer just a “nice to have”; it is quickly becoming a requirement for meeting the changes in their operational and customer landscape.

It is one thing for a leader to declare that the organization should embrace data and analytics as a core competency for running the utility. Recognizing how the organization needs to change to make that embrace last is often quite another. Let’s look at a few examples.

Starting with the basics, one foundational ingredient that is often missing is a data governance plan. A study of the utility data and analytics space conducted by TMG Research in 2021 found that just over half of respondents (53%) have a formal data governance plan. An organization that tries to scale its analytics use cases without one ends up in “fake it until you make it” mode, and when that no longer works, things can get ugly. Data access, security, formats, standards, sharing: these all need to be established early and revisited often.
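For illustration only, here is a minimal sketch of what “establish early, revisit often” could look like if those governance dimensions were captured in a machine-readable policy record. Every name, value, and review interval below is a hypothetical assumption, not a prescribed standard.

```python
from __future__ import annotations
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class GovernancePolicy:
    """Hypothetical record of a governance decision for one data domain."""
    domain: str                   # e.g. a metering or asset-inspection data set
    access_roles: list[str]       # who may query the data
    security_class: str           # e.g. "internal", "restricted"
    storage_format: str           # agreed interchange format
    standard: str                 # applicable naming / quality standard
    sharing_allowed: bool         # may it cross system or org boundaries?
    last_reviewed: date
    review_interval_days: int = 180   # "revisit often"

    def review_due(self, today: date | None = None) -> bool:
        """True when the policy is overdue for its periodic review."""
        today = today or date.today()
        return today - self.last_reviewed > timedelta(days=self.review_interval_days)

# Example usage with made-up values
policy = GovernancePolicy(
    domain="meter_interval_reads",
    access_roles=["grid_ops", "analytics"],
    security_class="restricted",
    storage_format="parquet",
    standard="CIM-aligned naming",
    sharing_allowed=False,
    last_reviewed=date(2021, 1, 15),
)
print(policy.review_due(date(2021, 9, 1)))  # True: time to revisit this policy
```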

Two technology areas to consider early and often are open source tools and the cloud. Looking first at open source tools, these present some interesting opportunities and challenges. On the plus side, much of the new talent entering the utility industry has built its collective resume on open source tools, and the industry clearly needs to on-board that talent to meet the changing landscape noted above. One other facet of open source that attracts many utility leaders is that…it’s free!

A few points on the other side of the open source ledger merit consideration. First, yes, it’s free, but in reality it’s free the way a puppy is free: the professional services, frameworks, and additional software and hardware should all be factored into open source TCO (total cost of ownership) over a multi-year period. A second consideration is how well open source tools scale to the large enterprise use cases that utilities will need to execute in the very near future.
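To make the “free like a puppy” point concrete, here is a minimal sketch of a multi-year TCO comparison. Every dollar figure is a placeholder assumption, not a vendor quote or benchmark; the only point is that zero license cost does not mean zero cost of ownership.

```python
# Hypothetical multi-year TCO comparison: all dollar figures are
# illustrative placeholders, not quotes or benchmarks.

def total_cost_of_ownership(annual_costs: dict[str, float], years: int) -> float:
    """Sum recurring annual cost categories over the evaluation period."""
    return sum(annual_costs.values()) * years

open_source_stack = {                    # "free" licenses...
    "licenses": 0,
    "professional_services": 250_000,    # integration, tuning, upgrades
    "in_house_support": 180_000,         # staff time to own the stack
    "extra_infrastructure": 90_000,      # hardware/software around the tools
}

commercial_platform = {
    "licenses": 400_000,
    "professional_services": 80_000,
    "in_house_support": 60_000,
    "extra_infrastructure": 40_000,
}

for label, costs in [("open source", open_source_stack),
                     ("commercial", commercial_platform)]:
    print(f"{label}: 5-year TCO = ${total_cost_of_ownership(costs, 5):,.0f}")
```

With these made-up inputs the “free” stack still runs to several million dollars over five years; the real exercise is plugging in your own numbers for each category before committing.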

Similarly, use of the cloud is growing in the utility industry, and it has its share of plusses and minuses as well. The same TMG study points to limited but growing use of the cloud for data storage, and that is a good thing. The growth of data sets, especially the image- and video-based data sets now feeding many asset inspection, vegetation management, and fire mitigation strategies, would quickly overwhelm the traditional corporate server farm. Beyond storing these massive data sets, applications are also residing in the cloud with increasing frequency, which can strengthen cybersecurity and give the utility more flexibility, especially through the “burst” capability that cloud solutions provide.

One area to watch in migrating to the cloud is cost, which can be lower than the traditional on-prem approach on the front end but can climb steeply as utilization grows. One case in point is what might be considered a “dirty little secret” of the cloud marketplace: cloud sales teams are often compensated on utilization. So, encouraging use of the cloud can be an economical way to meet business goals, but keep an eye on the meter.
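To illustrate why the meter matters, here is a minimal sketch of how storage and data-transfer charges compound as an image/video data set grows. The growth rate and per-terabyte prices are made-up assumptions for illustration, not any provider’s actual rates.

```python
# Hypothetical cloud spend projection. Rates and growth are illustrative
# assumptions only, not any provider's published pricing.

def project_monthly_bill(start_tb: float,
                         monthly_growth: float,
                         storage_per_tb: float,
                         egress_per_tb: float,
                         egress_fraction: float,
                         months: int) -> list[float]:
    """Return the estimated bill for each month as the data set grows."""
    bills = []
    size_tb = start_tb
    for _ in range(months):
        storage_cost = size_tb * storage_per_tb
        egress_cost = size_tb * egress_fraction * egress_per_tb
        bills.append(storage_cost + egress_cost)
        size_tb *= (1 + monthly_growth)   # inspection imagery keeps arriving
    return bills

bills = project_monthly_bill(start_tb=50,          # drone/LiDAR imagery backlog
                             monthly_growth=0.08,  # 8% data growth per month
                             storage_per_tb=23.0,  # $/TB-month (placeholder)
                             egress_per_tb=90.0,   # $/TB transferred (placeholder)
                             egress_fraction=0.10, # share re-read each month
                             months=36)
print(f"month 1:  ${bills[0]:,.0f}")
print(f"month 36: ${bills[-1]:,.0f}   # same workload, much bigger meter")
```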

So, what should utility leaders be looking for in this era of data-rich operations that will hopefully yield business value? Here are a couple of ideas that might help. First is alignment. We still find utilities engaging in data and analytics initiatives that are not aligned with corporate strategy, and the surest way for an analytics initiative to fail is to start out misaligned with core corporate goals, or to drift away from them. More often than not, those high-level goals or strategic imperatives are there; they just need proper attention from those who connect that strategy with the actual analytics work being done.

Second is to apply the old Stephen Covey principle and “begin with the end in mind.” Many analytics initiatives start with a single use case rather than a large enterprise undertaking, but even relatively “simple” use case-driven initiatives must be built and delivered to scale to the enterprise. Is there adequate governance? What about data standards and formats? Are there data or analytics sharing opportunities across traditional organizational or system boundaries? This is another opportunity to hit these and other issues early and often.

Analytics initiatives are not small undertakings. Mistakes will happen, lessons will be learned, and ultimately value will be realized. Just be certain to prepare and focus on the key tasks early and often.
