Artificial intelligence (AI) has quickly become the hottest topic in the utility digital transformation space. Whether you view AI as an over-hyped phenomenon or a major game-changer that has crossed its tipping point, the massive attention it is garnering is undeniable.
It’s becoming hard to find examples of utilities that are not embarking on some type of AI initiative, whether a small, isolated application of machine learning or a comprehensive organization-wide AI strategy with large numbers of dedicated staff and active initiatives cutting across multiple lines of business.
But wherever you are on that AI continuum, your success will depend on data – and lots of it!
AI Data Considerations
Large volumes of historical data are essential for the training and fine-tuning of the learning models you will build upon as your AI initiative ramps up. Data needs will grow exponentially from there, becoming more dynamic and complex as organizations embrace true “Agentic AI,” where layers of autonomous agents converge to make intelligence more actionable and scalable inside and outside the organization.
It's important that your organization has a comprehensive framework and approach for supplying the data needed to support the size and scope of your AI journey. Developing that approach needs to be thoughtful and deliberate, considering:
- The environments/use cases being considered for AI applications, and the initial objectives for AI in each
- Specific data requirements to support those objectives
- Current readiness of the data in terms of availability, accuracy, usability, access, and quality
- Compliance with governance principles, policies, and operating practices
- Potential for scaling the AI use cases to other business areas or applications
One Byte at a Time
For those just starting out, thinking about these data implications of AI can be daunting. How can an organization with limited resources take on a data strategy comprehensive enough to support an agenda as large and sweeping as AI?
The good news is that AI is evolving quite modularly, with most organizations starting with small use cases and quick wins and building from there. That means your data strategy can track in much the same way, starting small and evolving to match the scope and pace of your AI strategy. It need not be so overwhelming from the start, in a way that paralyzes or stalls innovation.
Getting Started
Here are 5 areas to consider when evaluating and positioning data within your AI initiative:
1. Target use cases and focus areas: Consider carefully the business environments and use cases being considered, and how you expect AI to contribute to each use case. These expectations for AI (e.g., generating information vs. predicting behaviors, facilitating/assisting decisions vs. automating and executing decisions) will define the depth and breadth of the data needed. For those just starting out, there are a host of basic AI applications and use cases that can create quick wins without requiring mountains of data from disparate sources. Limiting your scope (at least initially) to areas designed to inform, assist, and facilitate (vs. predict, optimize, and replace functions) will get you out of the starting gate and put your data to work faster.
2. Define data categories and needs: Evaluate the categories of data needed to support those use cases, along with the specific data elements needed to drive the minimum viable product (MVP) or application of the use case. Make sure to separate short- and long-term needs so you don’t bog yourself down and miss the quick win. Your goal here is not to ‘boil the ocean’, but rather to create a data framework aligned with the scale and scope of the initial use cases you select.
3. Assess data readiness: Assess each data category and element at five levels: availability, accuracy, usability, access, and quality. These factors determine the degree to which your use case would be able to function today, and how your initial AI MVP should be scoped. Define a plan for addressing issues around availability and access, along with any special conditioning of data needed to make it usable for its initial applications. This should include how data moves between your systems of record, data lakes, and the AI models and automations within your use case.
4. Assess governance implications: Work with internal stakeholders to account for the ethical and governance implications of data use. Make sure to evaluate each category and data element considered for the specific use case over both the short and long term. Establishing these governance and ethical guardrails is not only essential for regulatory compliance, but also for maintaining trust with customers and stakeholders.
5. Plan for growth: Frame your vision for how the use case could evolve. Consider how the use case can be scaled fully in the initial areas you’ve targeted, as well as opportunities to extend the AI functions you’ve incorporated to other areas across the enterprise and/or externally in conjunction with strategic partners and vendors. This will help prepare a more comprehensive roadmap for your data environment to build toward.
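One lightweight way to put step 3 into practice is a simple scoring worksheet. The Python sketch below is purely illustrative: the data elements, the 1–5 scale, and the MVP threshold are all hypothetical assumptions, not a prescribed methodology, but they show how readiness across the five dimensions can be rolled up to decide which data can power an initial MVP and which needs remediation first.

```python
from dataclasses import dataclass

# The five readiness dimensions from the framework above.
DIMENSIONS = ("availability", "accuracy", "usability", "access", "quality")


@dataclass
class DataElement:
    """A single data element scored 1-5 on each readiness dimension."""
    name: str
    scores: dict

    def readiness(self) -> float:
        """Average score across the five dimensions."""
        return sum(self.scores[d] for d in DIMENSIONS) / len(DIMENSIONS)


def triage(elements, threshold=3.0):
    """Split elements into MVP-ready vs. needing remediation.

    The 3.0 threshold is an arbitrary illustrative cutoff; a real
    assessment would set it per use case.
    """
    ready, needs_work = [], []
    for e in elements:
        (ready if e.readiness() >= threshold else needs_work).append(e)
    return ready, needs_work


# Hypothetical example: two data elements for a utility use case.
meter_reads = DataElement(
    "meter_reads",
    {"availability": 5, "accuracy": 4, "usability": 3, "access": 4, "quality": 4},
)
asset_notes = DataElement(
    "asset_notes",
    {"availability": 3, "accuracy": 2, "usability": 2, "access": 3, "quality": 2},
)

ready, needs_work = triage([meter_reads, asset_notes])
```

A worksheet like this keeps the assessment aligned with the scale of the initial use case: low-scoring elements become the remediation backlog for the data plan, rather than blockers for the quick win.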
Of course, as your program grows, so too will the implications for data. New challenges will emerge around privacy, security, and ethical factors requiring more complex solutions. But regardless of the size and scale of your program, spending time upfront understanding data readiness will help you avoid pitfalls and delays, and accelerate success of your AI initiatives. It will improve the reliability and usability of your underlying models, and help scale that intelligence across a wider array of use cases and downstream automations.
BHC Global can provide the guidance and coaching needed to get you started on the right foot, while providing a variety of tools for accelerating your initial use cases and providing a data foundation for future growth. Contact us to learn more!
* * * * *
Bob Champagne is Chief Digital Transformation Officer for BHC Global, leading their Digital Advisory and Innovation Practice. Bob has served in the energy and utilities sector for over 35 years, helping leaders navigate challenges associated with industry transition and market reform. Throughout his career, he has worked with over 120 organizations globally in both regulated and competitive energy markets, focusing on customer strategy, business transformation, and digital innovation. Bob has also been at the center of several successful technology start-ups focused on harnessing the value of data and digital insights across the energy and utilities value chain through the application of advanced analytics, machine learning and AI. Bob can be contacted at [email protected]