Energy Central Power Perspectives: Bringing Utility Analytics into the Future with Artificial Intelligence: Exclusive Interview with Robin Hagemans of Infiniot
image credit: Robin Hagemans
- Sep 5, 2019 11:15 am GMT
As smart meters and data analytics become a standard cornerstone of utility operations rather than a niche tool, leaders at these energy providers must become comfortable quickly adapting to new ways of doing business. Utility analytics is one of these rapidly expanding areas, and one of the most exciting tools for enhancing the work being done here is artificial intelligence.
The use of artificial intelligence, or AI, is key to processing and taking full advantage of the new and growing sets of data that are already among the utility industry's most important assets. Robin Hagemans, Managing Partner at Infiniot, will be sharing his expertise in this important field as Chair for the third day of the Smart Grid Big Data 2019 Conference from SmartGrid Forums, the day on which Utility Analytics and Artificial Intelligence take center stage. In advance of this exciting event, Robin was kind enough to share some of his insights in a Q&A with Energy Central:
Matt Chester: You’re going to be chairing the third day at the conference that is dealing with artificial intelligence in the utility space. Can you start by giving some background about how you got involved in this field and why you think it’s such an important area of study?
Robin Hagemans: I've been working for more than 10 years in the utility sector in the Netherlands, in business and IT roles, on several digitalization-related topics such as automation & control, telecommunications, and data science. Thanks to my education in chemical engineering and a strong belief in data within my company, I had the opportunity to build a strong team to develop thought leadership on how data science modelling and simulation could add value to complex energy-transition challenges for utilities. We were able to develop several models for strategic and operational data-driven questions. The team still works day to day on utility analytics solutions and will be presenting at the AI conference that I'm chairing.
I have since made the move to entrepreneurship, starting last year as Managing Partner and Co-owner of Infiniot Data Intelligence in the Netherlands, with the mission of helping critical infrastructure companies develop solutions and capabilities for IT/OT integration.
MC: When talking about artificial intelligence, you’ve described it as a co-creation between data science and software development. How has this union played out in the utility sector specifically? What has been the track record of data science in energy and in software development separately, and when and why did they end up coming together for artificial intelligence solutions?
RH: Both competencies, data science and software engineering, strengthen each other in AI.
Let me illustrate this with the situation for utilities, which own very complex infrastructures, or complex systems. Energy utilities face new challenges with the large-scale introduction of sustainable energy, while the community's dependency on their systems grows year by year. Sustainable applications introduce more volatile energy generation from wind and solar. They also introduce higher levels of energy demand from electric vehicles and heat pumps. Data science adds the ability to learn this volatile system behavior and the new risks it introduces to grid stability. Data science generates the algorithms that describe the behavior of parts of the system. These algorithms are input for AI systems that can control parts of these challenges, for example deciding how to charge electric vehicles in the smartest way or how to maintain grid stability in areas with many PV panels and heat pumps. When we introduce this algorithm-based artificial intelligence to reduce or control volatility effects, we need to rely on dependable software systems. Software engineering habits such as good development practice, test scripts, and continuous integration and continuous delivery processes are mandatory, especially for mission-critical utility systems. AI solutions deliver the data science algorithms in a reliable package of software.
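To make the EV-charging example concrete, here is a minimal sketch of the kind of algorithm such an AI system might use as input. The function name, forecast numbers, and greedy lowest-load heuristic are all illustrative assumptions, not from Infiniot or any real scheduler:

```python
# Hypothetical sketch: shift EV charging into the lowest-load hours of a
# day-ahead grid forecast. All names and numbers are illustrative only.

def schedule_charging(load_forecast, hours_needed):
    """Pick the `hours_needed` hours with the lowest forecast grid load."""
    ranked = sorted(range(len(load_forecast)), key=lambda h: load_forecast[h])
    return sorted(ranked[:hours_needed])

# Day-ahead load forecast (MW) for 24 hours, peaking in the evening.
forecast = [40, 38, 36, 35, 35, 37, 45, 55, 60, 58, 56, 55,
            54, 53, 54, 56, 60, 68, 72, 70, 65, 55, 48, 42]
slots = schedule_charging(forecast, hours_needed=4)
print(slots)  # → [2, 3, 4, 5]: charging lands in the overnight valley
```

A production system would of course add network constraints and real-time feedback, which is exactly where the reliable-software-engineering side of the co-creation comes in.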
The first steps in data-driven decision making in utilities that I have seen and developed in past years were simply recommendation systems for operational utility problems. Valuable, absolutely, but not really AI yet. These are the fundamental building blocks needed to start developing AI for utilities, based on knowledge and outcomes from both data science and software engineering. The combination of both competencies will make it possible to deliver AI for complex decision processes in mission-critical utility operations. AI has to prove its value in the coming years, where the volatile effects of the energy transition, for example, could be better understood and controlled with a data-driven toolset like AI.
MC: As you look at the coming years in the energy space where the topics most often discussed include renewable energy, cybersecurity of the grid, and grid modernization, how do you see artificial intelligence fitting in? How will AI drive the priorities based on what’s possible in utilities?
RH: My expectation is that the introduction of AI solutions in critical infrastructure systems will follow a similar pattern to the development of data science recommendation models. Data science started with static models and evolved into dynamic, real-time recommendation models. AI will start with singular learning questions and a central system focus. Because these AI models will need a lot of streaming data to learn and grow, that data must be moved through a widely spread telecommunications network to these central systems. At a certain moment, this first-step approach becomes too expensive to support a positive business case.
It will probably be smarter then to change direction, move the AI to the data, and develop a more decentralized AI system. But trust in central AI solutions needs to be built first, before the cost of moving all the data becomes an issue.
AI can also handle capacity problems caused by the volatility of renewable energy, and recognize cybersecurity anomalies, probably decentrally, and maybe even on the same chip in a future energy device. For complex long-term grid modernization questions that need a more holistic view, I would suggest central systems with both AI and data science, while local real-time problems are better handled locally with embedded AI on these energy devices: a smart combination of data science and software engineering.
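The decentralized anomaly recognition described above could be sketched as something as simple as a rolling z-score check running on the device itself. The class name, window sizes, and voltage readings below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical sketch of on-device anomaly detection: flag a reading that
# deviates strongly from the rolling statistics of recent samples.
from collections import deque
import statistics

class StreamAnomalyDetector:
    """Flag values more than `threshold` standard deviations from the
    rolling mean of the last `window` samples."""
    def __init__(self, window=20, threshold=3.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def is_anomaly(self, value):
        anomalous = False
        if len(self.buf) >= 5:  # require a minimal history first
            mean = statistics.fmean(self.buf)
            stdev = statistics.pstdev(self.buf)
            anomalous = stdev > 0 and abs(value - mean) > self.threshold * stdev
        self.buf.append(value)
        return anomalous

det = StreamAnomalyDetector()
readings = [230.0, 231.0, 229.5, 230.5, 230.2, 229.8, 230.1, 400.0]
flags = [det.is_anomaly(r) for r in readings]
print(flags)  # only the 400.0 spike is flagged
```

A real cybersecurity or capacity application would use richer models, but the shape is the same: lightweight logic close to the data, with the heavy learning done centrally.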
MC: Artificial intelligence can enable great solutions in energy, but many great solutions first come with missteps. Have you seen cases where you disagreed with the implementation of AI in utilities, whether the approach was wrong or AI wasn't properly implemented, that provided lessons on what not to do next time? Are there opportunities for experts in the field to learn from each other in this way?
RH: I have seen that gathering data without a data analysis strategy does not bring the expected value. Electrical engineers have been familiar with the system's boundaries and problems for many years, so it's better not to approach the utility system as a black box. It's better to search for the expected new patterns, the threats to the current system, the volatile patterns that are truly new and represent learning opportunities.
The lesson I learned is that you should start by designing your sensing plan and placing your new devices in the system driven by what you need to know. Even then, there is plenty of uncertainty about where data science and AI can help you. Traditionally, and unfortunately, the data registration of measuring devices and datapoints in utility systems is poor. This weak metadata organization is also an important source of failure in implementing data-driven decision solutions.
More positively, most of the new data that will be generated from future utility systems still needs to be designed and placed, so today is the opportunity to prepare for valuable AI systems in the future.
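The weak-metadata problem above is the kind of thing a simple registry audit can surface before any analytics work begins. The field names and registry entries below are invented for illustration, not taken from any real asset-management system:

```python
# Hypothetical sketch: audit a device registry for the metadata fields an
# analytics pipeline would need. Field names are illustrative only.
REQUIRED_FIELDS = {"device_id", "location", "unit", "sampling_rate_s"}

def missing_metadata(devices):
    """Map device index -> sorted list of missing required fields."""
    return {i: sorted(REQUIRED_FIELDS - set(d))
            for i, d in enumerate(devices)
            if REQUIRED_FIELDS - set(d)}

registry = [
    {"device_id": "mv-001", "location": "substation A",
     "unit": "A", "sampling_rate_s": 10},
    {"device_id": "mv-002", "unit": "V"},  # location, rate never registered
]
print(missing_metadata(registry))  # → {1: ['location', 'sampling_rate_s']}
```

Running a check like this when devices are first placed is far cheaper than reconstructing the metadata years later.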
MC: You’ve helped to gather the speakers on this topic at the upcoming SmartGrid conference—when you look for thought leaders in this space whose ideas you want to shine a spotlight on, what do you look for? What creates leaders in this fast-moving and modern field?
RH: Many of the speakers at this conference are really innovation-minded people who all have a very strong belief in the power of data, and strong ideas and knowledge of how AI can be the tool for the next step in decision systems for complex questions at utility firms. They have to fight against traditional thinkers in their own companies, build their own coalitions, and arrange the conditions to do their job. Their thought leadership, and I have worked with some of them, is very inspirational. More and more, they are able to enter the boardroom and not only show value with their AI solutions but also influence company roadmaps to create better conditions for the data functions in their organizations, as I mentioned earlier in this interview. The better data thinking and design are integrated into traditional company processes, the more value will be harvested in the future, the easier AI can be developed, and the better complex problems can be managed.
If you're interested in learning more about how utilities can and do utilize artificial intelligence, be sure to check out Day 3 of the Smart Grid Big Data 2019 conference that Robin is chairing. The conference is from September 17 to 19 in Berlin, Germany. You can learn more about the agenda and register for the conference here.