For those who haven’t met Chad Rycenga, he’s a longtime technologist and product visionary. Chad currently serves dual roles as Executive Vice President for Product at ibex Global, where he oversees product vision and development for some of the company’s advanced telephony, chat, and digital transformation technology, and as Chief Technology Officer for ibex Digital, where he leads the company’s digital CX and customer engagement technology. Chad has been at the heart of every major technical innovation at ibex, including the flagship order and enrollment orchestration technology that supports clients who want to deliver modern digital experiences for their customers. He has also been expanding the capabilities of a “CX as a Service” platform for voice-of-the-customer solutions. That platform harnesses machine learning and advanced AI to deliver enterprise-class insights from customer surveys, including text analytics, enhanced branching and cross-tabs, and sentiment analysis, all delivered “as a service” to a number of clients in highly regulated industries. Chad really does it all.
With so much attention in the industry and at trade shows on the coming wave of AI and chatbot technology, I wanted to get his take on the impact of AI on various technologies, as well as hear his thoughts on the potential value and advantages of AI for utilities as they evolve their customer engagement and self-service functions.
I recently spoke to Chad for our podcast and wanted to share some of his insights with the Energy Central audience. The following article provides excerpts of that conversation.
Wilkinson: Welcome Chad, and thanks for joining me today. Let’s jump right in, as no one can escape references to Generative AI and ChatGPT lately. Apparently, AI is going to take over the world and impact nearly every area of our daily life. For those in the energy utility industry who may be unfamiliar with some of the operational facts of LLMs and generative AI, can you explain what these chatbots are and how they can enhance customer engagement and service, compared to traditional chatbots?
Chad: Absolutely. No, chatbots aren’t going to solve all of the world’s problems tomorrow, but they certainly have an impressive opportunity to support Customer Service teams and the ultimate consumer of our services. And, they get smarter all the time.
At their core, the traditional or legacy chatbots that we’re all familiar with follow a predefined script or decision tree that people program and update continually. While those chatbots can be moderately effective for very specific tasks, their ability to handle diverse or unexpected user queries is limited. And, traditional chatbots and related tech like IVAs require a lot of support to update and program or condition with feedback from actual customer engagements and “exception handling” when the customer “breaks” the process.
Large Language Models and Generative AI, which I’ll talk about interchangeably for the sake of this conversation, get trained with a massive amount of text data including a company’s policies and processes, along with a lot of conversations with customers to develop more of a conversational tone and the ability to engage with customers in more of a normal language manner. Generative AI chatbots provide a lot of advantages for customers and CS teams.
Adaptability: Rather than being constrained by a script, generative AI chatbots can adapt to a wide range of customer questions and comments, making them especially beneficial for industries like energy utilities where customers might have complex or unique questions about their bills, energy usage, or service interruptions. Generative AI adapts to the questions and learns from the growing volume of customer engagements in order to provide the most accurate information in a way customers engage and synthesize easily.
24/7 Availability: AI powered chatbots can operate round the clock, ensuring that customers can get answers to their queries instantly, anytime. This is especially beneficial during outages or service disruptions when there's a spike in customer inquiries. A chatbot won’t replace a customer service center, but can easily support a growing number of “after hours” customer questions, and take the basic questions from customers during an outage so that live CS teams can handle the most critical or sensitive customer enquiries.
Scalability: Traditional chatbots used to be sold as a cost-reduction solution, but most companies found that they had to hire a team to manage content in the chatbot and regularly update and manage changes in the processes or content. With the modern LLM or Generative AI chatbots, utility companies can deploy these AI chatbots to handle routine queries, while human agents can focus on more intricate issues, and they scale very effectively as they learn, which reduces the cost of support and “training” or tuning that those legacy applications required.
Data-Driven Insights: One of the most exciting scenarios for the new chatbots is integrating them to other sources of data and analytical tools. For instance, if there's a sudden influx of queries about a particular service or program, utilities can quickly identify and address the root cause. And, chatbots with access to real time data can direct customers to more personalized solutions as they train and learn.
Wilkinson: Your points about legacy chatbots really ring true. It’s easy for customers to recognize when they are connected to a traditional bot and for those conversations to go astray quickly. I’ve been caught in a chat “doom loop” too often to mention, and it always leaves customers more frustrated than when they started their experience. The new generative AI powered chatbots finally offer companies a method to deliver a more customer-friendly and modern support experience.
Let’s pivot to implementation. There’s always a perception that cutting-edge technologies require extensive overhauls or are complex to implement. Can you provide some insight into the relative ease or challenge of integrating an LLM-powered chatbot into a utility company's existing infrastructure?
Chad: Agreed, and that’s especially the case for energy and utilities who are all too familiar with technology projects measured in months of time and millions of budget dollars. The good news is that the nature of AI and LLM architecture and platforms changes those implementation frameworks for the better. The work favors planning and project management over deep development, so fits the organizational strengths of a modern utility. Here’s a rundown of how we might organize a project:
Integration with Current Systems: Modern LLM-powered chatbots are designed with integration in mind. Many come with APIs or integration capabilities that allow them to easily tie into existing customer management systems, databases, or other key tools the utility might be using. This reduces the need for a complete system overhaul or long and expensive premise-based implementations and custom code.
Customization and Training: While these models are pre-trained on vast datasets, they often benefit from fine-tuning specific to the utility industry or even the particular company. This involves training the model on industry-specific data or company FAQs to make it more attuned to users' needs. This step can vary in complexity based on the desired specificity of the chatbot's knowledge. For utilities, it’s a matter of training the AI model with data from the website, from customer programs and processes, even from call transcripts from actual customers to help the LLM learn what’s important to customers and how to answer their questions.
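A minimal sketch of this grounding idea, with purely illustrative FAQ content and a toy keyword-overlap heuristic standing in for real retrieval or fine-tuning:

```python
# Toy sketch: grounding a chatbot prompt in utility-specific content.
# The documents and the scoring heuristic are illustrative only.

def score(query: str, doc: str) -> int:
    """Count how many query words appear in a document (toy relevance metric)."""
    words = {w.lower().strip(".,?") for w in query.split()}
    return sum(1 for w in words if w in doc.lower())

def build_prompt(query: str, knowledge_base: list[str], top_k: int = 2) -> str:
    """Prepend the most relevant snippets to the user's question so the LLM
    answers from company content instead of guessing."""
    ranked = sorted(knowledge_base, key=lambda d: score(query, d), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Answer using only this utility's content:\n{context}\n\nCustomer: {query}"

faqs = [
    "Budget billing spreads your annual cost into equal monthly payments.",
    "Time of Use plans charge lower rates for off-peak electricity.",
    "Report outages 24/7 by phone or through the mobile app.",
]

prompt = build_prompt("How do Time of Use plans work?", faqs)
```

In practice the retrieval step would use embeddings or a vector store rather than keyword overlap, but the shape is the same: select company content, then let the model answer from it.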
User Interface & Experience: Implementing the chatbot on the company's website or app usually involves adding a simple chat interface, which many third-party providers offer out-of-the-box solutions for. A chatbot should be seamless to the digital customer experience and immediately intuitive so customers just start using it without fanfare or instruction. The good news is that customers have so much experience with chatbots that a seamless integration has become commonplace and helps deliver a modern utility CX.
Testing & Iteration: Before a full-fledged launch, it's crucial to test the chatbot in real-world conditions. Gathering feedback from initial users can help identify areas of improvement, which can then be iteratively refined. While this step requires effort, it's a necessary part of ensuring the chatbot meets user expectations.
Maintenance & Updates: Like all software, chatbots require occasional updates. However, the beauty of AI-driven models is that they continuously learn and adapt from interactions, so they naturally get better over time and require lower maintenance and support costs than traditional applications.
So, success rolling out an AI powered chatbot on a website requires some planning and solid project management, but that’s already a core competency of a utility. The work comes down to organizing the data and content, training the AI, and then thoroughly testing the platform for accuracy and consistency. Most utilities already have a purpose-built QA team to help. We recommend testing the chatbot with the customer service team to judge the affinity for customer questions and support resources. That team already interacts constantly with your customers, so getting their help with testing makes perfect sense.
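That customer-service-led testing can be made repeatable with a simple regression harness. A rough sketch, where `ask` is a stand-in stub for the deployed chatbot and the required phrases are ones the CS team has approved:

```python
# Toy regression harness for chatbot answers. `ask` is stubbed here;
# in practice it would call the deployed chatbot endpoint.

def ask(question: str) -> str:
    """Stand-in for the real chatbot."""
    canned = {
        "How do I report an outage?": "You can report an outage 24/7 via the app or by phone.",
    }
    return canned.get(question, "I'm not sure; let me connect you to an agent.")

# Each case pairs a question with phrases the CS team requires in a correct answer.
test_cases = [
    ("How do I report an outage?", ["24/7", "app"]),
]

def run_suite(cases):
    """Return the cases whose answers are missing required phrases."""
    failures = []
    for question, required in cases:
        answer = ask(question)
        missing = [p for p in required if p not in answer]
        if missing:
            failures.append((question, missing))
    return failures

results = run_suite(test_cases)  # an empty list means every case passed
```

Running a suite like this after each content or model update catches regressions before customers see them.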
Wilkinson: That’s a great idea that doesn’t get enough attention - a chatbot can face the customer service team as part of a testing process before it ever has to be turned on for customers directly. What a great way to ensure customer affinity and security, not to mention a perfect segue to another key question. The energy utility sector is quite strict in terms of data privacy and protection, especially given the regulatory scrutiny it faces. How do LLM-powered chatbots ensure that sensitive customer data isn't compromised? Are these chatbots designed to comply with energy industry-specific regulations and standards?
Chad: Similar to the utility industry’s attention to planning and project management, the attention on privacy and security actually makes it the perfect place to employ a new generation chatbot. The industry’s attention to privacy and security means that they already have the internal compliance and technology protocols and resources to ensure proper implementation and handling of the chatbot solutions. Consider it this way:
Data Handling and Privacy: LLM-powered chatbots, by design, do not inherently retain specific user interactions or personal data unless configured to do so. We would prioritize privacy, ensuring that the chatbots operate under a "stateless" design where each interaction is independent, and previous user data isn't stored or remembered unless it would conform to the utility partner’s existing security and compliance standards, much like their voice call data requires.
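A minimal sketch of that stateless pattern: each turn carries everything it needs, nothing is persisted, and account-like numbers are masked before text ever reaches the model. The regex and function names here are illustrative, not a specific product's design:

```python
import re

# Toy pattern for account-like numbers; real redaction would cover more PII types.
ACCOUNT_RE = re.compile(r"\b\d{8,12}\b")

def redact(text: str) -> str:
    """Mask account-like numbers so they never reach the LLM or its logs."""
    return ACCOUNT_RE.sub("[ACCOUNT]", text)

def handle_turn(user_message: str) -> str:
    """One independent interaction: no session store, no memory of prior turns."""
    safe = redact(user_message)
    # model_reply = llm(safe)   # placeholder for the actual model call
    return f"(model sees) {safe}"

reply = handle_turn("My bill for account 123456789 looks high")
```

When a utility does want conversation memory, that state lives in the utility's own compliant storage, not inside the model, which is the configuration choice referenced above.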
Regulatory Compliance: AI partners now provide solutions tailored to regulated sectors, ensuring that the chatbot operations comply with industry-specific standards and regulations, such as CDPA for various states, as well as any local regulatory requirements. It’s a core component of the planning and implementation to manage data inputs and “fence” the chatbot interactions according to compliance standards of the utility.
Active Monitoring: One advantage of these chatbots is the ability to integrate them with monitoring tools, which allows for real-time oversight and enables swift action in the rare event of any inconsistencies or potential breaches. The technology already fits a modern monitoring and cybersecurity infrastructure, and the use by other highly regulated industries like fintech, health and automotive means that utilities will have ample expertise to reference for monitoring solutions in the future.
Secure Infrastructure: Most LLM-powered chatbot solutions operate within highly secure cloud environments. These environments undergo regular audits and are fortified against external threats. Moreover, end-to-end encryption ensures that data in transit between the user and the chatbot remains confidential.
Customizability for Data Handling: Utility companies can often customize how the chatbot handles and processes data, ensuring that any specific data residency or handling requirements are met. For instance, if a utility operates in a region where data must remain on-premises, chatbot solutions can easily be adjusted to respect those compliance demands.
Continuous Training on Regulations: As regulations evolve, it's vital to keep the chatbot updated. This involves not only software updates but also occasional retraining to ensure the model understands and adheres to any new regulatory nuances.
Ultimately, while the potential risks associated with digital interactions can never be wholly eliminated, modern LLM powered chatbot solutions already possess the design, infrastructure, and operational practices to maximize data security and regulatory compliance. Proper implementation and diligent monitoring further ensure that utility companies can offer advanced customer service without compromising on safety.
Wilkinson: Let’s shift the conversation to a topic of interest to every utility Customer Officer and CX leader - customer centricity. Given the unique challenges and questions customers might have about their energy use, bills, solar, EV charging and energy efficiency, how adaptable are these LLM chatbots to the needs of the utility industry? Can they be tailored to offer information on energy-saving tips, payment options, and other frequently asked questions in a context-sensitive manner?
Chad: Of course. In fact, that’s the real power of this technology, the ability to train the AI for specific contexts. Think of it like sending the AI to college for a major in industry specific knowledge, processes, and scenarios important to customers.
While LLM models start with a broad knowledge base, the utility then trains them using industry-specific data. We update the LLM with website content, policies and procedures, and knowledge of customer programs like energy efficiency, payment assistance programs, or Time of Use plans. We can even include data from customer surveys, transcripts of calls, and other feedback loops to ensure the utility chatbot has a wide knowledge base of information similar to a live agent, but with the advantage of perfect recall, quick access, and consistency across multiple customer interactions.
Unlike traditional chatbots that rely on fixed responses, LLM chatbots generate responses in real-time. This means they can offer context-sensitive information. For instance, if a user inquires about high energy bills during winter, the chatbot could provide energy-saving tips suited for colder months, or even personalized for the customer’s billing history or knowledge of smart devices ordered from the utility marketplace.
AI powered chatbots can be integrated with a company's CIS systems to pull real-time data. So, if a customer has a question about their current bill or payment history, the chatbot can securely fetch that specific information, offering responses personalized for each customer.
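A hedged sketch of that lookup step, where `get_balance` stands in for an authenticated CIS API call (the function names and data are hypothetical):

```python
def get_balance(account_id: str) -> dict:
    """Stub for a CIS lookup; a real system would call a secure, authenticated API."""
    fake_cis = {"A-1001": {"balance": 84.37, "due": "2024-06-15"}}
    return fake_cis[account_id]

def answer_billing_question(account_id: str) -> str:
    """Compose live account data into text the chatbot can ground its reply on."""
    record = get_balance(account_id)
    return (f"Your current balance is ${record['balance']:.2f}, "
            f"due on {record['due']}.")

msg = answer_billing_question("A-1001")
```

In a production deployment this would typically be wired up as a tool or function the model can call after the customer authenticates, so the personalized figures come from the CIS rather than the model's training data.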
I think that the adaptability and customizability of LLM chatbots make them exceptionally suited to serve the nuanced needs of the utility industry. With the right setup and continuous refinement, chatbots can offer highly personalized and context-sensitive assistance, enhancing customer engagement and trust around the clock exactly when and where customers want to connect. And they do it with the consistency and control that utilities often require but sometimes struggle to deliver in a live agent environment.
Wilkinson: Thanks Chad. I can’t think of a better way to end this conversation than what you just said: the utility industry already has the operational data and process discipline, not to mention the security and compliance maturity, to take advantage of these Generative AI chatbots to improve their customers’ digital experiences. The advantage of added accuracy, integrity, and control of the customer experience fits both what customers expect from a modern utility CX and the utility’s regulatory and delivery requirements. We appreciate your insights today.