Embracing Big Data
- May 25, 2018 4:36 pm GMT
The electric industry is not simply evolving; its very definition is changing. The legacy grid, a one-way structure, is being replaced by a far more complex, decentralized system in which many of the traditional, flexible power sources are being replaced by distributed, inflexible sources. These inflexible sources, distributed non-carbon generators such as wind, solar and battery storage, have the potential to dramatically lower carbon emissions, increase dependability, and lower costs, both for customers and power companies. But precisely because these sources are inflexible, they depend, even more than legacy carbon-based generation, on the ability to predict demand down to the very local level. And to do that, power companies must rely on massive amounts of data.
As the ability to collect, store and manipulate ever increasing amounts of data has become a reality, the concept of Big Data is transforming much of society. The potential impact for the utility industry is huge, but the industry is just beginning to embrace it, partly because few utilities have the expertise or the technology to use it. Turning big data into actionable information requires huge computing capacity and experts trained in advanced data analytics. Utilities have been collecting data for years, from customers and from the various components in their systems, but making use of this data is difficult and requires resources that utilities simply don't have. The Data Analytics Institute estimated about a year and a half ago that 50% of utilities made very little use of the data they collected, almost 40% were trying to figure out what to do with it, and a mere 5% to 10% had standardized data analytics tools and processes. A more recent study by Capgemini Consulting found that only 20% of utilities have implemented big data analytics.
Even those utilities that are using big data tend to apply it to isolated issues such as predictive maintenance. The data itself tends to sit in silos, scattered across disparate systems and departments, making it extremely difficult to access and interpret. Nor does it necessarily make sense to try to acquire all of those resources in-house. Data analytics, predictive technology and cloud computing are changing rapidly, and investments in today's technology may not be worthwhile in five or ten years. Nor are utilities necessarily willing or able to add the technical personnel to make sense of all the data they collect. Rather, utilities need to secure partnerships with outside firms that specialize in big data and that have both the enormous computing power needed to process the data and the experts in data analytics needed to make sense of it. Or, rather than working in partnerships, utilities can outsource the work to third-party providers, although that raises serious security issues.
The benefits that can accrue from big data are enormous. Using big data and advanced analytics can allow utilities to take full advantage of distributed resources to allow for peaks and troughs in demand. Close to half of all households now have smart meters. By combining the data from these meters with various third-party information such as weather forecasts, utilities will be able to construct detailed information about how much energy will be needed and where it will be needed, not only in the next 24 hours, but weeks and even months in advance.
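The kind of forecast described above can be sketched in miniature. The example below is purely illustrative, not any utility's actual model: it fits a simple least-squares relationship between temperature and aggregate smart-meter load on a hypothetical feeder, then projects demand from a forecast temperature. Real systems would use far richer features (humidity, day of week, distributed generation output) and models, but the calibrate-then-predict shape is the same.

```python
# Minimal sketch: predicting feeder-level demand from smart-meter
# history plus a temperature forecast. All numbers are hypothetical.

def fit_linear(temps, loads):
    """Ordinary least-squares fit of load = a + b * temperature."""
    n = len(temps)
    mean_t = sum(temps) / n
    mean_l = sum(loads) / n
    cov = sum((t - mean_t) * (l - mean_l) for t, l in zip(temps, loads))
    var = sum((t - mean_t) ** 2 for t in temps)
    b = cov / var
    a = mean_l - b * mean_t
    return a, b

# Hypothetical history: afternoon temperature (deg F) vs. aggregate
# smart-meter load (MW) on a single feeder.
history_temp = [70, 75, 80, 85, 90, 95]
history_load = [3.0, 3.4, 3.8, 4.2, 4.6, 5.0]

a, b = fit_linear(history_temp, history_load)

# Tomorrow's forecast high is 88 deg F; project the expected load.
forecast_load = a + b * 88
print(f"expected load at 88 F: {forecast_load:.2f} MW")  # 4.44 MW
```

Extending the horizon from the next 24 hours to weeks or months is, in this framing, a matter of feeding in longer-range weather and usage forecasts rather than changing the model itself.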
Advanced big data analytics gives utilities the information they need to know the installed capacity, location, size and output of grid-connected renewable generation and enables much more precise supply decisions as well as disaster preparedness. The result is greater reliability and much more efficient use of resources. It also allows far more precise estimates of exactly how the various assets on the grid are operating, which in turn boosts the reliability of predictive maintenance.
What all this means in real time in a specific city can be startling. MIT researcher Christoph Reinhart gathered enough data to model how buildings in Boston use energy. He identified a number of building profiles and calibrated them with actual usage data. By projecting this data on all the buildings in Boston, he and his team achieved 94 percent accuracy predicting how much power the grid would need at any given time.
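A toy version of that archetype approach can make the idea concrete. The sketch below is an assumption-laden illustration, not Reinhart's actual method: each building is assigned one of a few hypothetical load profiles, the profile is calibrated (here, simply scaled) against the building's metered usage, and the calibrated profiles are summed to estimate citywide demand hour by hour.

```python
# Illustrative archetype model: assign each building a load profile,
# calibrate it against metered usage, then sum citywide.
# Archetype names, profiles, and totals are all hypothetical.

# kWh per period over a simplified 4-period "day", per archetype.
archetypes = {
    "office":      [5.0, 20.0, 22.0, 8.0],
    "residential": [8.0, 4.0, 5.0, 12.0],
}

# (archetype, metered daily total) for each building in the city.
buildings = [("office", 55.0), ("office", 110.0), ("residential", 29.0)]

def city_profile(buildings, archetypes):
    """Sum each building's calibrated archetype into a city total."""
    total = [0.0] * 4
    for kind, metered in buildings:
        profile = archetypes[kind]
        scale = metered / sum(profile)   # calibration step
        for h in range(4):
            total[h] += profile[h] * scale
    return total

demand = city_profile(buildings, archetypes)
print([round(x, 1) for x in demand])  # [23.0, 64.0, 71.0, 36.0]
```

With profiles calibrated this way, the grid operator can read off not just the citywide peak but which building types drive it and when.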
Big data will allow the energy system, which is designed to provide sufficient energy for the most demanding day of the year – for much of the country, the hottest day of summer – to remodel itself. Rather than building for a single worst-case peak, utilities can target the buildings and locations that drive demand and confine those peaks to specific places, while at the same time focusing efforts on increasing building efficiency, installing renewable and battery sources where they are needed most, and distributing capacity precisely where and when it is needed. Projected over the entire country, the savings, to customers and to utilities, can literally be in the billions.