Utilities and Big Data: The Rushed Evolution

In this special guest feature, Ed Cuoco, Director of Data Science at The Weather Company, takes a look at big data from the perspective of the utility company, which can no longer serve its own or its stakeholders’ interests without smarter insights at scale and in real time. Ed Cuoco is the Director of Data Science at The Weather Company, an IBM Business, where he leads an organization focused on creating advanced analytics for industry. Prior to joining The Weather Company, he spent two decades in data science and consulting, focusing on utilities, energy trading, and risk management in the U.S. and Europe.

As utilities and the energy market evolve, the age of the spreadsheet falls further into history. Political, economic, cultural, physical, and climatological changes to operations, market, and business models have permanently altered the energy landscape, leading to a fundamental shift in the role of data. Today, the ability to hold and access huge amounts of disparate data (load, customer, demand) and derive and act upon insights at speed is an assumed core capability.

Weather data is a critical component of a utility’s business and impacts everything from the price of electricity and the load forecast to the performance of infrastructure and the ability to anticipate and respond to outages. Using this data well requires seamlessly combining multiple types of analytical expertise, and insights must rest upon analysis that is both mathematically and meteorologically correct.
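As a concrete (and deliberately simplified) illustration of how weather feeds a load forecast, the sketch below regresses daily system load on heating and cooling degree days. The file name, column names, and the 65 °F balance point are assumptions for illustration, not a real utility’s schema or model.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

BASE_TEMP_F = 65.0  # conventional degree-day balance-point temperature

# Hypothetical daily history with columns: date, avg_temp_f, load_mw
df = pd.read_csv("daily_load_weather.csv")
df["hdd"] = (BASE_TEMP_F - df["avg_temp_f"]).clip(lower=0)  # heating degree days
df["cdd"] = (df["avg_temp_f"] - BASE_TEMP_F).clip(lower=0)  # cooling degree days

# Fit load as a linear function of heating and cooling demand drivers.
model = LinearRegression().fit(df[["hdd", "cdd"]], df["load_mw"])

# Forecast tomorrow's load from a forecast average temperature of 92 °F.
forecast_temp = 92.0
features = pd.DataFrame(
    {"hdd": [max(BASE_TEMP_F - forecast_temp, 0.0)],
     "cdd": [max(forecast_temp - BASE_TEMP_F, 0.0)]}
)
print(f"Predicted load: {model.predict(features)[0]:.0f} MW")
```

Production load forecasting blends many more drivers (humidity, calendar effects, customer mix), but even this toy model shows why the weather feed sits upstream of price, load, and operations decisions.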

Utilities can no longer serve their own or their stakeholders’ interests without smarter insights at scale and in real time. These insights span organizational concerns ranging from operational expectations and capabilities to rate structures, customer incentives, and even regulatory and business-model changes. Consider the following:

  • Utilities are evolving their operational and delivery capabilities in interconnection, storage, and renewables to improve grid efficiency and stability and to flatten the “duck curve.”
  • The physical grid is evolving (modular, interconnected, smart), and analytical solutions are increasingly used to enhance major infrastructure upgrades and changes.
  • Markets are more financially diverse given the rise in efficiency programs for residential and commercial ratepayers and the expectation of linking these programs to real financial incentives. Increased regulatory support and the use of demand response (DR) and demand management as operational and economic tools have led to changes in rate design, particularly as driven by changes at the “distribution edge.” More third-party players are entering the market, acting as DR providers, grid managers, distributed energy suppliers, and delivery agents.

These changes have real implications for the utility business model and its political/regulatory environment:

  • The increased use of DR and distributed energy creates perverse outcomes for utilities: by reducing metered consumption, these programs erode the revenue that supports large infrastructure.
  • Political and regulatory focus on grid modernization (e.g., New York’s 2015 Energy Plan and SB 32 in California) and on the reduction of greenhouse gases puts pressure on the very idea of cost-of-service regulation (COSR).

Each area impacts, reinforces, and accelerates the others: the need for greater interconnection feeds expectations around the management of granular demand, and modular grids and enthusiasm for distributed energy and DR programs drive attempts to use analytics to better optimize existing infrastructure.

Taken holistically, these pressures represent an unprecedented change in what is expected of utilities in terms of operations, financial recovery, and business structure. These shifts require significant core capabilities in data science and analytics; change is impossible without advanced capabilities in the collection and analysis of large, diverse, unstructured data sets and the ability to derive insights in near real time. Consider the following:

  • Any attempt to integrate renewables (even with storage) assumes an ability to predict and act on generation that is far more volatile than conventional turbine output, requiring near-real-time analysis and asset control (see the sketch after this list).
  • Energy efficiency programs assume that you are not only tracking use and rates but also attempting to derive and predict users’ behavior.
  • Interconnection requires the ability to seamlessly share data across geographies at speed so that optimization can span the whole system.
  • Plans like New York’s Reforming the Energy Vision assume that advanced analytics across generation and delivery can optimize price and reduce carbon emissions.
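To make the first point concrete, here is a minimal sketch of near-real-time generation forecasting: an exponentially weighted persistence forecast over streaming solar telemetry. The class, the 5-minute readings, and the smoothing constant are all hypothetical; a real deployment would consume a message stream and blend in weather forecasts rather than rely on persistence alone.

```python
# A minimal sketch of a short-horizon, near-real-time generation forecast.
# All names and numbers here are illustrative assumptions.

class ShortHorizonForecaster:
    """Forecast the next interval's output as a smoothed recent level."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha  # weight given to the newest observation
        self.level = None   # current smoothed output estimate (MW)

    def update(self, output_mw: float) -> None:
        """Fold a new telemetry reading into the smoothed level."""
        if self.level is None:
            self.level = output_mw
        else:
            self.level = self.alpha * output_mw + (1 - self.alpha) * self.level

    def forecast(self) -> float:
        """Persistence forecast: the next interval looks like the smoothed level."""
        if self.level is None:
            raise ValueError("no telemetry received yet")
        return self.level

# Simulated 5-minute readings from a solar farm (MW) as a cloud bank passes;
# a real system would consume these from a live telemetry stream.
forecaster = ShortHorizonForecaster()
for reading in [42.0, 40.5, 31.2, 18.7, 25.4]:
    forecaster.update(reading)
    print(f"latest={reading:5.1f} MW  next forecast={forecaster.forecast():5.1f} MW")
```

The design choice matters here: a spreadsheet can fit yesterday’s curve, but folding each new reading into a running forecast within seconds is exactly the kind of continuous, streaming analysis the article argues utilities now need.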

These changes place infrastructure for handling huge quantities of structured and unstructured data, along with data science capabilities, at the center of the utility. Spreadsheets, therefore, can no longer be the central data management or analytical tool, a truth that extends to products that are, effectively, “spreadsheets at scale.” The diversity of data types requires deeper expertise in both analysis and the underlying information domains.

The spreadsheet is a powerful tool when lead times are long, analytics are defined and understood, and data is structured, clean, and small in quantity. To address any one of the above shifts, high-volume, near-real-time data and advanced predictive and prescriptive analytics are critical. Given the sheer increase in data demands alone, spreadsheets can no longer form the analytical backbone of the modern utility; it’s time to embrace a holistic, modern era in which data can be integrated and displays can be adjusted iteratively in real or near real time to improve utilities’ performance and bottom line.

 
