At the February 2024 Orlando edition of the Distributech conference, the power and energy industry showcased both the potential of artificial intelligence (AI) and the limits of its adoption. At the event, industry giants like Duke Energy and GE Vernova lent weight to AI’s emergent role in addressing surging energy consumption needs. Also on hand to facilitate the shift to AI solutions were custom AI solution vendors like Accenture, Deloitte, McKinsey, AWS, and HCLTech/Symphony, as well as a host of AI scaleups.
And yet, the energy and power industry is, in general, traditionally oriented around core operational functions like power generation and energy management. On the one hand, calls for incremental year-over-year growth and optimization are expected and unavoidable given surging US demand for energy. On the other hand, the language of exponential growth via optimization derived from AI and Internet of Things (IoT) technologies is relatively new to the power and energy field. In turn, digital innovations are often aspirational in nature and ingested slowly. By contrast, the common ground of resiliency, consistency, safety, and security is based on long-standing practical operations and standards in the complex US national energy and power infrastructure.
At a session on AI-enabled work management, a picture was painted of how to bridge the gap between data analysis, AI adoption practices, and the lived experiences of utility work crews. The session focused on efforts by Michigan-based Consumers Energy to enhance scheduling and work management in the power and energy industry while emphasizing human-centric stories. It explored the nexus of operational expertise and technological innovation and underscored the integration happening in next-gen work management.
Panellists shared their journeys, highlighting AI's role in moving beyond traditional scheduling (significantly impacted by daily "emergent work" which never made it onto the official schedule) to a comprehensive, end-to-end process. At a high level, Consumers Energy aimed to dismantle silos, foster collaboration, and enhance efficiency. However, while data and AI model development were important, critical to the project’s success, said Spencer, was asking work crews and schedulers a surprisingly simple question: "What's really going on here?" On the one hand, the case for grounding scheduling decisions in data was clear. On the other hand, Kandell of NiSource emphasized that data wasn’t enough; leveraging it effectively required getting out into the field with the crews, he said. The shift in mindset was significant, said Kandell: "it took a complete revamp of what we were thinking about from a culture standpoint."
Brad Steere of Consumers Energy described the process as iterative, requiring steady feedback from those who would be impacted. Moreover, he said, flexibility was key. There would have been a "mutiny" if the process were "too rigid," according to Matthew Spencer, also of Consumers Energy. Spencer also noted that attaching explanations to the AI tool's scheduling recommendations helped the schedulers validate those recommendations. In short, creating the AI scheduling capability involves an ongoing data-human feedback loop.
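The panellists did not describe their tool's internals, but the idea of pairing every recommendation with a human-readable reason can be illustrated with a minimal sketch. Everything below (the greedy earliest-availability rule, the `Crew` structure, the job tuples) is an assumption for illustration, not Consumers Energy's actual method:

```python
from dataclasses import dataclass

@dataclass
class Crew:
    name: str
    skills: set
    available_at: int = 0  # hour the crew is next free

def recommend(jobs, crews):
    """Assign each job (name, required skill, duration) to the
    qualified crew that is free soonest, and attach a plain-language
    reason to every recommendation so a scheduler can sanity-check it."""
    plan = []
    for job_name, skill, duration in jobs:
        qualified = [c for c in crews if skill in c.skills]
        if not qualified:
            plan.append((job_name, None, f"no crew qualified for '{skill}'"))
            continue
        crew = min(qualified, key=lambda c: c.available_at)
        reason = (f"{crew.name} chosen: qualified for '{skill}' and "
                  f"free earliest (hour {crew.available_at})")
        plan.append((job_name, crew.name, reason))
        crew.available_at += duration  # crew is now booked
    return plan
```

The point of the sketch is the `reason` string: even a trivial heuristic becomes auditable when each output carries the facts that produced it, which is what makes the validation loop the panellists described possible.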
Distributech 2024 session panellists (l-r) Matthew Spencer, Manager of Analytics at Consumers Energy; Brad Steere, Director of Operational Analytics at Consumers Energy; Robert Kandell, Vice President of Data, NiSource; and Jackie Valentine, Partner, McKinsey and Company. [February, 2024, Transforma Insights]
The process described by the panellists had characteristics of an organization undergoing a digital transformation, which includes an awareness of the specific meaning attached to language in an organization. Notably, uncovering “What is really going on here?” at Consumers Energy involved key terms and phrases rich with ‘insider’ meaning and unique to the company's work culture. Understanding how these terms and phrases, called metonymies, operated in the specific context of scheduling work crews unlocked key themes in aligning AI technology with the workforce's needs. For example, stories based on the Consumers Energy phrase "the daily miracle" in the scheduling department illustrated the reliance on human effort in the face of daily emergent work orders and ad hoc scheduling systems.
Also, the metonymic term "rescheduler" referred to the fact that only about 15% of a work crew scheduler’s job involved scheduling. The majority of the time was spent rescheduling, i.e., managing changes to the schedule. (Note: Uncovering the role of metonymies in an organization often points to how trust and power move through and shape a work culture. In the case of work crew scheduling, both so-called "front stage" and "back stage" areas of the organization impacted work scheduling. Combining the AI data, human-centric stories, and cultural metonymies helped bring clarity to the project.)
Going forward, said Steere, using AI to schedule internal work crews would next extend to other operational areas, as well as to contractors. Both the data and AI process and people's success in actually using the tool were critical, noted Robert Kandell of NiSource.
AI use cases were prevalent at Distributech 2024, and the Consumers Energy use case was laudable for its integration across individual, organizational, and data contexts. However, the primary solutions to solving surging energy needs remain centered on building core infrastructure like additional nuclear power plants and massive solar farms. Thus, innovative efficiency-focused AI use cases are happening at the edges – crew scheduling, load scenario testing, pricing arbitrage, and so on.
To date, AI is a Distributech and industry interloper, albeit an increasingly prominent one. The upshot was that while AI dominated keynote themes and discourse, demonstrable AI at Distributech was often in the form of specific use cases, projects, and POCs. Moreover, as a backdrop to the lowborn AI use cases and highborn AI visions ran a steady sentiment that AI also represented a growing threat to security in the power and energy industry.
Infrastructure resiliency is complicated. Very complicated. Managing energy and power grids – especially the energy load on the grid – involves a mix of dynamic dependencies. To name a few: federated regional and national nodes; private and public entities; archaic technologies (e.g., some substations have been running for five-plus decades and no one really knows how to fix them anymore) alongside new system capacities (IoT monitoring and more); multiple energy sources (wind, solar, oil, coal, nuclear, etc.); mature and immature energy management capabilities (harvesting, transmission, batteries, etc.); and market forces (mandates and regulations, environmental factors, electric vehicles, working from home, etc.). On the one hand, such complex ecosystems present a rich target for AI’s ability to derive value through efficiency gains. On the other hand, that same complexity makes AI deployments particularly challenging.
In several AI use cases at Distributech, AI added value by simplifying the operational tangles that have arisen in these complex grids: rapid scenario testing of proposed energy loads, real-time scheduling of work crews in dynamic environments, and arbitrage optimization across commercial relationships with complex timing and pricing structures.
A keynote at Distributech by Hussein Shel of Amazon Web Services (AWS) helped shed light on next steps by AWS in the energy and power industry involving technological innovation, corporate collaboration, and the strategic implementation of AI/ML.
Shel noted that rapid advancements in AI and ML technologies and their applications are automating tasks, improving efficiency, and fostering innovation across different sectors. Shel highlighted tools like Amazon CodeWhisperer and AWS's custom silicon for training and inference. He also pointed to AWS Qube and its application in inventory management for BMW to streamline operations and business intelligence.
Partnerships are key to integrating the AI ecosystem with industry sectors, according to Shel, who noted how tech-giant partnerships like AWS and Accenture are deploying CodeWhisperer to improve developer productivity. Such alliances, said Shel, amplify AI's strengths in solving complex business challenges, enhancing productivity, and driving growth. By way of example, he pointed to virtual assistant and customer experience projects with Adidas and Delta employing AWS technologies. In short, partnerships and collaboration are important to AI projects.
Shel also pointed to AWS's role in providing an AI foundation, an infrastructure in support of AI solutions. The next step, customization of projects, was supported by Amazon Bedrock, a fully managed service that enables the infusion of AI models with company-specific data. Shel also touched on AWS guardrails to prevent AI misinformation and address ethical considerations in AI deployments. All told, Shel painted a picture of AWS AI services as interrelated strategic layers, from infrastructure to out-of-the-box applications. AWS’s positioning as an all-in-one grouping of AI layers perhaps reflects a recognition that AI’s initial foray into the energy and power field is best served by a managed service approach.
As keynote presenter Zack Kass, former Head of Go-to-Market at OpenAI, noted, AI is evolving rapidly, but the timeline for development towards artificial general intelligence (AGI) is anyone’s guess. Yet even as he listed three near-term constraints on AGI, including one directly tied to the power and energy field, he did make a guess: AGI would emerge sometime around the year 2030.
Three factors stand in the way, said Kass.
Firstly, achieving AGI is hampered by a lack of computational power. Despite investments in chip fabrication and efficiency, the existing infrastructure falls short of the demands of next-generation models. "We just don't have enough compute yet. We need to make more chips. We need to make them more efficient," according to Kass.
Secondly, the current energy grid is incapable of supporting the massive energy requirements needed for AGI operations. The solution may lie in advancing nuclear or fusion energy sources, but the transition poses its own set of challenges. AGI requires a massive scaling of power generation, said Kass: "The amount of energy required to run inference on the next generation of (AI) models is the equivalent to solar fields the size of Arizona, and Texas."
In Kass’s view, the third impediment to AGI is policy. For example, Kass said, the regulatory environment had historically hindered nuclear energy development in the United States, leading to an overreliance on coal. The development of fusion power, along with enhancements to the energy and power grid, all play a part, but they also underscore the broader issue of policy decisions facilitating or obstructing the path to AGI.