Highjoule
2026-04-29
At first, it seemed like it would be just another fleeting concept — like blockchain, Web3.0, or the metaverse. But after personally trying it out, ZhenLi discovered that the Tokens consumed by AI may be quietly transforming into future orders for the energy storage industry. From OpenClaw to energy storage power stations, a new industrial chain is silently taking shape.
Recently, a “crayfish craze” has swept the Chinese internet. OpenClaw — nicknamed “Little Crayfish” because its logo features a small shrimp — has gone viral. Testing a few automated tasks left a first impression of remarkable convenience: it can generate a complete PPT report automatically. But what followed was widespread grumbling about “Token consumption” — and rightly so. (Here, a Token is the smallest computational unit when AI processes text — think of it as the “currency” of the AI language world.)
From the test results, a moderately complex task can consume tens of thousands to over a hundred thousand Tokens. At first glance, this seems like a niche technical discussion within the AI community — but from a different angle, a clear industrial chain is taking form:
Token → Computing Power → Data Center → Electricity → Energy Storage
What appears virtual in the AI world ultimately lands on something very real — electricity.

OpenClaw is an AI Agent framework. Unlike traditional chat-based AI, it breaks tasks down into multiple steps: understanding the problem → searching for information → organizing data → multi-round reasoning → generating results. Each step may invoke a large language model (such as those from OpenAI, Anthropic, or DeepSeek). If one invocation consumes 3,000 input Tokens and 2,000 output Tokens, that’s 5,000 Tokens per call — and a complex task may invoke the model up to 10 times, ultimately consuming around 50,000 Tokens. Hence, many developers joke: “Agents aren’t working — they’re burning Tokens.”
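The arithmetic above can be sketched in a few lines of Python. The per-call figures (3,000 input and 2,000 output Tokens) and the call count (10) are the illustrative numbers from the example, not measurements of OpenClaw itself:

```python
# Back-of-envelope Token accounting for a multi-step agent task.
# All figures are illustrative, taken from the example in the text.

def task_tokens(input_tokens: int, output_tokens: int, calls: int) -> int:
    """Total Tokens for one task that invokes the model `calls` times."""
    return (input_tokens + output_tokens) * calls

per_call = 3_000 + 2_000                    # 5,000 Tokens per invocation
total = task_tokens(3_000, 2_000, calls=10)
print(per_call)  # 5000
print(total)     # 50000
```

The point is the multiplier: every extra reasoning round multiplies the whole per-call cost, which is why agent-style workloads burn Tokens far faster than single-turn chat.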
In large language model systems, text is broken into Tokens for processing. For example, the sentence:
Energy storage demand is rising
might be split into:
Energy / storage / demand / is / rising
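The split above can be illustrated with a naive whitespace tokenizer. Production models actually use subword tokenizers (such as BPE), so real Token counts differ; this sketch only shows why longer text means more Tokens:

```python
# Naive tokenization: split on whitespace.
# Real LLMs use subword tokenizers (e.g. BPE), so actual counts differ.

def naive_tokenize(text: str) -> list[str]:
    return text.split()

tokens = naive_tokenize("Energy storage demand is rising")
print(" / ".join(tokens))  # Energy / storage / demand / is / rising
print(len(tokens))         # 5
```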
Each Token generated consumes computational resources, so Token count directly reflects computing consumption. And the core hardware behind computing power is GPU clusters. As AI model scales keep growing, computing demands are expanding exponentially.
The electricity consumption of AI data centers comes primarily from GPU servers, networking equipment, storage systems, and cooling systems — with GPU servers consuming the most. The International Energy Agency projects that by around 2030, global data center electricity demand could exceed 1,000 TWh, equivalent to the annual electricity consumption of a mid-sized industrial nation.
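The link from Tokens to electricity can be made concrete with a rough estimate. The energy-per-token figure below is an assumed illustrative parameter, not a measured or published value:

```python
# Rough tokens-to-electricity estimate.
# WH_PER_1K_TOKENS is an assumed illustrative parameter, not a measurement.
WH_PER_1K_TOKENS = 0.3  # assumed inference energy per 1,000 Tokens, in Wh

def daily_energy_kwh(tokens_per_day: float) -> float:
    """Electricity demand in kWh/day implied by a daily Token throughput."""
    return tokens_per_day / 1_000 * WH_PER_1K_TOKENS / 1_000

# e.g. a service handling 1 trillion Tokens per day:
print(daily_energy_kwh(1e12))  # 300000.0 kWh/day under these assumptions
```

Even under modest per-token assumptions, trillion-Token daily workloads translate into hundreds of megawatt-hours per day — which is why the chain from Tokens to electricity is not merely rhetorical.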
Over the past few years, tech companies have been signing long-term power purchase agreements, investing in wind and solar, and participating in electricity markets to secure computing capacity. Microsoft, Google, Amazon, Alibaba, Tencent, and other giants are building energy supply chains alongside their data centers. The reason is simple: computing growth inevitably drives increased electricity demand, and without a stable power supply, even the most advanced AI models cannot function.
As data centers continue to scale up, the power grid faces two major challenges: power stability and price volatility. Energy storage systems are well-positioned to address both: smoothing load fluctuations, providing backup power, enabling electricity arbitrage, and relieving grid pressure. In newly built computing parks, energy storage is being co-planned alongside data centers, forming an infrastructure combination of “Data Center + New Energy + Energy Storage.” At this point, energy storage is no longer merely a grid asset — it has become an essential component of computing infrastructure.
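The electricity-arbitrage function mentioned above can be sketched in miniature: charge the battery in the cheapest hour and discharge in the most expensive one. The prices, battery capacity, and efficiency below are made-up illustrative numbers:

```python
# Minimal sketch of electricity price arbitrage with a battery.
# Prices, capacity, and efficiency are illustrative, not real market data.

prices = [0.25, 0.18, 0.15, 0.30, 0.55, 0.48]  # $/kWh by hour (illustrative)
capacity_kwh = 1_000
efficiency = 0.90  # round-trip efficiency

buy = min(prices)   # charge in the cheapest hour
sell = max(prices)  # discharge in the most expensive hour
profit = capacity_kwh * (sell * efficiency - buy)
print(round(profit, 2))  # 345.0
```

The same battery that earns arbitrage revenue also smooths the data center's load profile, which is why storage can be planned as part of the computing park rather than as a separate grid asset.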
Taken together, the chain is clear: AI applications → Token consumption → computing demand → data center construction → electricity demand growth → energy storage deployment. AI is not only a revolution in the software industry — from an energy perspective, it is driving the development of the power and energy storage sectors.
The viral spread of OpenClaw gave people a visceral sense of the computational costs behind AI tasks for the first time. From a broader perspective, the more Tokens AI consumes, the greater the computing demand, and the larger the electricity requirement. Within the power system, the key tool for regulating fluctuations and stabilizing supply is energy storage.
Perhaps many still see AI and energy storage as two entirely separate industries today. But viewed through the lens of the industrial chain, their connection is growing ever closer. In the future, when the energy storage industry discusses market growth, the driving force will come not only from new-energy installation capacity, but also from the AI models continuously consuming Tokens in the cloud.