As AI costs rise, China is building shared computing platforms that allow businesses to access AI power like electricity or cloud utilities.
When Chinese AI entrepreneur Li Zhong talks about the biggest challenge facing startups today, he does not mention talent or investment.
“It’s computing power,” he said. “Sometimes we have customers waiting, but we still don’t dare take the orders.”
His experience reflects a broader issue in China’s AI sector.
AI applications are spreading rapidly across manufacturing, logistics, finance and customer service. At the same time, the cost of computing power continues to rise. For many small and medium-sized enterprises (SMEs), access to AI infrastructure has become difficult and expensive.
Chinese officials recently revealed how quickly demand is growing.
At the 2026 China Mobile Cloud Conference in Suzhou, authorities said China’s average daily AI token calls had exceeded 140 trillion by the end of March. That figure was more than 1,000 times higher than at the beginning of 2024.
Tokens are the basic units AI models use to process and generate information. The increase reflects the growing use of AI across industries in China.
However, the rapid growth is also increasing pressure on computing resources.
Who can afford the computing power needed to run AI at scale? China is now trying to answer that question with a new approach.
Instead of treating computing power as a resource controlled mainly by large technology companies, policymakers want to make it more flexible, shareable and accessible. Increasingly, officials describe computing power as a type of public infrastructure, similar to electricity, water or transportation networks.
Under a recent national initiative aimed at expanding inclusive access to computing resources, Chinese authorities highlighted new models such as “compute banks” and “compute supermarkets.”
The idea is to let companies access computing power on demand rather than build their own infrastructure.
China’s Growing Demand for AI Computing Power
China’s AI industry is evolving quickly.
In the past, many AI tools focused mainly on chatbots and content generation. Now, more companies are developing AI agents that can independently complete tasks, analyse data and manage workflows. These systems require significantly more computing power.
Industry estimates suggest a mature AI agent may use dozens of times more tokens per day than a traditional chatbot.
As a result, computing demand is rising across sectors.
Guo Liang, chief engineer at the Cloud Computing & Big Data Research Institute under the China Academy of Information and Communications Technology (CAICT), said many SMEs face three major problems. First, they struggle to secure stable access to high-performance computing resources. Second, rising cloud service costs are putting pressure on company finances. Third, many smaller firms lack the technical teams needed to fully use AI infrastructure. For AI startups that depend heavily on public cloud services, the pressure can be severe.
Analysts estimate that computing expenses can account for 30 to 50 per cent of operating costs for some AI companies. If cloud service prices rise sharply, profit margins can shrink quickly.
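A rough back-of-the-envelope calculation shows how sharply margins can move when compute is that large a share of costs. The revenue and cost figures below are hypothetical, chosen only to sit within the 30 to 50 per cent range cited above:

```python
# Hypothetical illustration: how a cloud price rise squeezes an AI startup's
# margin when computing is a large share of operating costs.
revenue = 10_000_000          # annual revenue (hypothetical, in yuan)
operating_costs = 8_000_000   # annual operating costs (hypothetical)
compute_share = 0.40          # compute at 40% of costs, mid-range of 30-50%

compute_cost = operating_costs * compute_share
other_costs = operating_costs - compute_cost

margin_before = (revenue - operating_costs) / revenue

# Suppose cloud prices rise 25%: only the compute portion of costs grows.
compute_cost_after = compute_cost * 1.25
margin_after = (revenue - (compute_cost_after + other_costs)) / revenue

print(f"margin before price rise: {margin_before:.0%}")            # 20%
print(f"margin after 25% compute price rise: {margin_after:.0%}")  # 12%
```

In this illustrative case, a 25 per cent rise in cloud prices cuts the operating margin from 20 per cent to 12 per cent, which is the dynamic analysts are pointing to.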
At the same time, China’s computing resources remain unevenly distributed. Some data centres operate at full capacity, while others sit partially idle during certain periods. Researchers increasingly argue that China’s problem is no longer just about building more computing centres. It is also about improving how existing resources are allocated.
That idea sits at the centre of China’s new “computing power bank” model.
Turning Idle GPUs into Shared Resources
A “computing power bank” works like a shared resource platform.
Companies, universities and research institutes can contribute unused GPUs or server capacity during off-peak periods. Later, when they need additional computing power, they can draw resources from the platform.
Guo Liang described the model as a type of “piggy bank” for computing resources. The system does not operate like a traditional financial bank. Participants do not receive interest payments. Instead, platforms may offer credits, subsidies or revenue-sharing based on the amount of computing power contributed.
The model fits institutions whose demand changes over time. For example, some university laboratories leave GPU clusters underused at night or during school breaks. Meanwhile, industries such as autonomous driving, robotics training and industrial simulation often need short bursts of intensive computing power. A unified scheduling platform can connect those different needs.
Industry observers say the model could help smaller companies avoid the huge upfront costs of building dedicated AI infrastructure. It also reflects a broader shift in China’s digital economy strategy. Instead of treating GPUs as fixed corporate assets, policymakers increasingly view computing power as a flexible production resource that can move across industries, regions and time periods.
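The deposit-and-withdraw mechanics described above can be sketched as a simple credit ledger. All class and method names here are illustrative, not taken from any real platform, and real systems may layer subsidies or revenue sharing on top rather than simple one-to-one credits:

```python
# Illustrative sketch of a "compute bank" credit ledger: participants deposit
# idle GPU-hours during off-peak periods and draw them back later.
# All names are hypothetical; this is not a real platform's API.

class ComputeBank:
    def __init__(self):
        self.credits = {}   # participant -> GPU-hour credit balance
        self.pool = 0.0     # GPU-hours currently available to lend

    def deposit(self, participant: str, gpu_hours: float) -> None:
        """Contribute idle capacity; the participant earns matching credits."""
        self.credits[participant] = self.credits.get(participant, 0.0) + gpu_hours
        self.pool += gpu_hours

    def withdraw(self, participant: str, gpu_hours: float) -> bool:
        """Draw computing power against earned credits, if the pool allows."""
        balance = self.credits.get(participant, 0.0)
        if gpu_hours > balance or gpu_hours > self.pool:
            return False    # insufficient credits or pool capacity
        self.credits[participant] = balance - gpu_hours
        self.pool -= gpu_hours
        return True

bank = ComputeBank()
bank.deposit("university_lab", 500.0)  # idle cluster time over a school break
bank.deposit("robotics_firm", 120.0)
ok = bank.withdraw("robotics_firm", 300.0)  # fails: only 120 credits earned
print(ok, bank.pool)  # False 620.0
```

Even this toy version captures the scheduling problem Guo Liang describes: the platform must match deposits from institutions with slack capacity against withdrawals from users with bursty demand.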
Shopping for Computing Power Online
While “computing power banks” focus on resource coordination, “computing power supermarkets” target user experience.
The idea resembles an online marketplace for computing services. Platforms package computing resources into standardised products. Businesses can then choose different levels of service based on their needs. Some services charge by GPU-hour, while others use token-based pricing or subscription plans.
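The three pricing schemes can be compared with simple cost functions. The rates below are invented purely for illustration and are not actual marketplace prices:

```python
# Hypothetical comparison of the three pricing schemes mentioned above.
# All rates are invented for illustration only.

def gpu_hour_cost(hours: float, rate_per_hour: float = 12.0) -> float:
    """Pay-per-use: charged for each GPU-hour consumed."""
    return hours * rate_per_hour

def token_cost(tokens: int, rate_per_million: float = 2.0) -> float:
    """Token-based: charged per million tokens processed."""
    return tokens / 1_000_000 * rate_per_million

def subscription_cost(months: int, monthly_fee: float = 5_000.0) -> float:
    """Flat subscription: predictable cost regardless of usage."""
    return months * monthly_fee

# A small firm can price its expected monthly workload under each plan
# before committing to one.
print(gpu_hour_cost(300))          # 3600.0
print(token_cost(800_000_000))     # 1600.0
print(subscription_cost(1))        # 5000.0
```

Which plan is cheapest depends entirely on the workload profile, which is why standardised, comparable products matter for smaller buyers.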
For many SMEs, the model lowers the barrier to AI adoption.
“Traditional AI infrastructure often requires large upfront investment and long-term commitments,” said Zhu Keli, founding director of the China Institute of New Economy. “Now the industry is gradually shifting toward subscription-style computing services.”
Pilot projects are already emerging across China. Authorities in Xiong’an New Area recently launched a dedicated computing platform for SMEs, while Ningxia has started linking western computing resources with demand from eastern provinces through integrated scheduling systems. In Hangzhou, a computing resource trading platform has reportedly handled more than 200 million yuan (about £22 million) in transactions.
Some manufacturers are already benefiting. In Kunshan, Jiangsu Province, a small auto-parts factory recently rented edge computing services to support around 100 AI-powered quality inspection robots operating 24 hours a day. Compared with building its own computing centre, the company reportedly reduced costs by about 30 per cent. It could also adjust computing capacity based on changing order volume.
That flexibility is increasingly important for Chinese manufacturers. Factories across the country are testing AI systems in quality inspection, warehouse management, predictive maintenance and supply chain coordination. Yet many smaller firms lack the capital to build dedicated AI infrastructure. Shared computing platforms offer another path.
Challenges Remain
Despite growing momentum, China’s shared computing models remain at an early stage.
Many projects still rely heavily on government support. Several local governments have introduced subsidies, sometimes referred to as “computing power vouchers”, to help reduce AI infrastructure costs for startups and SMEs. Questions remain over whether these platforms can achieve long-term commercial viability.
Technical challenges also persist. Cross-regional scheduling requires fast network infrastructure, common technical standards and strong interoperability between different hardware systems. Data security and coordination across regions remain ongoing concerns.
As AI becomes more deeply embedded in the economy, computing power is increasingly being treated as a tradable utility rather than a fixed asset. China is experimenting with models that allow companies to access AI infrastructure on demand instead of owning it outright.
The model is still evolving, and its long-term scalability remains uncertain.
If you liked this article, why not read: How AI Is Rewiring China’s Massive Logistics System