April 30, 2026

On April 24, the global semiconductor industry saw a rare, synchronized, and substantial move. Before the U.S. market opened, Intel's shares surged nearly 30% in pre-market trading, buoyed by positive signals from its latest earnings report. In the A-share market, Hygon Information, the leading domestic CPU maker, also rallied, closing up 8.20%.
This was no mere coincidence. In contrast to the GPU-centered computing power trend of the past two years, this round of market attention is squarely focused on CPU leaders. The market has begun to revisit a critical question: is the growth of AI computing power still simply a story of 'more GPUs'?
01 A Fundamental Shift in AI Computing Power Logic
For a long time, the investment logic in the AI sector was straightforward: the larger the model, the more computing power required, making GPUs the natural focal point. Entering 2025, however, the industry's emphasis began to shift.
In its latest earnings announcement, Intel argued that the key challenge in AI infrastructure is shifting from supplying raw computing power to orchestrating it efficiently. Rather than simply expanding capacity, companies now care more about system scheduling, resource utilization, and the ability to support more applications on the same hardware.
This transformation is underpinned by a shift in the stage of AI applications. While training remains crucial, the deployment of inference, intelligent agents, and industry-specific applications is emerging as the new mainstream. These scenarios prioritize stability, efficiency, and cost control over raw computing power.
Consequently, the significance of CPUs is being re-emphasized. A research report released earlier by CITIC Securities noted that the market had previously 'underestimated' the role of CPUs in the AI era. As AI application formats evolve, CPU demand is reviving, prompting a reevaluation of their value.
02 CPUs Reclaim the Spotlight
In contemporary AI systems, GPUs still shoulder the primary computational tasks, but system operational efficiency increasingly hinges on how computing power is organized.
Intel highlighted that as AI system complexity escalates, the role of CPUs in task scheduling and data flow management is intensifying. This has gradually become an industry consensus, as inadequate CPU performance can directly become a system bottleneck when computing power clusters expand to tens of thousands of cards.
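The bottleneck claim can be made concrete with a toy model (not from the article; every number below is an assumption): if a single CPU dispatch thread issues work serially to all accelerators, the time to complete one dispatch round grows with cluster size, and once it exceeds the per-task compute time, accelerators begin to idle.

```python
# Illustrative model of CPU-side dispatch capping accelerator utilization.
# All figures (task time, per-dispatch overhead) are assumed for illustration.

def cluster_utilization(num_gpus: int,
                        gpu_task_ms: float = 50.0,
                        cpu_dispatch_ms: float = 0.2) -> float:
    """Fraction of time each GPU is busy when one CPU thread
    dispatches tasks to all GPUs serially."""
    # Time for the CPU to issue one round of tasks to every GPU.
    dispatch_round = num_gpus * cpu_dispatch_ms
    # A GPU is busy gpu_task_ms out of every full cycle, whichever is longer.
    period = max(gpu_task_ms, dispatch_round)
    return gpu_task_ms / period

for n in (8, 256, 1024):
    print(n, round(cluster_utilization(n), 3))
```

With these assumed numbers, 8 GPUs stay fully busy, but at 1,024 GPUs the dispatch round dominates and utilization collapses, which is the qualitative point about scheduling becoming the limiter at scale.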
Analysts from CITIC Securities more explicitly pointed out that in AI clusters, CPUs not only handle control and scheduling functions but also serve as vital providers of shared memory to alleviate memory constraints during large-model inference. This implies that CPUs influence not only whether the system 'can run' but also 'how efficiently it can run'.
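The memory-pressure point can be illustrated with back-of-envelope KV-cache arithmetic (the model shape and settings below are assumptions loosely modeled on a 7B-class transformer, not figures from the report): at long context lengths the inference cache alone can exceed a single accelerator's on-board memory, which is where CPU-attached DRAM comes in.

```python
# Rough KV-cache sizing for transformer inference. Assumed shapes:
# 32 layers, 32 KV heads, head dimension 128, fp16 (2 bytes/element).

def kv_cache_gib(layers: int, kv_heads: int, head_dim: int,
                 seq_len: int, batch: int, bytes_per: int = 2) -> float:
    # 2 tensors (K and V) cached per layer.
    elems = 2 * layers * kv_heads * head_dim * seq_len * batch
    return elems * bytes_per / 2**30

# 32k context, batch of 8 concurrent requests.
print(round(kv_cache_gib(32, 32, 128, seq_len=32768, batch=8), 1))
```

Under these assumptions the cache alone reaches 128 GiB, more than a typical 80 GB accelerator holds, so spilling it into the larger memory pool behind the CPU is a natural relief valve.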
Simultaneously, the role of CPUs is extending to more practical scenarios, such as serving as the control core in AI clusters, participating in small-model inference and cloud resource reuse, and managing motion control and execution logic in embodied intelligence. The commonality among these scenarios is that CPUs are 'ascending' in the AI system hierarchy, reclaiming the spotlight.
03 Demand Reshapes Valuations
From an industry standpoint, the shift in CPU demand is not attributable to a single market but to changes in how computing power is utilized.
In inference scenarios, some lightweight models can already operate on CPUs. Research data from CITIC Securities reveals that under single-user conditions, 7B to 14B models demonstrate practical inference performance on high-end CPUs. Meanwhile, CPU utilization in cloud computing systems has long been suboptimal, and these resources are now being reallocated for AI computing.
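Why CPU inference of 7B-class models is practical can be sketched with a bandwidth-bound estimate (all inputs are assumptions, not measurements from the report): token-by-token decoding streams the model weights from memory once per token, so an upper bound on decode speed is memory bandwidth divided by the model's weight footprint.

```python
# Bandwidth-bound upper-bound estimate for CPU decode throughput.
# Assumed inputs: 7B parameters, 4-bit quantized weights (~0.5 bytes each),
# a high-end server CPU with roughly 300 GB/s of memory bandwidth.

def decode_tokens_per_s(params_b: float, bytes_per_weight: float,
                        mem_bw_gb_s: float) -> float:
    # Each generated token reads every weight from memory once.
    bytes_per_token = params_b * 1e9 * bytes_per_weight
    return mem_bw_gb_s * 1e9 / bytes_per_token

print(round(decode_tokens_per_s(7, 0.5, 300), 1))
```

Real throughput is lower once compute and cache effects are included, but the estimate shows why a single-user 7B–14B workload lands in a usable range on server CPUs.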
In the realm of embodied intelligence, a division of labor between CPUs and GPUs has become an industry norm. Many robot manufacturers adopt a 'CPU + GPU' combination, where CPUs manage real-time control and communication functions, forming the bedrock for stable system operation.
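The division of labor described above can be sketched as a toy control architecture (an assumed design for illustration, not any specific vendor's stack): the CPU runs a fixed-rate control loop that always acts on the latest available perception result, while a slower "GPU" perception worker updates that result asynchronously, so control never blocks on the model.

```python
# Toy CPU control loop + asynchronous perception worker. The sleep values
# stand in for model latency and control period and are assumptions.
import threading
import time

latest_plan = {"step": 0}   # written by perception, read by control
lock = threading.Lock()

def perception_worker(frames: int) -> None:
    """Slow 'GPU' side: periodically publishes a fresh plan."""
    for i in range(1, frames + 1):
        time.sleep(0.01)            # stand-in for model inference latency
        with lock:
            latest_plan["step"] = i

def control_loop(ticks: int) -> list:
    """Fast CPU side: fixed-rate loop using whatever plan is newest."""
    trace = []
    for _ in range(ticks):
        with lock:                  # brief read; never waits on perception
            trace.append(latest_plan["step"])
        time.sleep(0.002)           # fast control tick
    return trace

worker = threading.Thread(target=perception_worker, args=(5,))
worker.start()
trace = control_loop(40)
worker.join()
print(len(trace))
```

The control loop completes all of its ticks regardless of how slowly perception runs, which is the stability property the 'CPU + GPU' split is meant to guarantee.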
Taken together, these changes mean CPUs are no longer mere 'support components' but fundamental units directly involved in AI system operation.
04 Ecosystem Determines Certainty
Despite the rising demand for CPUs, industry benefits will not be evenly distributed. A key limiting factor is the strong reliance of enterprise computing systems on software ecosystems. CITIC Securities' research report highlights that despite the increasing market share of architectures like ARM in the server market, x86 still maintains a significant presence in AI systems due to its ecosystem compatibility advantages.
Industry insiders noted that this directly affects vendors' actual ability to capitalize on the trend. CPU solutions that integrate seamlessly with existing software systems and reduce migration costs are more likely to penetrate real business systems. Among current domestic vendors, Hygon Information stands out as a clear candidate.
Relevant data indicates that, based on its C86 ecosystem compatibility, Hygon Information has achieved sustained market penetration in key industries such as finance, telecommunications, and energy.
These insiders believe that following the changes in the AI computing power structure, the increased importance of CPUs in systems has also expanded the benefit space for such vendors. From an industry evolution perspective, this round of changes is more akin to a redistribution of infrastructure rather than a simple replacement. In this process, demand will concentrate on vendors with ecosystem integration capabilities.
Overall, the current strength in the CPU sector appears to be an inevitable outcome of changes in AI computing power logic. As the industry shifts from 'increasing computing power' to 'enhancing computing power utilization efficiency,' CPUs are returning to the system's core, and their value will undergo a reassessment. This will not diminish the significance of GPUs but will alter the value distribution across the entire industrial chain.
In this process, the real beneficiaries will not only be chip companies but also infrastructure vendors capable of penetrating core system pathways and meeting real business demands. This is the more noteworthy aspect of the CPU value reassessment in the current AI computing power system.