December 11, 2025

In early December, two leading domestic universities rolled out specialized programs focused on embodied AI. This academic-industry collaboration has ignited imaginations about a promising future for China's homegrown embodied AI technology.
On December 8, Zhiyuan Robotics announced the mass production of 5,000 units. Throughout 2025, China's embodied AI sector has witnessed rapid development, from its debut on the Spring Festival Gala stage to its integration into top academic institutions by year-end. The number of participants in the embodied AI field continues to surge, with increasingly diverse interpretations of the technology.
Some perceive embodied AI as the physical manifestation of artificial intelligence, while others view it as a new paradigm for interaction. Although a universally accepted definition remains elusive, one consensus prevails: embodied AI is significant for everyone.
01
What Are the Limitations of Embodied AI?
In the era of large models, every industrial system deserves reinvention.
Optimists pose the question, "What can't embodied AI accomplish?"
Onlookers observe, "Embodied AI can fold clothes, handle logistics, play soccer, draw crowds, and more..."
However, a significant gap persists between vibrant demonstrations and real-world applications. To secure funding, some companies exhaust their "imagination" in demos. A demo may showcase a particular capability, but different demos do not necessarily reflect distinct algorithmic strengths. A trend of "horizontal expansion" in demos has emerged, in which varied "tricks" are essentially the same capability rearranged to create an illusion of prosperity. Behind it lies the tension between expansion and technological accumulation: an excessive focus on technological depth means high investment and slow results, and with them the risk of being left behind.
Demos are abundant, yet as December arrives, embodied AI has not yet achieved widespread adoption. The first reason is that its capabilities have not yet reached the threshold for large-scale deployment. Assessing the capabilities of embodied AI requires evaluating success rates, speed, cost, and reliability in simple tasks.
While some embodied AI systems perform reliably in highly structured lab environments, with success rates exceeding 80%, their performance drops significantly in real-world settings. Moreover, even if individual task success rates improve, long-chain tasks involve multiplying probabilities, inevitably yielding lower overall success rates.
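To make that arithmetic concrete, here is a minimal sketch in Python; the per-step rate and step count are illustrative assumptions, not measured figures.

```python
# Minimal sketch (illustrative numbers, not measured data): how per-step
# success rates compound across a long-chain task.

def chain_success_rate(step_rates):
    """Overall success rate when every step in the chain must succeed."""
    overall = 1.0
    for rate in step_rates:
        overall *= rate
    return overall

# A 10-step task in which each step succeeds 90% of the time in isolation
# still completes end-to-end only about a third of the time.
print(f"{chain_success_rate([0.90] * 10):.2%}")  # ~34.87%
```

A chain of ten steps at 90% each completes end to end only about 35% of the time, which is why strong per-step numbers alone say little about deployment readiness.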
For embodied AI to achieve true adoption, it must identify suitable scenarios while improving success rates. The timeline remains uncertain, but one certainty is that the growth of embodied AI's scale does not correlate absolutely with the proliferation of demos. Like the boy who cried wolf, people have been hearing as far back as 2015 that "this will be the first year of embodied AI."
02
The ChatGPT Moment for Embodied AI
For embodied AI to achieve a ChatGPT moment—where it gains widespread real-world adoption—it must identify compelling applications. While capital rushes to explore use cases, companies must still determine how and where to apply the technology. On December 3, Tesla showcased a video of a robot running. Yes, embodied AI can now run—but in what scenarios do humans need robots to replace them in running?
The industry currently identifies three primary future scenarios for the deployment of embodied AI: commercial services, industrial settings, and households. The likely order of adoption will be commercial services first, followed by industrial applications, and finally, household integration.
This sequence stems from embodied AI's need for extensive data training to build a world model, enabling it to think and predict next steps. However, like a chicken-and-egg dilemma, the lack of opportunities to collect data in real-world scenarios hinders rapid model development. Commercial settings, particularly hotels, offer more controlled environments for training. From a value perspective, delivery robots effectively reduce labor costs.
Industrial scenarios demand high efficiency, making automation replacement a critical threshold. Even if robots can replicate industrial operations perfectly, their speed often lags behind human hands. For users, paying for slower "human resources" is a losing proposition. Technologically, the fragmented nature of industrial settings complicates data collection, making it difficult to achieve scale and overcome cost and efficiency barriers.
Finally, household services present two extreme perspectives. If expectations are limited to companionship and conversation, embodied AI could quickly enter the consumer market. After all, human-machine interaction is nothing new, from smart homes to AI toys. However, if embodied AI aims to become a true "family member," it faces safety and cost challenges. Definitions of "family member" often involve medical and elderly care scenarios, where safety considerations become even more critical.
The path to widespread adoption of embodied AI will likely progress from specialized to generalized capabilities. Initially, stable execution of single-scenario, single-task operations may occur, followed by single-scenario, multi-task execution, and eventually, multi-scenario, multi-task stability.
The development of embodied AI also requires industry consensus, such as standardized benchmark tests; robot athletic competitions alone cannot fully reveal the true capabilities of embodied AI. Progress in this area demands collaboration among industry, academia, and research institutes. Beyond Tsinghua University and Shanghai Jiao Tong University, which have already announced new embodied AI programs, several other domestic universities are applying to establish similar majors.
03
Prosperity and Anxiety Surrounding Embodied AI
For millennia, humanity has dreamed of creating artificial beings capable of performing tasks that require human intelligence and skill.
In Homer’s Iliad, Hephaestus, the god of blacksmiths and sculptors, crafts metal robots and golden servants to assist with chores. Aristotle envisioned automated tools rendering labor unnecessary. The Liezi recounts how the craftsman Yan Shi created a lifelike, singing, dancing, and emotional “mechanical puppet” for King Mu of Zhou. In Gulliver’s Travels, a mechanical device enables even the most uninformed person to write books on philosophy, poetry, politics, law, mathematics, and theology without innate talent or study.
Historically, visions of embodied AI centered on replacing humans in boring, repetitive, and low-value tasks. Yet, concerns persist about these systems evolving into “masters” of humanity, as evidenced by debates in 2025 over AI’s impact on the workforce. Thus, the future of embodied AI may lie not in replacing repetitive work but in enabling humans to delegate dangerous tasks.
Despite anxieties about the future, embodied AI has driven prosperity across multiple industries. For the chip sector, a wave of manufacturers has found new growth opportunities.
On the edge computing front, several domestic chipmakers have launched embodied AI products. Geehy Semiconductor’s G32R501 real-time control MCU meets the high computing power, efficiency, and precision demands of embodied robots in perception, decision-making, motion control, and human-machine interaction. Its “MCU+Driver+IPM” full-stack motor-specific chip, paired with Geehy’s proprietary motor algorithm platform, applies to core scenarios like robot joints, industrial encoders, and frameless torque motors, forming the “nerve center” of embodied AI.
National Technology’s N32H7 series MCU, with its multi-core heterogeneous architecture and ultra-high clock speed, delivers powerful computing and real-time responsiveness for complex control and synchronization in humanoid robots. Its built-in CORDIC coprocessor efficiently handles mathematical calculations like trigonometric and coordinate transformations, significantly reducing CPU load.
Allwinner Technology’s MR series robot chip, fabricated on a 12nm process, integrates a CPU+GPU+NPU heterogeneous architecture with 3-4 TOPS of computing power and just 5W power consumption. It supports millisecond-level responses and provides core computing for motion control and environmental perception in products like Xiaomi’s CyberDog and Unitree’s robot series, at one-third the cost of NVIDIA’s Jetson Nano.
Rockchip’s RK3588 features an octa-core 64-bit ARM architecture, combining four high-performance Cortex-A76 cores (2.4GHz) and four energy-efficient Cortex-A55 cores (2.0GHz), excelling in multitasking and complex computing. Its 6 TOPS NPU supports various data types and mainstream deep learning frameworks for efficient image recognition and voice interaction. Industry sources reveal Rockchip has shipped over 10,000 units for embodied AI applications.
Biwin Storage has introduced eMMC, UFS, BGA SSD, LPDDR4X/5/5X, and other products tailored for embodied AI, actively engaging top clients in the field. According to third-party teardown reports, Unitree’s Go2 robot dog incorporates Biwin’s LPDDR4X and eMMC storage solutions.
On the computing power front, Intel and NVIDIA remain dominant players in robotics. As previously noted, embodied AI’s VLA (Vision-Language-Action) models require world model construction, driving demand for computing resources. Intel’s heterogeneous systems, combining GPU+NPU+CPU, meet diverse workload requirements for motion control and AI inference, enabling VLA model operation.
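To illustrate what that division of labor can look like, below is a rough conceptual sketch of a VLA-style control loop. Every type and function name is a hypothetical placeholder rather than any vendor's actual API, and the bodies are stubs that only mark where each workload would run.

```python
# Conceptual sketch of a VLA (Vision-Language-Action) control loop.
# All names are hypothetical placeholders, not a real vendor API; the point
# is the division of labor across heterogeneous compute, not the models.
from dataclasses import dataclass

@dataclass
class Observation:
    image: bytes        # camera frame
    instruction: str    # natural-language task, e.g. "stack the boxes"

def encode_observation(obs: Observation) -> list[float]:
    """Vision + language encoding: the heavy AI-inference workload (GPU/NPU)."""
    return [0.0] * 128  # placeholder embedding

def predict_action(embedding: list[float]) -> list[float]:
    """World-model / policy step: predict the next joint targets."""
    return [0.0] * 7    # placeholder 7-DoF action

def execute_action(action: list[float]) -> None:
    """Real-time motion control, typically handled by an MCU or CPU core."""
    pass

def control_loop(observations) -> None:
    for obs in observations:
        embedding = encode_observation(obs)   # perception
        action = predict_action(embedding)    # decision
        execute_action(action)                # actuation
```

The sketch only marks where perception, decision-making, and actuation sit; in practice each stage maps onto the GPU/NPU/CPU mix described above.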
Beyond hardware, NVIDIA has launched the NVIDIA Cosmos platform to accelerate physical AI development. The platform integrates generative world foundation models (WFMs), tokenizers, guardrails, and efficient data-processing and management workflows, supporting world model training for physical AI systems such as autonomous vehicles and robots.
Due to overlaps in mechanical control and edge computing, chip suppliers for embodied AI and automotive systems share high similarity. The development trajectories of embodied AI and automobiles also exhibit parallels.
In 1885, Karl Benz built the first gasoline-powered internal combustion engine tricycle. Today, robots occupy a role akin to automobiles in 1900–1910: technological marvels yet to become societal infrastructure. While automobiles have since become ubiquitous, embodied AI's widespread adoption will take time—though not a century.
What is certain is that while the role of embodied robots remains undefined, their capabilities are far from lacking.