Meta Launches New Model Muse Spark: Transition from Open Source to Closed Source—Is This the Closed-Source Camp's Final Gambit?

04/10/2026

Open Source vs. Closed Source: Each Approach Has Its Strengths

Early this morning, Meta unveiled its latest innovation—Muse Spark, a generative model engineered for high efficiency and personalized capabilities, supporting multimodal understanding of both text and images. Designed for seamless integration, Muse Spark is set to be embedded directly into Meta's ecosystem, including Facebook, Instagram, WhatsApp, Messenger, and AI glasses, with future integration planned across all platforms.

What has truly caught the industry off guard, however, is Meta's decision to release Muse Spark as a closed-source model, a stark departure from its previous open-source strategy with Llama. This shift underscores the ongoing debate within the AI community over the merits of open-source versus closed-source models.

(Image source: Meta)

Over the past two years, the development of large AI models has been largely driven by scale, with parameters, training data, and computational power serving as the primary metrics of competition. Closed-source players like OpenAI and Anthropic have established a significant lead during this phase, as model training demands substantial financial investment, prompting these companies to adopt closed-source strategies to better position themselves for commercialization.

However, open-source models, exemplified by DeepSeek, Alibaba's Qwen, and Moonshot AI's Kimi, have made rapid strides in reasoning capabilities, cost control, and deployment flexibility, gradually closing the gap with their closed-source counterparts. The open-source approach has encouraged more enterprises to build their own models or deploy them privately, while developers are increasingly inclined to conduct secondary development based on open-source models, significantly reducing costs compared to closed-source alternatives.

(Image source: Meta)

As open-source models gain traction across the industry, Meta's decision to shift from open-source to closed-source at this critical juncture raises questions: Is open-source truly poised to dominate the industry? Can closed-source models enable Meta to recoup its substantial investments?

The Growing Threat of Open-Source Models to Closed-Source Models

Meta first established itself as a major player in the open-source arena with the release of its LLaMA series in 2023, which it gradually opened up to developers. This was the first time a large model with mainstream-level capabilities had been made available in open form.

However, it was DeepSeek that truly elevated open-source models to a level where they directly challenged OpenAI and Anthropic. In January 2025, DeepSeek-R1 was officially released, billed as a fully open-source model, complete with technical reports, model weights, and an MIT license. Even smaller distilled versions of the model were made open-source.

DeepSeek also shattered industry perceptions of cost, revealing that its base model was built in roughly two months for less than $6 million using a restricted version of NVIDIA's H800 chips, with the additional training cost for R1 put at approximately $294,000. With figures this low, DeepSeek triggered a stock market earthquake, wiping roughly $593 billion off NVIDIA's market value in a single day.

(Image source: DeepSeek)

Similarly, Alibaba's Qwen has transformed open-source into a comprehensive ecosystem, continuously updating models of varying sizes, modalities, and task orientations, enabling developers to immediately leverage the latest Qwen models for secondary development. According to official reports, the Qwen model family has accumulated 700 million downloads on Hugging Face, becoming one of the most widely used open-source AI models globally.

(Image source: Alibaba)

More importantly, Alibaba demonstrated that open-source models can foster ecosystems, be integrated by enterprises, and undergo secondary development by open-source communities for various purposes.

Moonshot AI's Kimi further solidified these capabilities in practical scenarios. Early versions of Kimi were known for strong contextual understanding, handling long documents and complex information reliably. In 2026, however, Moonshot AI open-sourced Kimi 2.5, strengthening its multimodality, tool invocation, and structured output while enabling many diverse tasks to be completed within a single workflow.

(Image source: Moonshot AI)

In just one year, the former advantages of closed-source models have come under serious challenge from these open-source alternatives. First, on capability: closed-source vendors previously dominated because users, lacking alternatives, accepted black-box systems and high prices. Now that open-source players like DeepSeek and Kimi offer comparable capabilities at lower prices, they have become an easy choice for enterprises and developers.

Second, in terms of product completeness, while OpenAI, Anthropic, and Google Gemini offer comprehensive model families covering Q&A, vision, and images, Qwen and Kimi have also begun to fill these gaps, acquiring more complete multimodal capabilities. This makes the high prices of closed-source models seem even less justified.

Additionally, closed-source vendors like Anthropic have long prided themselves on tight control over their models, but once enterprises had experimented with open-source models tailored to their own workflows, they grew less interested in closed-source offerings that provide only API access.

In essence, open-source models are gradually dismantling the "moats" of closed-source models—first technological monopolies, then pricing, and finally ecosystems. This is why Meta's sudden decision to position Muse Spark as a closed-source model has sparked controversy. After all, open-source models have gained mainstream acceptance, and Meta, as one of the pioneers of open-source models, could have continued leveraging this advantage. However, Meta had its reasons, as its open-source model Llama never delivered the expected returns.

What Are Closed-Source Models Holding Onto?

If open-source models have gradually caught up in capabilities, pricing, and ecosystems, why are OpenAI, Anthropic, Google, and even Meta—the company that initially championed open-source models—now tightening their grip on their models? The reasons are straightforward.

Compared to open-source players, closed-source vendors like OpenAI and Anthropic grasped the essence of commercialization earlier. Open-source models can certainly generate revenue, but the paths are more fragmented, running through cloud resources, hosting, toolchains, and enterprise customization, with the models themselves not directly earning income. Closed-source models have simpler revenue streams: APIs, subscriptions, and enterprise services. As long as the vendor delivers a complete, usable service, the payments follow.

Data shows that OpenAI's annualized revenue reached $10 billion by June 2025 and further surpassed $20 billion by January 2026. Anthropic's annualized revenue hit $3 billion in May 2025 and approached $7 billion by October 2025, with about 80% of its revenue coming from enterprise clients.

(Image source: CNBC)

Moreover, closed-source models still hold certain advantages in terms of product offerings. While open-source models have advanced rapidly and now possess many robust capabilities, they often serve as underlying components requiring deployment, modification, and integration into business systems by developers and enterprises before they become truly usable. In contrast, closed-source models offer a complete, ready-to-use service directly from the vendor, justifying their higher prices.

Therefore, what closed-source vendors are holding onto are their established product forms and unalterable entry points, coupled with a proven profitability model, making it unnecessary to abandon this approach lightly. Similarly, while Meta significantly advanced open-source models with Llama and gained support from many developers, its massive investments of hundreds of billions of dollars ultimately yielded little substantive return.

Furthermore, Muse Spark was always intended to draw on data from Meta's social platforms and applications, making it a model base tailored to Meta's own ecosystem. For such a model, Meta has even less incentive to open it up for developers and enterprises to modify and use freely.

Open Source or Closed Source: That Is the Question

In today's landscape, the debate between open-source and closed-source is no longer purely technical, as their capabilities are now largely on par. Newer models tend to have advantages in certain areas, regardless of whether they are open-source or closed-source.

Li Yanhong (Robin Li) was among the first in China to publicly support closed-source models. As early as 2024, he stated that the significance of open-source large models was overestimated, arguing that generative AI differs from traditional open-source software like Linux and Android. Ordinary users neither see nor care about how parameters change. In Li's view, closed-source models possess stronger commercialization capabilities, and only sustained profitability can ensure sufficient computational power and talent to continue advancing models.

(Image source: Artificial Intelligence Conference Forum/Baidu CEO Li Yanhong)

Conversely, Elon Musk has been more vocal in supporting open-source large models, though many see this chiefly as a tactic to needle OpenAI. He has nevertheless open-sourced Grok's models.

Grok's openness, however, follows its own logic: whenever a new version of Grok ships, Musk open-sources the previous one at an opportune moment, keeping the latest capabilities proprietary and releasing older models only after newer ones arrive. In essence, he has not abandoned closed-source logic; he has simply made "openness" part of his competitive strategy.

(Image source: Reuters)

Meanwhile, spurred by DeepSeek's debut, Baidu open-sourced its ERNIE 4.5 model in June 2025. Although the move seemingly contradicted Li Yanhong's earlier statements, open-sourcing did help ERNIE rapidly expand its reach and attract more developers.

Therefore, at Leikeji, we believe that open-source and closed-source models are not inherently opposed. Open-source models facilitate technological diffusion, attracting large numbers of developers to drive industry growth and making models more accessible as infrastructure for small and medium-sized enterprises. Closed-source models, on the other hand, are better suited for forming sustainable investment loops, bundling products, computational power, data, and responsibilities together. As Li Yanhong noted, only profitability ensures continued investment. Both models have their merits, stemming from different starting points and perspectives. For a long time to come, neither open-source nor closed-source models will completely displace the other, but time will reveal which direction holds greater advantages.


Source: Leikeji

Images in this article are from 123RF's licensed image library.

Disclaimer: The copyright of this article belongs to the original author. It is reprinted solely to share more information. If any author information is marked incorrectly, please contact us promptly so we can correct or remove it. Thank you.