Alibaba Group Holding’s open-source development strategy for Tongyi Qianwen has helped promote the commercialisation of this artificial intelligence model. Photo: Shutterstock

Alibaba strengthens commitment to open-source development of AI models amid debate over this strategy

  • Alibaba has its sights set on becoming ‘more aggressive in open-source [development]’ after the gains made by its Tongyi Qianwen AI model
  • This AI push could fuel further debate on whether China can continue relying on open-source development, instead of bolstering its own tech ecosystem
Alibaba Group Holding has strengthened its commitment to the open-source development of its large language model (LLM) – the deep-learning technology used to train generative artificial intelligence services like ChatGPT – several months after Tongyi Qianwen was made available to third-party developers.
The e-commerce giant could have been “more aggressive in open-source [development]” over the past year, given the gains made by Tongyi Qianwen, according to Lin Junyang, the Alibaba engineer in charge of the model’s open-source build-out, in a WeChat post published on Monday. Alibaba owns the South China Morning Post.
“We really feel the power of the open-source community,” Lin said in the post published by Alibaba’s open-source platform ModelScope. “After contributing to the community, the community has also given us a lot of feedback.”
Open source gives public access to a program’s source code, allowing third-party software developers to modify or share its design, fix bugs or scale up its capabilities. Open-source technologies have been a huge contributor to China’s flourishing tech industry over the past few decades.

[Video, 05:03: How does China’s AI stack up against ChatGPT?]
After Tongyi Qianwen’s launch in April last year, Alibaba’s cloud services unit, which is responsible for AI initiatives, open-sourced two smaller versions of its LLM trained with 7 billion “parameters” – a machine-learning term for the variables a model learns during training and then uses to infer new content.
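The “parameters” mentioned above are simply the learned weights and biases inside a model. As a rough illustration – the layer sizes below are hypothetical and have nothing to do with Tongyi Qianwen’s actual architecture – counting the parameters of a tiny fully connected network might look like this:

```python
def count_parameters(layer_sizes):
    """Count weights and biases in a fully connected network.

    Each pair of adjacent layers (n_in, n_out) contributes
    n_in * n_out weights plus n_out biases.
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out
    return total

# A toy network: 512 inputs -> 1024 hidden units -> 512 outputs.
print(count_parameters([512, 1024, 512]))  # 1050112
```

A model described as “7-billion-parameter” has roughly 7,000,000,000 such values, which is why training and serving LLMs demands so much memory and computing power.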
Alibaba Cloud last December opened access to the 72-billion-parameter and 1.8-billion-parameter versions of its LLM, and also made freely available another model that understands audio. These were made accessible via ModelScope and US open-source platform Hugging Face.
“After the 72B [version] was released, people felt that Tongyi Qianwen has risen to a higher level than before,” Lin said. He added that the next step was to “compare [Alibaba’s progress] with the top international players”, acknowledging that French start-up Mistral AI and Meta Platforms currently lead the global open-source community in terms of LLM development.
Alibaba’s latest AI push could fuel further debate on whether China can continue relying on the open-source community, instead of bolstering its own tech ecosystem, amid concerns that such collaborations would put the country at a disadvantage, especially if global tensions ratchet up further.


Still, Alibaba’s Lin defended the advantages of continued open-source AI development.

“Open-source [development] has helped promote the commercialisation of our large [language] models,” Lin said. He pointed out that “the developers [with access to Alibaba’s LLMs] are mainly the core engineers at our potential customers”.

In January, China approved this year’s first batch of LLMs. The latest approvals – a total of 14 LLMs and AI enterprise applications for commercial use – come after an initial batch of generative AI services was cleared for public release last August.


Baidu co-founder, chairman and chief executive Robin Li Yanhong recently said the company’s Ernie LLM would not be open-sourced, arguing that the strategy is not cost-effective and reiterating that there are already too many open-source AI models on the market.
Last November, Li described the repeated launch of various LLMs on the mainland as “a huge waste of resources” at the annual X-Lake Forum in Shenzhen.

More than 40 LLMs and related AI applications have received government approval on the mainland, but more than 200 China-developed LLMs are already on the market.


Alibaba’s Lin also echoed global tech industry concerns about the scarcity of advanced graphics processing units (GPUs), which provide the computing power in data centres for training the LLMs behind generative AI systems.

That echoed comments made by Alibaba co-founder and chairman Joe Tsai earlier this month in a podcast interview with Nicolai Tangen, chief executive of Norges Bank Investment Management – the branch of Norway’s central bank that is responsible for managing the world’s largest sovereign wealth fund.
Tsai said US export restrictions that bar Chinese companies’ access to advanced semiconductors, such as the GPUs from Nvidia, have “definitely affected” tech firms on the mainland, including Alibaba. Tsai pointed out that China’s tech companies are “possibly two years behind” the top AI firms in the US.