Alibaba AI reorganizes again
Less than a month after the last concentrated reorganization around Token Hub, Alibaba’s organizational structure for its AI business is undergoing another iteration.
On April 8, Alibaba Group CEO Eddie Wu released an internal letter announcing AI-related organizational adjustments, including the establishment of a new Group Technology Committee, upgrading the Tongyi large-model business division, and accelerating AI development.
According to the internal letter, Alibaba set up a Technology Committee at the Group level, with Eddie Wu serving as group leader. Members include Zhou Jingren, Wu Zeming, and Li Feifei. Among them, Zhou Jingren serves as the Chief AI Architect of the Technology Committee. Li Feifei is responsible for Alibaba Cloud technology and the construction of AI cloud infrastructure, and Wu Zeming is responsible for building the Group’s business technology platform and the AI inference platform.
Three people, three lines—pointing respectively to the model, the infrastructure, and the inference platform.
Notably, Alibaba’s traditional organizational structure emphasizes specialization within a business-unit (BU) framework. This time, however, the company has brought the people who can deliver on the future to one table, integrating key links that were previously scattered across cloud, business lines, and model teams.
An individual close to Alibaba told Wall Street Insights that in the past, the company was good at “stacking people, stacking resources, and stacking business matrices,” but in the era of large models, this approach no longer works. “Models need to be trained quickly, inference needs to be deployed quickly, businesses need to be reused quickly. If the organization becomes fragmented again, it will slow down the entire chain.”
Therefore, within the industry, the newly established Technology Committee is viewed as a decision-making hub. In which direction the model iterates, how to allocate computing resources, and how to build the inference platform—these will all be decided at this level.
One noteworthy detail is that in this round of adjustments, Wu Zeming stepped down as CEO of Taobao Flash Sale, and Lei Yanqun took over. Wu Zeming is an old hand at Alibaba and also the Group CTO. Having him step away from front-line business management to focus on building the Group’s technology platform and AI inference platform is, in itself, a signal: within Alibaba, the priority of AI infrastructure has already been elevated above day-to-day business operations.
A similar logic can be seen with Li Feifei as well. He has served as Alibaba Cloud CTO and is also responsible for building AI cloud infrastructure.
Alibaba Cloud is the “picks and shovels” side of Alibaba’s AI strategy: enterprises that want to use large models need computing power, inference services, and a model-calling platform. Li Feifei’s job is to keep that pipeline running smoothly.
As Chief AI Architect, Zhou Jingren simultaneously oversees the upgraded Tongyi large-model business division, carrying the most critical mission: keeping Alibaba’s models in the global first tier. The explosive performance of Qwen 3.6 Plus proves the viability of this route, but the large-model race has no endgame. OpenAI, Anthropic, and, domestically, ByteDance and Tencent will not stop and wait.
Pooling its strongest forces and resources and committing them to the most critical battleground shows that Alibaba has entered a state of all-out mobilization for AI. In fact, this is already Alibaba’s second major AI-focused organizational transformation in less than a month.
On March 16, Alibaba announced the establishment of the ATH business group (full name: Alibaba Token Hub), led directly by CEO Eddie Wu. It comprises Tongyi Laboratory, the MaaS business line, the Qianwen business division, the Wukong business division, and the AI Innovation business division, tying the complete chain of creating, delivering, and applying tokens together at the organizational level.
This is a judgment about the future business model: the core of large models is not capability, but consumption. Whoever can make tokens flow faster, more broadly, and more stably will control the future of the AI cloud.
In Alibaba Group’s recent earnings call, Eddie Wu said that since 2026 the company has seen some very clear trends: large models are starting to develop the capability to complete complex B2B workflows. As more and more enterprises internally adopt Agents driven by large models to complete end-to-end work tasks, the entire IT budget market for AI and cloud is undergoing fundamental change.
Eddie Wu said that when enterprises consume tokens, they no longer treat the spend as an IT budget item. Instead, they treat it as a production or R&D cost, part of their production inputs, and this is the most fundamental internal factor supporting long-term AI growth.
Facing the huge and long-term growth momentum of the AI market, Eddie Wu announced the commercial targets for Alibaba Group’s AI strategy. Over the next five years, including MaaS, annual revenue from cloud and AI commercialization is expected to exceed $100 billion.
“Regarding the goal that annual revenue from AI-and-cloud-related businesses will exceed $100 billion over the next five years—based on the current market growth space, and our existing business base and product foundation—the path to achieving this goal is highly visible.”
Of course, Alibaba is not the only company changing course to seize the opportunities of the era. During the same period in which Alibaba was adjusting intensively, Tencent was also reshaping its AI organizational structure.
On March 20, an internal Tencent notice disbanded AI Lab, merging some personnel into the Large Language Model department, which reports to Chief AI Scientist Yao Shunyu. Founded in 2016, AI Lab was one of Tencent’s earliest enterprise-level AI laboratories; its dissolution concentrates scattered AI R&D forces on the main line of the Hunyuan large model.
In mid-March, Tencent President Liu Chiping disclosed in a media exchange that over the past few months Tencent had carried out intensive team adjustments and workflow restructuring around AI.
He said that “quantifiable progress” will be shown in the next two to three quarters. Tencent’s new Hunyuan HY 3.0 is already in internal testing, with reportedly significant improvements in reasoning and Agent capabilities.
The moves by the two tech giants were almost synchronized, but the paths differed. Alibaba’s thinking is more “institutionalized”—building from scratch a complete business group centered on Token, with the CEO personally leading it and five business divisions advancing in parallel. Tencent’s thinking is more “intensive”—gathering the scattered AI R&D forces into a unified technical foundation, making Hunyuan the only foundation-model entry point.
The destination, however, is the same: both companies are eliminating internal AI silos and pouring resources in one direction.
This is no coincidence. The AI competition of 2026 has entered a new stage: the question is no longer the strategic one of whether to do AI, but the executional one of whether a company can push AI to the absolute limit. The ceiling of model capabilities is still rising rapidly, Agents are moving from concept to product, and enterprise demand is shifting from “let’s try it” to “full deployment.”
In this window of opportunity, whoever has the higher organizational efficiency and integrates resources faster will take the largest slice of the pie.