Chinese tech giants reveal how they’re dealing with U.S. chip curbs to stay in the AI race

Tencent and Baidu, two of China’s largest technology companies, revealed how they’re staying in the global artificial intelligence race even as the U.S. tightens curbs on key semiconductors.

The companies’ methods include stockpiling chips, making AI models more efficient and even using homegrown semiconductors.

While the administration of U.S. President Donald Trump scrapped one controversial Biden-era chip rule, it tightened export restrictions in April on some semiconductors from companies including Nvidia and AMD.

Big names in the sector addressed the issue during their latest earnings conference calls.

Martin Lau, president of Tencent — the operator of China’s biggest messaging app WeChat — said his company has a “pretty strong stockpile” of chips that it has previously purchased. He was referring to graphics processing units (GPUs), a type of semiconductor that has become the gold standard for training huge AI models.

These models require vast computing power, supplied by GPUs, to process high volumes of data.

But, Lau said, contrary to the belief among American companies that GPU clusters must keep expanding to create more advanced AI, Tencent has been able to achieve good training results with a smaller cluster of such chips.

“That actually sort of helped us to look at our existing inventory of high-end chips and say, we should have enough high-end chips to continue our training of models for a few more generations going forward,” Lau said.

Regarding inference, the process of actually carrying out an AI task rather than training the model, Lau said Tencent is using “software optimization” to improve efficiency, so that the same number of GPUs can execute a particular function.

Lau added that the company is also looking into using smaller models that don’t require as much computing power. Tencent also said it can make use of custom-designed chips and semiconductors currently available in China.

“I think there are a lot of ways [in] which we can fulfill the expanding and growing inference needs, and we just need to sort of keep exploring these venues and spend probably more time on the software side, rather than just brute force buying GPUs,” Lau said.

Baidu’s approach

Baidu, China’s biggest search company, touted what it calls its “full-stack” capabilities — the combination of its cloud computing infrastructure, AI models and the actual applications based on those models, such as its ERNIE chatbot.

“Even without access to the most advanced chips, our unique full stack AI capabilities enable us to build strong applications and deliver meaningful value,” Dou Shen, president of Baidu’s AI cloud business, said on the company’s earnings call this week.

Baidu also touted software optimization and the ability to bring down the cost of running its models, because it owns much of the technology in that stack. Baidu management also spoke about efficiencies that allow it to get more out of the GPUs it possesses.

“With foundation models driving up the need for a massive computing power, the abilities to build and manage large scale GPU clusters and to utilize GPUs effectively has become key competitive advantages,” Shen said.

The Baidu executive also touted the progress made by domestic Chinese technology firms in AI semiconductors, which he said would help mitigate the impact of U.S. chip curbs.

“Domestically developed self-sufficient chips, along with [an] increasingly efficient home-grown software stack, will jointly form a strong foundation for long-term innovation in China’s AI ecosystem,” Shen said.

China’s domestic chip focus