
China Opens Up AI: Top 5 Large Language Models to Know Now

Dive into the cutting-edge of Chinese AI with these powerful language models

Last week, Securities Times, a Chinese national financial newspaper, reported that local regulators had approved 14 large language models (LLMs) for public use; this year’s first batch of approvals covered 40 AI models in total.

Among the 14 LLMs greenlit for commercial deployment are products from well-known companies such as smartphone maker Xiaomi and 01.AI, the startup founded by Kai-Fu Lee.

The full list of approved LLMs has not been disclosed, so we decided to focus on five prominent Chinese models you need to know about:

  1. CPM (2020; 2.6B): China’s first large-scale pre-trained language model, created by the Beijing Academy of Artificial Intelligence (BAAI) and Tsinghua University.

  2. ERNIE 3.0 Titan (2021; 260B): A scaled-up successor to Baidu’s 10B-parameter ERNIE 3.0 foundation model, designed to explore how the original model performs when scaled up. At the time of its release, it was the largest Chinese dense pre-trained model.

  3. Qwen 1.5 family (2023; 1.8B, 7B, 14B): The Qwen models distinguish themselves through extensive training on diverse datasets, which yields strong performance across a variety of tasks, including language understanding, coding, and mathematics, as well as specialized vision-language capabilities.

  4. Yi family (2023; 6B, 34B): Developed by 01.AI, these models are a pioneering force in bilingual LLMs, with exceptional strength in language understanding and commonsense reasoning. Notably strong on both English and Chinese benchmarks, they have secured top rankings on leaderboards such as AlpacaEval and the Hugging Face Open LLM Leaderboard, placing them among the strongest LLMs globally.

  5. Baichuan 2 (2023; 7B, 13B): By offering both Base and Chat models in multiple sizes, including efficient 4-bit quantized versions, Baichuan 2 supports a wide range of research and commercial applications, lowering the barrier to innovation and enabling more developers to build advanced AI capabilities into their projects.

If you’ve found this article valuable, subscribe for free to our newsletter.

We post helpful lists and bite-sized explanations daily on our X (Twitter). Let’s connect!
