While the world is busy talking about GPT-4, Llama 2, PaLM 2 and others, Chinese giants, far from the limelight, are developing similarly powerful models that might dominate the global market in the future.
In the past year, Alibaba Cloud has announced many language models — Tongyi Qianwen, Tongyi Qianwen 2.0, Qwen-7B, Qwen-7B-Chat, Qwen-72B, and Qwen-1.8B.
Qwen-7B and Qwen-7B-Chat are open-source LLMs with 7 billion parameters. Both are smaller versions of Tongyi Qianwen and aim to help small and medium businesses start using AI. They were announced in August, a few months after the launch of LLaMA and Llama 2. The company also added Qwen-1.8B, an even smaller open-source language model for research purposes. It has a 2K context length and requires just 3GB of GPU memory.