A New Era of Open-Source LLMs Begins
Last week, a new open-source language model known as Falcon emerged as a frontrunner in the race of large language models (LLMs). Developed by the Technology Innovation Institute (TII) in Abu Dhabi, UAE, the model reportedly outperforms Meta’s LLaMA, Stability AI’s StableLM, RedPajama, MPT, and others, and has claimed a top spot on the Hugging Face Open LLM Leaderboard.
With 40 billion parameters and trained on one trillion tokens, this new foundational large language model is said to grant unprecedented access to researchers and SMEs alike. Recently, the institute announced its decision to make the model royalty-free for commercial and research use. “Waiving off Falcon 40B royalties promotes inclusive tech advancements for a cohesive society,” said Dr Ebtesam Almazrouei, director and co-founder of the AI Cross-Center Unit at TII.