Infinite Context Length 🤯
Hello there! It looks like the rain gods mistakenly visited Dubai this week, instead of Namma Bengaluru. And while your friendly AI Human, Amit Raja Naik, waits patiently for the rains to descend on Sector 6, a mini-storm brews in the tech world.
Meta recently introduced MEGALODON, a neural architecture designed for sequence modelling with unlimited context length. At the scale of 7 billion parameters and 2 trillion training tokens, it demonstrates better efficiency than Llama 2, hinting that Llama 3, expected this summer, may ship with infinite context length capabilities.
Soon, the debate over which model boasts the greatest context length may become irrelevant: Microsoft, Google, and Meta have all recently taken strides toward making context length effectively infinite.