While the AI world faces a GPU crisis, AMD has found a way to capitalise on the situation. In September, the chipmaker tied up with Lamini, a startup that helps companies build and run generative AI products by fine-tuning existing foundation models on AMD GPUs.
During AMD's Advancing AI event, Lamini co-founder Sharon Zhou highlighted the company's continued use of AMD hardware and software. This followed Lamini's disclosure two months earlier that it had been running exclusively on AMD GPUs for the past year.
Lamini, in collaboration with AMD's ROCm open software ecosystem, has unveiled the LLM Superstation, a finely tuned supercomputer powered by 128 AMD Instinct GPUs. It allows customers to train large language models, including Meta AI's Llama 2, and keep the resulting models proprietary.