In this post, we explore why tokens per second doesn't paint the full picture of enterprise LLM inference...
Samba-CoE-v0.3, our latest Composition of Experts, surpasses DBRX Instruct 132B and Grok-1 314B on the OpenLLM...
SambaLingo has been downloaded over 15,000 times and has achieved a remarkable inference speed of 280 tokens/s...
Samba-CoE v0.2 is climbing on the AlpacaEval leaderboard, outperforming all of the latest open-source models.
SambaFlow 1.18 introduces support for mixed precision on RDUs, streamlining the experience for model developers and...
Benchmarking Samba-1 with the EGAI benchmark - a comprehensive collection of widely adopted benchmarks sourced from...
We're thrilled to unveil Samba-CoE-v0.1, a scaled-down version of Samba-1, our latest breakthrough model that...
SambaNova is excited to open source a collection of expert models that adapt Llama 2 to a diverse set of 9 languages.
Numbers Station and SambaNova have released a text-to-SQL model that surpasses the accuracy of GPT-4.