Samba-CoE-v0.3, our latest Composition of Experts, surpasses DBRX Instruct 132B and Grok-1 314B on the OpenLLM...
SambaLingo has been downloaded over 15,000 times and delivers inference performance of 280 tokens/s...
Samba-CoE v0.2 is climbing on the AlpacaEval leaderboard, outperforming all of the latest open-source models.
SambaFlow 1.18 introduces support for mixed precision on RDUs, streamlining the experience for model developers and...
Benchmarking Samba-1 with the EGAI benchmark - a comprehensive collection of widely adopted benchmarks sourced from...
We're thrilled to unveil Samba-CoE-v0.1, a scaled down version of Samba-1, our latest breakthrough model that...
SambaNova is excited to open source a collection of expert models that adapt Llama 2 to a diverse set of 9 languages.
Numbers Station and SambaNova have released a text-to-SQL model that surpasses the accuracy of GPT-4.
We are proud to release BLOOMChat-v2, a 176B parameter multilingual language model with a 32K sequence length.
One of the key characteristics of our system is performance for enterprise data center AI workloads, and SNFM...
Designed specifically for the enterprise, the SambaNova platform includes enterprise level features to deliver the...
The LLM community has proposed a variety of positional interpolation methods to extend the maximum sequence length...
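One widely used member of this family is linear positional interpolation for rotary position embeddings (RoPE): instead of extrapolating to unseen position indices, positions in the longer context are scaled down so their rotation angles fall inside the range the model saw during training. The sketch below is a minimal illustration of that idea, not SambaNova's implementation; the function names and the NumPy formulation are ours.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0):
    """Rotary position embedding angles for each (position, frequency pair)."""
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions, inv_freq)  # shape: (num_positions, dim // 2)

def interpolated_angles(positions, dim, train_len, target_len, base=10000.0):
    """Linear positional interpolation: scale positions by train_len/target_len
    so a longer sequence reuses the angle range seen during training."""
    scale = train_len / target_len  # < 1 when extending the context
    return rope_angles(np.asarray(positions) * scale, dim, base)

# A position at the end of a 4x-extended context maps back onto the
# angles the model was trained on:
orig = rope_angles(np.array([2048]), dim=64)
interp = interpolated_angles([8192], dim=64, train_len=2048, target_len=8192)
assert np.allclose(orig, interp)
```

Because the scaled positions stay within the trained range, attention behaves smoothly at long contexts after only a short fine-tune, which is why interpolation generally degrades far less than naive extrapolation.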
SambaStudio now supports text embedding models. This new feature significantly boosts the information retrieval...
At SambaNova, we have been researching and developing methods to train long sequence size (SS) models on our...
SambaNova and Together are excited to announce the public release of BLOOMChat, a 176 billion parameter multilingual...
We show how one can use the SambaNova Suite to develop a model that is highly optimized for a specific domain or...
SambaNova is committed to the development of open-source technology and today we are excited to announce that the...
SambaNova shows how one can use the SambaNova platform to develop a 13B parameter GPT model that can outperform a...
With a model that’s multiple orders of magnitude smaller, the SambaNova GPT model outperforms the leading GPT-175B...
Using the capabilities of the SambaNova DataScale® system, researchers at the U.S. Department of Energy’s Argonne...
Natural Language Processing (NLP) is one of the most pervasive areas of adoption and growth for Deep Learning.
Explainability in AI seems like a very simple and reasonable concept. In the simplest terms, it is the ability to...
Given recent advances in computer vision, the popularity of convolution-free models has risen quickly, demonstrating...