Blog
SambaNova Delivers Accurate Models At Blazing Speed
In late February, we announced Samba-1, a Composition of Experts (CoE) architecture that represents a paradigm shift and will ultimately become the...
Using Mixed Precision on RDUs
Deep learning frameworks have traditionally relied upon the 32-bit single-precision (FP32) format. However, FP32...
Sambaverse: Discover, Compare, Evaluate
As generative AI transforms the enterprise, it is becoming increasingly vital that organizations choose the right...
Benchmarking Samba-1
Today, we announced Samba-1, a Composition of Experts model with over 1 trillion parameters, built on top of open...
Samba-CoE v0.1: Unlocking the Power of Experts
We're thrilled to unveil Samba-CoE-v0.1, a scaled-down version of Samba-1, our latest breakthrough model that...
Samba-1: A Composition of Experts Model
SambaNova announces Samba-1, the first trillion-parameter generative AI model that meets the performance, accuracy,...
High-Accuracy AI Models in 9 Languages | SambaLingo
SambaNova is excited to open source a collection of expert models that adapt Llama 2 to a diverse set of 9...
Samba-CoE-v0.1 - Our Latest Breakthrough Model coming this week!
Built on top of open source models, using unique ensembling methods, the model outperforms Gemma-7B from Google and...
Text-to-SQL accuracy that beats GPT-4
SambaNova and Numbers Station are proud to announce the release of the SambaCoder-nsql-Llama-2-70B model, which...