Blog
ALiBi Deep Dive: Interpolation vs. Extrapolation
Introduction: Long sequence capabilities enable a variety of applications like summarizing long-form documents, such...
Elevating Information Retrieval and Augmenting Large Language Models
We are pleased to announce that SambaStudio now supports text embedding models. This new feature significantly...
Enabling Open Source LLMs to Become Effective Tool Manipulators
Introduction: Using tools can extend the capabilities of LLMs to access knowledge beyond their training data. As an...
Training long sequence size models with SambaNova
At SambaNova, we have been researching and developing methods to train long sequence size (SS) models on our...
BLOOMChat: Open-Source Multilingual Chat LLM
Highlights: SambaNova, in collaboration with Together, is excited to present...
Achieving GPT 175B Level Accuracy with a 10x More Efficient Model
In this blog post, we show how one can use the SambaNova platform to develop a GPT 13B parameter model that can...
Achieving Best-in-Class Large Language Model Accuracy in Low-Resource Settings
The opportunity: solving a range of language tasks using large language models with zero-shot and few-shot learning