Enabling Open Source LLMs to Become Effective Tool Manipulators
INTRODUCTION Using tools can extend LLMs' capabilities to access knowledge beyond their training data. As an...
Training long sequence size models with SambaNova
At SambaNova, we have been researching and developing methods to train long sequence size (SS) models on our...
BLOOMChat: Open-Source Multilingual Chat LLM
Highlights: SambaNova, in collaboration with Together, is excited to present...
Achieving GPT 175B Level Accuracy with a 10x More Efficient Model
In this blog post, we show how one can use the SambaNova platform to develop a GPT 13B-parameter model that can...
Achieving Best-in-Class Large Language Model Accuracy in Low-Resource Settings
The opportunity: solving a range of language tasks using large language models with zero and few-shot learning
Dataflow Architecture Leads to a Performance Breakthrough on GNN Fused Kernels
A collaboration between SambaNova Systems and Argonne National Laboratory. Using the capabilities of the SambaNova...