Blog


September 8, 2023

Enabling Open Source LLMs to Become Effective Tool Manipulators

INTRODUCTION: Using tools can extend the capabilities of LLMs to access knowledge beyond their training data. As an...

August 7, 2023

Training long sequence size models with SambaNova

At SambaNova, we have been researching and developing methods to train long sequence size (SS) models on our...

May 19, 2023

BLOOMChat: Open-Source Multilingual Chat LLM

Highlights: SambaNova, in collaboration with Together, is excited to present...

February 13, 2023

Achieving GPT 175B Level Accuracy with a 10x More Efficient Model

In this blog post, we show how one can use the SambaNova platform to develop a GPT 13B-parameter model that can...

December 22, 2022

Achieving Best-in-Class Large Language Model Accuracy in Low-Resource Settings

The opportunity: solving a range of language tasks using large language models with zero- and few-shot learning

December 16, 2022

Dataflow Architecture Leads to a Performance Breakthrough on GNN Fused Kernels

A collaboration between SambaNova Systems and Argonne National Laboratory. Using the capabilities of the SambaNova...
