Blog 2

November 18, 2024

Oak Ridge National Laboratory Deploys SambaNova Suite, Enabling Energy-Efficient AI Inference for Science

Today, Oak Ridge National Laboratory (ORNL) deployed SambaNova Suite to expand its research with secure,...

November 18, 2024

Texas Advanced Computing Center Deploys SambaNova Suite, Enabling AI Inference for Science

Today, we announced a new customer relationship with the Texas Advanced Computing Center (TACC), one of the world’s...

November 18, 2024

Argonne National Laboratory Deploys SambaNova Suite to Advance AI Inference in Science Research

Today, we announced that the U.S. Department of Energy’s Argonne National Laboratory will expand its AI...

September 11, 2024

SambaNova Cloud: The fastest inference and the best models - for free

SambaNova announced our platform, which delivers world-record inference performance on Llama 3 8B, 70B, and...

July 19, 2024

Three Predictions for the Upcoming Llama 3 405B Announcement

Next week’s much-anticipated announcement of the Llama 3 405B model is set to make waves in the developer community....

July 10, 2024

Typhoon model adds Thai language to Samba-1

The Samba-1 model from SambaNova offers enterprises an unmatched combination of data security and privacy, access...

June 14, 2024

SambaNova CEO explains why only one AI company wants a monopoly

Veteran tech journalist Don Clark of The New York Times interviewed SambaNova Systems CEO Rodrigo Liang before a...

May 29, 2024

Transform Your Data Privacy with SambaNova Systems

As enterprise and government agencies strive to power more innovation, drive operational efficiency, and...

May 29, 2024

SambaNova has broken the 1000 t/s barrier: why it's a big deal for enterprise AI

SambaNova is the clear winner of the latest large language model (LLM) benchmark by Artificial Analysis. Topping the...
