Foundation models and deep learning for science

How foundation models are advancing scientific discovery

SambaNova at Lawrence Livermore National Laboratory

World-record performance, world-changing research

The world is entering a new era of AI, powered by foundation models and state-of-the-art deep learning. These models are enabling organizations to make new and exciting discoveries across numerous areas of research, such as medical image analysis, large language models for science, and multi-physics simulation workloads. While the potential for foundation models and deep learning to accelerate these discoveries is immense, these state-of-the-art models come with their own training and management challenges.

SambaNova’s Foundation Model Platform delivers value and innovation across the full AI stack, spanning hardware, software, systems, and even pre-trained models, to give research organizations a performance advantage over GPU-based systems on the most challenging foundation model and deep learning workloads. For these organizations, that means more experiments and more discoveries with the potential to change the world.

UNLOCK NEW DISCOVERIES WITH DEEP LEARNING

Large Language Models (LLMs) for Science

Large language models (LLMs) can unlock insights in unstructured data with human-level accuracy and solve dozens of language tasks with a single model. Beyond traditional language tasks, these models have demonstrated potential in scientific domains by becoming ‘experts’ in specific topics, such as genomic data for COVID-19 research.
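
To make the idea concrete, here is a minimal sketch of domain adaptation using open-source tooling: continued pretraining of a small causal language model on a scientific text corpus with the Hugging Face transformers and datasets libraries. The base model ("gpt2") and the corpus file name are illustrative placeholders, and the sketch does not reflect SambaNova's platform or training stack.

```python
# Illustrative sketch only: adapt a small causal LM to a scientific corpus.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"                    # stand-in base model, not a SambaNova model
corpus_file = "scientific_corpus.txt"  # hypothetical domain corpus, one document per line

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

dataset = load_dataset("text", data_files={"train": corpus_file})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llm-science",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized,
    # Causal-LM collator: copies input_ids to labels for next-token prediction.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```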

Challenge

AI can revolutionize research by unlocking new insights in research reports, studies, journals, and more. However, training custom LLMs on domain-specific scientific topics is a complex and technically challenging process on GPU-based systems.
Solution

SambaNova’s dataflow architecture and large memory enable research organizations to train custom LLMs on a scientific corpus, producing models that understand domain-specific content, without the massive parallelization needed with GPU-based systems.
Results

Delivering 6x faster performance than legacy GPU-based systems, the platform lets researchers make groundbreaking discoveries 6x faster than before, while leveraging the power of LLMs to further validate their findings before publication.

ACCELERATING MULTI-PHYSICS SIMULATIONS

Deep learning accelerated surrogate models

Surrogate models are deep learning models that replace one or more components of larger multi-physics simulation workloads, such as computational fluid dynamics or weather forecasting. However, the GPU/CPU-based architectures used to run these simulation workloads struggle to deliver performance on sparse, detailed deep learning models.
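
As a concrete illustration of the concept, the sketch below trains a small PyTorch network to mimic the input-to-output mapping of a solver component. The analytic expensive_solver function is a toy stand-in for a real simulation kernel, and the example is not tied to any SambaNova software.

```python
# Minimal sketch of a neural surrogate: a small network learns the
# input-to-output mapping of an expensive solver. A cheap analytic
# function stands in for the solver here; in practice the training
# data would come from CFD or weather-model runs.
import torch
import torch.nn as nn

def expensive_solver(params: torch.Tensor) -> torch.Tensor:
    # Placeholder for a costly simulation component.
    return torch.sin(params[:, :1]) * torch.exp(-params[:, 1:2] ** 2)

# Training data: (simulation inputs, simulation outputs).
inputs = torch.rand(4096, 2) * 4.0 - 2.0
targets = expensive_solver(inputs)

surrogate = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

for step in range(2000):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(surrogate(inputs), targets)
    loss.backward()
    optimizer.step()

# The trained surrogate can now stand in for expensive_solver() inside
# the larger simulation loop at a fraction of the cost.
```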

Challenge

Research organizations must either run both types of workloads, traditional simulation and deep learning, on the same architecture, or trade the speed they gain against the latency added by shifting data between separate systems, each optimized for one of the workloads.

Solution

The SambaNova platform significantly improves the performance of sparse, detailed deep learning models compared to GPU-based architectures, even after accounting for the additional latency of moving data between two different systems.

Results

Using the SambaNova platform, researchers can efficiently run deep learning models for advanced scientific workloads, such as computational fluid dynamics simulations, weather forecasting, and physics-informed neural networks.
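
For readers unfamiliar with the last of these, a physics-informed neural network (PINN) folds the governing equations into the training loss. The toy PyTorch sketch below fits a network u(x) to the ODE du/dx = -u with u(0) = 1, whose exact solution is exp(-x); it is purely illustrative and unrelated to any SambaNova implementation, where real workloads target PDEs such as the Navier-Stokes equations.

```python
# Illustrative PINN: train u(x) so that du/dx + u = 0 and u(0) = 1.
import torch
import torch.nn as nn

u = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(u.parameters(), lr=1e-3)

for step in range(3000):
    x = torch.rand(256, 1, requires_grad=True)   # collocation points in [0, 1]
    u_x = torch.autograd.grad(u(x).sum(), x, create_graph=True)[0]
    residual = u_x + u(x)                         # physics residual of du/dx = -u
    boundary = u(torch.zeros(1, 1)) - 1.0         # initial condition u(0) = 1
    loss = residual.pow(2).mean() + boundary.pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```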