News

Press Releases
sambanova.ai > press > sambanova-announces-collaboration-with-intel-on-ai-solution

SambaNova and Intel Announce Blueprint for Heterogeneous Inference: GPUs for Prefill, SambaNova RDUs for Decode, and Intel® Xeon® 6 CPUs for Agentic Tools

1 day, 2 hours ago  (203 words) "We are seeing AI agents' code output grow exponentially, and as a result Daytona is seeing the need for more and more sandboxes to run and compile this code, which runs on CPUs like Intel's Xeon," said Ivan Burazin, CEO…

sambanova.ai
sambanova.ai > products > dataflow-architecture

Dataflow Architecture

4 weeks, 2 days ago  (521 words) The natural movement of AI. All AI models are represented as a graph of operations, where data flows from one operation to the next. To achieve faster tokens per second per user, more tokens per watt, and support for more users,…

Press Releases
sambanova.ai > press > sambanova-unveils-fastest-chip-for-agentic-ai-collaborates-with-intel-and-raises-350m

SambaNova Unveils Fastest Chip for Agentic AI, Collaborates with Intel, and Raises $350M

1 month, 1 week ago  (607 words) SAN JOSE, Calif., Feb. 24, 2026 – SambaNova today introduced its SN50 AI chip, which boasts a maximum speed 5X faster than competitive chips. The company also announced a planned collaboration with Intel to deliver high-performance, cost-efficient AI inference solutions, and more…

sambanova.ai
sambanova.ai > blog > sovereign-ai-national-autonomy-in-the-ai-era

Sovereign AI: National Autonomy in the AI Era

2 months, 1 week ago  (1221 words) These concerns are why sovereign AI matters. Nations want direct control of infrastructure, models, and data. They want to protect critical systems under their own laws. They want the ability to decide how AI behaves and what rules govern its…

sambanova.ai
sambanova.ai > blog > ai-is-no-longer-about-training-bigger-models-its-about-inference-at-scale

AI Is No Longer About Training Bigger Models – It's About Inference at Scale

3 months, 3 days ago  (374 words) Large language model (LLM) development has typically been divided into two distinct phases: the massive, capital-intensive undertaking of training, and the operational utility of inference. For years, the industry's focus and investments were dominated by the race to train larger models…

sambanova.ai
sambanova.ai > blog > sambanova-partners-with-meta-to-deliver-lightning-fast-inference-on-llama-4

SambaNova Partners with Meta to Deliver Lightning-Fast Inference on Llama 4

1 year, 1 day ago  (561 words) We're thrilled to announce a major milestone as we partner with Meta to bring their cutting-edge Llama 4 models to life on SambaNova Cloud. SambaNova's AI platform is the technology backbone for the next decade of AI…