Press Releases

Research Finds AI's Energy Use Is Driving Concern

Written by SambaNova | March 2, 2026

Consumers brace for AI’s power bill: Three in four fear data centers will hike household energy costs, four in five demand efficiency-first AI, and nearly all want sovereign, homegrown AI systems.

March 2, 2026 – Barcelona, Spain (4YFN) – SambaNova, builders of the fastest chip for agentic AI, today released new research highlighting mounting concerns over the energy demands of AI data centers and their impact on households and national power grids. As AI deployment accelerates, both business leaders and consumers recognize that legacy, GPU-based infrastructure is not built for the efficiency and scale required in a power‑constrained world.

The survey of 2,525 adults across the US and UK shows that concern about AI’s energy appetite is no longer abstract. Awareness of AI data centers’ electricity usage is widespread, and consumers are drawing a direct line between the infrastructure choices providers make and their own monthly bills.

Key findings from the AI Energy Consumer Survey

  • 75% of respondents fear AI data centers could lead to higher household energy bills in their area.
  • 75% say they are aware of the significant electricity consumption associated with AI data centers.
  • 83% believe AI companies should prioritize energy efficiency, even if it slows the rollout of new AI capabilities.
  • 71% agree AI data centers will strain their country’s power grid.
  • 91% say it is important that their country has its own AI systems.

The findings echo SambaNova’s 2024 AI Leadership Survey, which exposed a readiness gap inside enterprises as AI deployments surged. Then, 49.8% of business leaders were concerned about AI’s energy and efficiency challenges, yet only 13% monitored the power consumption of their AI systems. One year later, concern has moved beyond the data center floor and into living rooms: While leaders still struggle to measure AI power usage, three in four consumers now worry AI infrastructure will raise their household bills and strain national grids.

Rodrigo Liang: AI data centers must be engines of efficient growth

“The findings reveal a new reality: AI is no longer just an enterprise technology story — it is an infrastructure story that reaches all the way to consumers’ electricity bills,” said Rodrigo Liang, CEO and co‑founder of SambaNova. “Data centers are the growth engine of AI, but if they are built on inefficient hardware, that growth will come with unacceptable power and cost trade‑offs.”

“Consumers are telling us they want powerful, always‑on AI but they also want providers to keep grids stable and energy costs under control,” Liang continued. “This is why we’ve focused SambaNova on building efficient systems that dramatically increase tokens‑per‑second and throughput per rack, without blowing past standard power envelopes.”

Liang added: “With our new SN50-based systems, customers can stand up high‑density AI data centers that run fleets of intelligent agents in real time while staying within 20 kW per rack and using standard air cooling — no exotic power or cooling retrofits required. SN50 delivers up to five times more compute per accelerator and as much as three times better inference efficiency than leading GPU-based systems, enabling operators to scale AI services faster, serve larger models and longer context, and still reduce total cost of ownership. This is how we turn AI data centers into efficient, high‑growth infrastructure for the next decade, instead of a drag on national power systems.”

Last year’s AI Leadership Survey showed enterprises were racing ahead with AI adoption while underestimating its power implications: “Last year, we projected that by 2027 more than 90% of leaders would be concerned about AI’s power demands and would monitor consumption as a board‑level KPI,” said Liang. “This new data suggests the inflection point may arrive faster than expected, with three‑quarters of consumers worried about AI’s impact on their bills and 83% explicitly calling for energy‑efficient AI.”

SN50 RDU chip: Built for power‑constrained, AI‑first data centers

SambaNova’s fifth‑generation SN50 RDU is purpose‑built for fast inference and agentic workloads in modern AI data centers. Each 20 kW SambaRack SN50 integrates 16 SN50 processors, and up to 16 racks can be interconnected to support 256 accelerators over a multi‑terabyte‑per‑second fabric, enabling customers to deploy very large models with longer context while maintaining high throughput and low latency.
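For readers who want to check the configuration math, the figures above can be combined directly; this is a quick sketch using only the numbers quoted in the release (the derived totals are simple arithmetic, not additional vendor-published specifications):

```python
# Capacity math for a fully interconnected SambaRack SN50 cluster,
# using the figures quoted in the release above.
RACK_POWER_KW = 20          # stated power envelope per SambaRack SN50
ACCELERATORS_PER_RACK = 16  # SN50 processors integrated per rack
MAX_RACKS = 16              # racks that can be interconnected

# Accelerator count: 16 racks x 16 processors = 256, matching the release.
total_accelerators = ACCELERATORS_PER_RACK * MAX_RACKS

# Derived total power draw for a maximal cluster (arithmetic only).
total_power_kw = RACK_POWER_KW * MAX_RACKS

print(total_accelerators)  # 256
print(total_power_kw)      # 320
```

At 16 racks the stated per-rack envelope implies a full cluster draws on the order of 320 kW, which is the scale at which the air-cooling and standard-power claims above become relevant to facility operators.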

For data center operators facing tightening power budgets and surging AI demand, SN50 turns existing facilities into high‑density AI zones, allowing them to expand capacity quickly inside current power and cooling envelopes, exactly the kind of infrastructure shift consumers are now demanding.

About SambaNova

SambaNova is a leader in next‑generation AI infrastructure, providing a full-stack platform that powers the fastest, most efficient AI inference for enterprises, NeoClouds, AI labs and service providers, and sovereign AI initiatives worldwide. Founded in 2017 and headquartered in San Jose, Calif., SambaNova delivers chips, systems and cloud services that enable customers to deploy state‑of‑the‑art models with superior performance, lower total cost of ownership and rapid time to value.

For more information, visit sambanova.ai or follow SambaNova on X and LinkedIn.

About the survey

This survey was commissioned by SambaNova on AI’s energy requirements and conducted in December 2025 with 2,525 consumers across the US and UK.