Aravind Srinivas, CEO of Perplexity, said that advances in hardware efficiency and model optimisation are making on-device AI increasingly viable. According to him, many AI tasks that currently rely on cloud-based data centres could eventually be handled locally on user devices.
This shift, he argued, has the potential to fundamentally disrupt how the data centre industry operates and scales.
Why On-Device AI Is Gaining Momentum
On-device AI refers to running AI models directly on consumer hardware such as smartphones, PCs, and IoT devices. Improvements in mobile chipsets, dedicated neural processing units (NPUs), and model-compression techniques such as quantisation and pruning now allow devices to perform complex inference without constant cloud access.
This reduces latency, improves responsiveness, and lowers ongoing cloud compute costs—key advantages for both users and companies.
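Model compression is central to this shift. As a minimal sketch of the idea, the Python snippet below applies symmetric post-training int8 quantisation to a weight matrix, shrinking it to a quarter of its float32 size; the toy layer and NumPy-only setup are illustrative assumptions, not any vendor's actual pipeline.

```python
import numpy as np

def quantise_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantisation: store each weight in 1 byte
    instead of 4 (float32), cutting memory and bandwidth roughly 4x."""
    scale = np.max(np.abs(weights)) / 127.0  # map the largest magnitude to the int8 range
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantise(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximately recover the float weights at inference time."""
    return q.astype(np.float32) * scale

# Illustrative toy layer, not weights from a real model.
w = np.random.randn(256, 256).astype(np.float32)
q, scale = quantise_int8(w)

print(f"float32: {w.nbytes // 1024} KiB -> int8: {q.nbytes // 1024} KiB")
print(f"max abs reconstruction error: {np.max(np.abs(w - dequantise(q, scale))):.4f}")
```

Techniques along these lines, paired with dedicated NPU hardware, are what allow models that once needed a server to run acceptably on a phone.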
Threat to the Data Centre Business Model
The claim that on-device AI can disrupt the data centre industry challenges a core assumption of the current AI boom. Today, massive data centres power large language models, search engines, and generative AI tools, consuming vast amounts of electricity and capital.
If more AI workloads shift to the edge, demand growth for large-scale data centres could slow, impacting cloud providers, infrastructure investors, and energy suppliers.
Cost, Energy, and Privacy Advantages
Running AI locally can significantly cut operational costs by reducing cloud inference expenses. It also lowers energy usage at scale, addressing growing environmental concerns around AI-driven data centre expansion.
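A back-of-envelope calculation shows why the cost argument resonates. Every number in the sketch below is a hypothetical placeholder rather than a quoted price, but the structure of the saving is the point: cloud inference bills scale with usage, while on-device inference has near-zero marginal cost once the hardware is in users' hands.

```python
# Hypothetical illustration; all figures are assumptions, not quoted rates.
CLOUD_COST_PER_1M_TOKENS = 0.50   # assumed $ per million tokens of cloud inference
TOKENS_PER_USER_PER_DAY = 20_000  # assumed daily usage per user
USERS = 1_000_000

daily = USERS * TOKENS_PER_USER_PER_DAY / 1e6 * CLOUD_COST_PER_1M_TOKENS
print(f"Cloud inference: ~${daily:,.0f}/day, ~${daily * 365:,.0f}/year")
# -> ~$10,000/day, ~$3,650,000/year that shifting these tokens on-device would avoid
```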
From a privacy perspective, on-device processing keeps sensitive user data local, reducing the need to transmit information to remote servers—a major advantage in regulated and privacy-focused markets.
Limits of On-Device AI Today
Despite its promise, on-device AI still faces constraints. Large models require significant memory and compute, which many consumer devices cannot yet support at full scale. Training, in particular, will remain data-centre-dependent for the foreseeable future.
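Rough arithmetic makes the memory constraint concrete: a model's weight footprint is roughly its parameter count times the bytes stored per parameter. The sketch below uses a 7-billion-parameter model as a reference point; the figures cover weights only and ignore activations and runtime overhead.

```python
def weight_footprint_gb(params: float, bits_per_param: int) -> float:
    """Approximate weight memory: parameters x bytes per parameter.
    Ignores activations, KV cache, and runtime overhead, which add more."""
    return params * bits_per_param / 8 / 1e9

for bits in (16, 8, 4):
    print(f"7B params @ {bits:>2}-bit: ~{weight_footprint_gb(7e9, bits):.1f} GB")
# 16-bit: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB. Only the most aggressively
# quantised variants fit comfortably in a high-end phone's memory today.
```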
As a result, experts expect a hybrid future, where smaller models run locally while complex tasks continue to rely on the cloud.
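In practice, that hybrid split often comes down to a routing decision. The sketch below shows one plausible shape for it; the token threshold, the tool-use check, and the local/cloud labels are hypothetical placeholders, not any product's actual logic.

```python
# Hypothetical hybrid router; threshold and criteria are illustrative assumptions.
LOCAL_CONTEXT_LIMIT = 2_000  # assumed max prompt tokens the on-device model handles well

def route(prompt_tokens: int, needs_tools: bool) -> str:
    """Keep short, self-contained requests on the device;
    escalate long or tool-using tasks to the cloud."""
    if prompt_tokens <= LOCAL_CONTEXT_LIMIT and not needs_tools:
        return "local"  # small quantised model on the device
    return "cloud"      # large hosted model in a data centre

print(route(prompt_tokens=300, needs_tools=False))   # -> local
print(route(prompt_tokens=8_000, needs_tools=True))  # -> cloud
```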
Industry-Wide Implications
If the vision outlined by the Perplexity CEO gains traction, chipmakers, device manufacturers, and AI software companies could benefit from increased demand for powerful local hardware. At the same time, cloud providers may need to rethink growth strategies built on ever-expanding AI compute demand.
The shift could reshape investment flows across the entire AI value chain.
What This Means for the AI Ecosystem
The statement that on-device AI can disrupt the data centre industry reflects a broader debate about the future architecture of AI. Rather than centralised intelligence, the next phase of AI may be more distributed—embedded directly into everyday devices.
This could democratise AI access while reducing infrastructure bottlenecks.
Conclusion
The Perplexity CEO’s comments signal a potential turning point in how the tech industry thinks about AI deployment. While data centres will remain critical, especially for training and large-scale tasks, the rise of on-device AI could meaningfully reduce their dominance over time.
If hardware and model efficiency continue to improve, the balance of power in the AI ecosystem may gradually shift—from massive cloud infrastructure to the devices in users’ hands.
