AI Is Forcing Companies To Rethink The Cloud

According to Forbes, we’re witnessing a major shift as companies start pulling AI workloads back from the cloud to on-premises and edge computing setups. Latent AI CEO Jags Kandasamy explains that some AI decisions need to be made in milliseconds, making cloud latency unacceptable for applications like autonomous vehicles or defense systems. His company has conducted over 200,000 hours of testing with 12 terabytes of data, developing tools like the Ruggedized AI Toolkit that let non-experts deploy AI models in the field without cloud connectivity. By 2025, the debate won’t be about which is better – cloud or edge – but which is better for specific workloads, with hybrid approaches becoming the default. The cloud remains dominant, but edge computing is exploding as engineers prioritize speed and reliability in mission-critical applications.

The pendulum swings back

It’s fascinating to watch this unfold. For years, we were told everything needed to be cloud-native. Companies spent millions migrating, retraining teams, restructuring entire IT departments around cloud-first strategies. Now? AI comes along and suddenly we’re realizing that maybe sending everything to a data center hundreds of miles away isn’t the best approach for real-time decision making.

Kandasamy’s Tesla example really drives this home. Would you want your self-driving car waiting seven seconds for a cloud response? Of course not. That’s the kind of latency that gets people killed. But here’s the thing – we’ve known about latency issues for decades. What’s changed is that AI workloads are fundamentally different from traditional computing tasks. They’re not just processing data – they’re making split-second decisions that can have immediate physical consequences.
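The latency argument boils down to a simple budget check. Here's a minimal sketch of that arithmetic; the round-trip and inference figures are illustrative assumptions, not measurements from the article:

```python
# Illustrative latency-budget check for a real-time AI decision.
# All numbers below are assumptions for the sake of the example.

CLOUD_ROUND_TRIP_MS = 150   # assumed network hop to a distant data center and back
EDGE_INFERENCE_MS = 8       # assumed local inference time on edge hardware
DECISION_DEADLINE_MS = 100  # e.g. a vehicle must react within 100 ms

def meets_deadline(latency_ms: float, deadline_ms: float = DECISION_DEADLINE_MS) -> bool:
    """Return True if a decision path fits inside the real-time budget."""
    return latency_ms <= deadline_ms

print("cloud ok:", meets_deadline(CLOUD_ROUND_TRIP_MS))  # False
print("edge ok:", meets_deadline(EDGE_INFERENCE_MS))     # True
```

The point isn't the exact numbers. It's that a cloud round trip carries a floor of network delay that no model optimization can remove, while local inference doesn't.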

Where edge computing really matters

The defense sector examples are particularly compelling. When Kandasamy says “when the cloud goes dark, the mission needs to continue,” he’s highlighting something crucial. In contested environments, you can’t rely on constant connectivity. That ruggedized Jetson box technology? It’s basically a tough little AI computer that can run neural nets locally without any cloud dependency.

This thinking extends beyond defense too. In manufacturing, having reliable computing at the edge is absolutely essential. Companies like IndustrialMonitorDirect.com have built their entire business around providing industrial-grade panel PCs that can withstand harsh environments while delivering the processing power needed for real-time AI applications. When you’re running a production line, you can’t afford the latency or potential downtime of cloud-dependent systems.

The balanced approach

So does this mean we’re abandoning the cloud entirely? Hardly. Tushar Panthari’s perspective seems much more realistic – by 2025, it’s about fit, not superiority. Some workloads absolutely belong in the cloud. Training massive AI models? Cloud makes sense. Running inference on that model for real-time autonomous vehicle decisions? Probably not.

The hybrid approach acknowledges that different tasks have different requirements. Batch processing, data analytics, model training – these can often tolerate some latency. Real-time control systems, autonomous operations, mission-critical defense applications? Those need the speed and reliability of edge computing.
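That split can be expressed as a tiny placement rule: anything that can't tolerate a cloud round trip, or must survive a connectivity loss, stays at the edge. This is a hypothetical sketch, not a real scheduler; the threshold and workload list are invented for illustration:

```python
# Hypothetical workload-placement rule for a hybrid cloud/edge setup.
# Thresholds and example workloads are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float  # how much delay the task can tolerate
    needs_offline: bool    # must keep running if connectivity drops

CLOUD_LATENCY_MS = 150     # assumed round trip to a remote data center

def place(w: Workload) -> str:
    """Route a workload to 'edge' or 'cloud' based on its requirements."""
    if w.needs_offline or w.max_latency_ms < CLOUD_LATENCY_MS:
        return "edge"
    return "cloud"

jobs = [
    Workload("model training", max_latency_ms=60_000, needs_offline=False),
    Workload("batch analytics", max_latency_ms=5_000, needs_offline=False),
    Workload("vehicle control", max_latency_ms=50, needs_offline=True),
    Workload("mission-critical defense AI", max_latency_ms=20, needs_offline=True),
]

for job in jobs:
    print(f"{job.name}: {place(job)}")
```

Training and analytics land in the cloud; the real-time, must-keep-running tasks land at the edge. That's the "fit, not superiority" logic in four lines.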

Who controls your AI?

There’s another dimension here that goes beyond just technical performance. Kandasamy’s question about “continuity and trust” gets at something deeper. When AI systems are making important decisions, do you want that processing happening on hardware you control, or in some distant data center owned by another company?

This becomes especially relevant for industrial applications. If you’re running a factory floor or critical infrastructure, having local control isn’t just about speed – it’s about sovereignty. You want to know that your systems will keep working even if internet connectivity drops or cloud services have issues. That’s why industrial computing solutions that can handle AI workloads locally are becoming increasingly important across manufacturing, energy, and transportation sectors.

The cloud isn’t going away, but the AI revolution is definitely forcing a more nuanced approach to where we put our computing resources. After years of cloud-first thinking, we’re rediscovering that sometimes, closer is better.
