Could edge computing unlock AI’s vast potential?

By Wayne Carter, VP Engineering, Couchbase.

Artificial intelligence (AI) technologies are providing transformational business benefits across many sectors. AI has set new standards of personalisation for customers and users, and teams are now achieving efficiencies across workflows that were previously unattainable.

Although many enterprises are in their infancy with the technology, AI-enhanced services are already delighting customers and users – with the vast increase in speed of delivery perhaps the most significant upgrade for businesses. To give a simple example, a US airline recently saved 280 seconds per chat – 73,000 hours per quarter – by automating its chat channel for handling customer interactions.

As a result of AI, customer and end-user expectations are higher than ever. Many now expect real-time, hyper-personalized offers – and they expect them instantly. Internally, different business departments now expect to be able to quickly draw on AI to reduce their workload and automate repetitive tasks, at scale. Speed is king, and AI promises to deliver just that.

As businesses explore further AI use cases, many may rightly be asking: will the current IT infrastructure be able to power this new generation of services?

One computing model in particular, edge computing, has previously demonstrated the ability to reliably power fast business applications. To illustrate the scale of interest, the edge AI market was valued at $14.7 billion in 2022. With the AI race set to pick up pace through 2023 and beyond, could edge help businesses amplify the performance of their applications?

Living on the edge

To recap, edge computing is the deployment of computing and storage resources at the location where data is produced and consumed. This differs from more traditional cloud-based setups where data is processed at a data center. The upshot is that data at the edge uses less bandwidth and can be processed, analyzed and accessed much faster.

In practice, this might mean a railway operator, for instance, placing sensors within station buildings and infrastructure, or on board trains, to collect and process data about train speeds, track use, signaling and other business-critical factors.

Similarly, both physical and online retailers have used edge computing to power on-the-spot item recommendations, while manufacturers have seen success predicting and preventing problems with high-speed factory processes. In all these cases, processing the relevant data closer to its source at the edge of the network results in better quality data and much faster recommendations and insights.

Ultimately, edge computing helps businesses make the most of the vast number of different touchpoints that users, or customers, are now accustomed to. IDC estimates the number of connected IoT devices will reach 55.7 billion by 2025.

How does the edge model support AI applications?

Beyond the increased performance that AI applications demand, a key benefit of the edge model is reliability and resilience. Consumers have taken to AI, with 73% worldwide saying they trust content produced by generative AI, and 43% keen for organizations to implement generative AI throughout customer interactions. Businesses that can’t keep their AI-powered services running will suffer from declining customer satisfaction and even a drop in market share.

When a traditional data center suffers a power outage – perhaps due to a grid failure or natural disaster – apps reliant on these centralized data centers simply cannot function. Edge computing avoids this single point of failure: with compute more distributed, smart networks can instead use the processing power nearest to them to keep functioning.

There are also benefits when it comes to data governance. If sensitive data is processed at the edge of the network, it doesn’t need to be processed in a public cloud or centralized data center, meaning fewer opportunities to steal data at rest or in transit. Better data governance means more trust in AI applications, and much smoother sharing of information between business employees, partners and customers.

Finally, there are cost savings to consider. Cloud service providers often charge businesses to transfer data out of their cloud storage. And given that AI-driven applications rely on large volumes of data – with applications in industries such as digital marketing sifting through billions of data points – these charges can rack up very fast if data is repeatedly pulled from the cloud.
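As a rough, back-of-the-envelope sketch of how those egress charges accumulate (the rate and data volume below are illustrative assumptions, not quoted provider prices):

```python
# Back-of-the-envelope estimate of cloud egress charges for an AI workload.
# All figures are illustrative assumptions, not quoted provider prices.

EGRESS_RATE_PER_GB = 0.09      # assumed $/GB transferred out of the cloud
DAILY_DATA_PULLED_GB = 500     # assumed data an AI application pulls per day

monthly_cost = EGRESS_RATE_PER_GB * DAILY_DATA_PULLED_GB * 30
print(f"Estimated monthly egress cost: ${monthly_cost:,.2f}")
# -> Estimated monthly egress cost: $1,350.00
```

Keeping that data at the edge, close to where it is produced and consumed, removes much of this recurring transfer cost.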

This isn’t to say edge computing is a completely flawless way of harnessing AI. It isn’t always appropriate for AI to run at the edge. For example, deep learning AI training utilizes immense amounts of data in centralized clouds, where storage and horsepower are constrained only by cost.

But smaller machine learning AI models can run at the edge directly on edge devices, enabling them to make on-the-spot recommendations to a user based on local data and the current situation.
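A minimal sketch of that split, assuming a Python/scikit-learn stack (the feature names, training data and model choice are purely illustrative): a compact model is fitted centrally, then evaluated on the device against local session data.

```python
# Minimal sketch of the train-in-the-cloud, infer-at-the-edge pattern.
# Data, features and model choice are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- In the cloud: train a compact model on historical data ---
# Features: [items_viewed, minutes_on_site]; label: clicked a recommendation
X_train = np.array([[1, 2], [3, 5], [8, 12], [10, 15], [2, 1], [9, 20]])
y_train = np.array([0, 0, 1, 1, 0, 1])
model = LogisticRegression().fit(X_train, y_train)

# --- On the edge device: score local, in-the-moment session data ---
# (in practice the fitted model would be serialised and shipped to the device)
local_session = np.array([[7, 11]])
if model.predict(local_session)[0] == 1:
    print("Show the on-the-spot recommendation")
else:
    print("Skip the recommendation for this session")
```

The heavy lifting (training) stays in the cloud, while the lightweight decision runs locally, so the recommendation arrives without a round trip to a data center.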

A solid foundation for AI-driven business

It’s fair to say that businesses today are enamored with AI. They can see its transformative power, and want this power coursing through their enterprises. But to really make the most of AI, they must ensure the right architecture is in place to leverage its benefits at the edge. Modern mobile databases, which can run at the edge, in the cloud and on mobile devices, are one powerful tool for doing this.

It’s a cliché that AI is only as useful as the data it’s fed. Perhaps it would be more accurate to say it’s only as useful as the data it’s fed fast. Processing data quickly is vital for AI to flourish, and edge computing – though it’s no silver bullet – can help hugely.
