That’s because it lets companies analyze data in real-time or near real-time, making it easier to train AI models and boost how AI-driven applications perform.
The impact of edge computing spans applications, including 5G-enabled multi-access edge computing, autonomous vehicles, aerial imagery analysis, biometrics, access control, defect inspection, and even smart spaces.
Sunil Senan is senior vice president and global head – data, analytics, and AI at Infosys, an IT services and consulting company.
He says the edge’s major role is in reducing latency, enhancing efficiency, and reducing dependence on centralized cloud resources.
By deploying AI solutions directly at the edge, organizations can achieve quicker decision-making and actions, easily integrate AI into various operations, and minimize data transfer over external networks.
“This approach enhances data security, reduces bandwidth usage, and accelerates decision-making and real-time actions, ultimately driving superior business outcomes,” he says.
Gerald Longoria, director and business unit executive of TruScale infrastructure services at technology company Lenovo, says edge computing plays a vital role in the evolution of GenAI because it addresses the real-time processing, reduced latency, and efficient data management that generative AI needs as it grows.
He adds that by analyzing data at the network’s edge rather than relying solely on centralized cloud infrastructure, edge computing can also solve problems specific to the verticals adopting GenAI tools.
Meanwhile, the Accenture report “Leading with Edge Computing: How to Reinvent with Data and AI” suggests that evolving from ad hoc to integrated approaches that leverage the power of cloud, data, and AI will be critical for businesses to affordably accelerate edge innovation.
Edge Computing’s Value to AI
When getting started with GenAI, many companies begin with a cloud-managed model for speed, says Teresa Tung, cloud-first chief technologist at Accenture and co-author of the report.
However, as companies move GenAI out of pilots and into production, they must face the usual questions about long-term cost, security, data privacy, and sovereignty.
“And edge’s AI value proposition is more important than ever, whether it’s to train the model with the data gravity already at the edge or as an efficient means of inferencing to provide real-time insight and highly reliable responses,” Tung says.
GenAI is a fancy calculator – it’s vector and statistical math at a large scale with many factors, according to Wayne Anderson, director of cloud, security, and infrastructure at BDO Digital, a provider of technology and business advisory services. When done well, edge computing enables organizations to aggregate data where it is happening and increase the quality of the data.
He notes that raw data from many of today’s business processes can be unstructured, and there are often errors in the input, especially when humans are part of creating or entering the data. Edge computing is where you can usually do the work and analysis and ensure the inputs are right before you send them to the expensive cloud training processes.
“Otherwise, you spend a lot more with master’s degree-level math to get the wrong answer because the data you started with is wrong,” he says.
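Anderson’s point about validating inputs at the edge before shipping them to expensive cloud training can be illustrated with a minimal sketch. The field names and acceptable ranges below are hypothetical assumptions for illustration, not part of any vendor’s pipeline:

```python
# Minimal sketch of edge-side data validation before cloud upload.
# Field names and thresholds are hypothetical, for illustration only.

def is_valid(record: dict) -> bool:
    """Reject records with missing IDs or physically implausible inputs."""
    if record.get("sensor_id") is None:
        return False
    temp = record.get("temperature_c")
    if temp is None or not (-40.0 <= temp <= 125.0):
        return False
    return True

def filter_for_training(records: list[dict]) -> list[dict]:
    """Keep only clean records worth sending to cloud training."""
    return [r for r in records if is_valid(r)]

batch = [
    {"sensor_id": "a1", "temperature_c": 21.5},
    {"sensor_id": None, "temperature_c": 19.0},   # missing ID: dropped
    {"sensor_id": "a2", "temperature_c": 900.0},  # out of range: dropped
]
clean = filter_for_training(batch)
```

The design choice is simply to spend cheap compute at the point of data creation so that only trustworthy records consume bandwidth and cloud training cycles.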
Importance of AI at the Edge
For businesses, the power to drive positive customer experiences and revenue growth comes from capturing and understanding their own data at the point where it’s created, says Gil Shneorson, senior vice president of edge solutions at Dell Technologies.
He explains that AI is the key to harnessing valuable data at the edge so businesses can process information close to the source and make real-time decisions that positively impact the bottom line, from streamlining processes to driving cost optimization.
For example, a retailer may deploy AI at the edge to gather data on store foot traffic and product interactions, helping to improve inventory management and enhancing the shopping experience, Shneorson says.
And for a manufacturer, AI at the edge can monitor equipment performance, detect anomalies, and enable predictive maintenance to reduce downtime and increase productivity.
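The manufacturing use case above can be sketched in a few lines: a simple statistical check an edge device might run on equipment telemetry to flag anomalies for predictive maintenance. The window size, threshold, and sample readings are illustrative assumptions, not production values:

```python
# Hedged sketch: threshold-based anomaly detection a manufacturer might
# run at the edge on equipment telemetry. Window and z-threshold are
# illustrative assumptions only.
from statistics import mean, stdev

def detect_anomalies(readings: list[float], window: int = 5, z: float = 3.0) -> list[int]:
    """Flag indices whose reading deviates more than z standard
    deviations from the trailing window of prior readings."""
    flagged = []
    for i in range(window, len(readings)):
        prior = readings[i - window:i]
        mu, sigma = mean(prior), stdev(prior)
        if sigma > 0 and abs(readings[i] - mu) > z * sigma:
            flagged.append(i)
    return flagged

# Hypothetical vibration readings; the spike stands out against the window.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 9.8, 1.0]
flags = detect_anomalies(vibration)
```

Because the check runs locally, the device can trigger a maintenance alert immediately instead of waiting on a round trip to the cloud.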
“Businesses that tap into the power of AI at the edge can help accelerate their business growth while also delighting customers,” he says.
Edge and AI naturally complement one another to push the innovation boundaries of IoT, according to Sudhir Mehta, global vice president of Optra engineering and products at Lexmark, a provider of printing and imaging products, software, and services.
He says the sheer volume of data that IoT can produce hampers its own potential. Organizations can only process so much data so quickly. The combination of edge and AI solves that.
The data processing happens where it is needed – at the network’s edge. Mehta says that processing at the edge – as opposed to in the cloud – cuts down on latency while ensuring security and compliance.
With AI, you now have a fully automated platform in place that can immediately turn that data into insights – applying it in completely new and interesting ways that you could not easily do manually.
Edge Takes AI Personalization to the Next Level
Businesses developing and using generative AI require significant computing resources, which can cause latency challenges with typical cloud computing, according to Peter Wang, CEO at Anaconda, a data science platform provider.
Edge computing, on the other hand, results in lower latency because the data processing is closer to where the data is generated and consumed, minimizing the time spent sending data to and from centralized servers, he says.
“This real-time communication is critical for businesses implementing generative AI, such as chatbots interacting with employees or customers,” Wang adds.
Edge computing also takes AI personalization to the next level, says Nick White, data strategy senior director at Kin + Carta, a digital transformation consultancy. It allows AI to evolve dynamically, helping tailor experiences to the user’s distinct preferences and needs in a sophisticated and seamless manner. All of this happens in real time while also minimizing latency.
He explains that edge devices continually accumulate data on their individual preferences and behaviors as users engage with the system. This is then integrated back into the model, enabling on-device learning and adaptation.
Additionally, because edge computing is decentralized, this refinement process occurs directly on the user’s device, respecting privacy and enabling secure, personalized experiences, White adds.
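The on-device learning White describes can be sketched as a running preference score updated locally with each interaction. The class name, categories, and decay factor below are assumptions for illustration; the point is that raw interaction events never leave the device:

```python
# Illustrative sketch of on-device preference learning: an exponentially
# weighted score per content category, updated locally as the user
# interacts. Names and the alpha value are illustrative assumptions.

class EdgePersonalizer:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha                     # weight given to each new signal
        self.scores: dict[str, float] = {}     # state stays on the device

    def observe(self, category: str, engagement: float) -> None:
        """Blend a new engagement signal (0..1) into the running score."""
        old = self.scores.get(category, 0.0)
        self.scores[category] = (1 - self.alpha) * old + self.alpha * engagement

    def top_categories(self, n: int = 3) -> list[str]:
        """Rank categories for tailoring the local experience."""
        return sorted(self.scores, key=self.scores.get, reverse=True)[:n]

p = EdgePersonalizer()
p.observe("sports", 1.0)
p.observe("news", 0.2)
p.observe("sports", 0.8)
```

A real deployment would fold these local signals into a proper on-device model rather than a simple weighted average, but the privacy property is the same: personalization state is computed and kept at the edge.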
More Edge is Coming
The continued demand for AI-driven innovation and other data-intensive workloads will fuel more edge adoption in 2024, says Anderson.
As part of their AI transformation roadmaps, businesses must consider managing the influx of data required for these workloads, he says.
Organizations must consider the type of data, where it resides, their security controls, and compute capacity.
Edge data centers can improve performance, provide local compute for rapid inferencing, and reduce AI application latency while addressing security and data residency concerns, Anderson says.
Data preparation and data quality have gained increasing significance for businesses as AI applications proliferate across industries, says Emma McGrattan, senior vice president of engineering at Actian Corp., the data and analytics division of HCLSoftware.
The recent surge in GenAI has further accelerated this trend. Real-time analytics processed at the edge – making decisions on locally generated data from IoT devices and local edge servers – provides businesses with instantaneous insights for key decision-making across their operations.
“It will also be crucial in providing large volumes of complex data needed to train their AI models so their AI tools can adapt and align more effectively with the specific needs of the business across internal workflows, customer experience, and more,” McGrattan adds.
“An AI model is only as good as the training data it has been provided, making preparation and quality prerequisites for successful AI initiatives.”