
Edge AI and Applications


Edge AI: The Future of AI and Edge Computing

 

- Overview

Edge intelligence is an emerging concept with high potential for addressing new challenges in B5G (6G) networks: it brings mobile edge computing and edge caching capabilities, together with AI, into the proximity of end users.

Edge AI is the process of running machine learning (ML) algorithms on devices near the edge of a network to make predictions and decisions as close to the data source as possible. Edge AI is also known as edge machine learning (edge ML).

Edge AI uses edge computing and AI to perform ML tasks on interconnected edge devices. Edge AI allows data to be stored near the device location, and AI algorithms enable the data to be processed right on the network edge. Edge AI can perform tasks such as predictive analytics, speech recognition, and anomaly detection.
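As a toy illustration of one such on-device task, anomaly detection can be done with a rolling z-score over recent sensor readings, entirely on the device. This is a minimal sketch, not tied to any particular product; the window size and threshold are arbitrary assumptions:

```python
import math
from collections import deque

class RollingAnomalyDetector:
    """Flags a reading that deviates sharply from a rolling window of
    recent values -- small enough to run on a microcontroller-class device."""

    def __init__(self, window=20, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, x):
        anomaly = False
        if len(self.values) >= 2:
            mean = sum(self.values) / len(self.values)
            std = math.sqrt(sum((v - mean) ** 2 for v in self.values) / len(self.values))
            if std == 0:
                anomaly = x != mean          # any change from a flat signal
            else:
                anomaly = abs(x - mean) / std > self.threshold
        self.values.append(x)
        return anomaly
```

Fed a steady sensor stream followed by a spike, the detector flags only the spike, with no network connection involved.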

Edge AI can run on a wide range of hardware, from existing central processing units (CPUs) to microcontrollers and advanced neural processing devices.

Edge AI is expected to drive the AI future by moving AI capabilities closer to the physical world. It opens up possibilities for new, robust, and scalable AI systems across multiple industries, including healthcare, manufacturing, smart homes, and security and surveillance.

Edge AI is growing in popularity as industries discover new ways to harness the power of edge AI to optimize workflows, automate business processes, and unlock new innovation opportunities while solving issues such as latency, security, and cost reduction.

 

- The Main Drivers of Moving AI to the Edge

Privacy is one of the main drivers pushing artificial intelligence to the edge, especially for consumer devices such as phones, smart speakers, home security cameras, and consumer robots. 

Network latency affects the autonomous mobility of drones, robots, and self-driving cars, with all of these device categories likely to have sub-millisecond latency requirements. Network latency also matters for the consumer experience: for example, Google's auto-suggestion apps on mobile phones target a 200-millisecond latency.

Bandwidth will impact vision-based applications such as augmented reality (AR), virtual reality (VR), and mixed reality (MR), typically delivered through head-mounted displays (HMDs). Bandwidth requirements will increase from 2 megabits per second (Mbps) to the current 20 Mbps, and then to 50 Mbps as HMDs support 360° 4K video.

Safety is an important consideration for AI use cases such as security cameras, self-driving cars, and drones. While hardened chips and secure hardware packaging are critical to preventing tampering or physical attacks, having edge devices store and process data locally often increases redundancy and reduces the number of security vulnerabilities. 

The cost of performing AI processing in the cloud and edge needs to consider the cost of AI device hardware, bandwidth costs, and the cost of AI cloud/server processing. 

The ability to run large-scale deep neural network (DNN) models on edge devices depends not only on improvements in hardware, but also on improvements in software and on techniques that can compress models to fit into small hardware form factors with limited power and performance capabilities.
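One widely used compression technique is post-training quantization: storing weights as 8-bit integers plus a shared scale, which cuts storage to a quarter of float32. The sketch below shows the simplest symmetric, per-tensor form; real toolchains (e.g., TensorFlow Lite) typically quantize per channel with calibration data:

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization: each float maps to an int8
    value in [-127, 127] plus one shared float scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for inference."""
    return [v * scale for v in q]

weights = [0.8, -1.27, 0.04, 0.5]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
```

Each restored weight differs from the original by at most half a quantization step (`scale / 2`), which is the accuracy/size trade-off that makes large models fit on small devices.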

 

- The Benefits of Moving AI to the Edge

There are several drivers for moving AI processing to the edge. In an edge AI environment, AI computations are done at the edge of a network, usually on the device where the data is created. 

This is different from using cloud infrastructure, where AI computations are done in a centralized facility. Here are some benefits of edge AI: 

  • Reduced latency: Processing data locally can help reduce latency and bandwidth requirements
  • Faster analysis: Edge devices perform computations locally, which can lead to faster analysis
  • Real-time data processing: Edge AI enables real-time data processing and analysis without reliance on cloud infrastructure
  • Reduced reliance on external resources: Edge devices can reduce reliance on external resources

Edge AI does not require constant connectivity to, or integration with, external systems, allowing users to process data on the device in real time.
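The latency benefit can be sketched with simple arithmetic. The numbers below (40 ms on-device inference, a 500 KB camera frame, a 20 Mbps uplink, a 60 ms round trip) are illustrative assumptions, not measurements:

```python
def end_to_end_latency_ms(inference_ms, payload_kb=0.0, bandwidth_mbps=0.0, rtt_ms=0.0):
    """End-to-end latency estimate: inference time, plus (for the cloud
    path) the network round trip and the time to upload the payload."""
    transfer_ms = (payload_kb * 8 / 1000) / bandwidth_mbps * 1000 if bandwidth_mbps else 0.0
    return inference_ms + rtt_ms + transfer_ms

# On-device: a slower chip, but no network hop at all.
edge_ms = end_to_end_latency_ms(inference_ms=40)

# Cloud: a fast accelerator, but the frame must cross the network first.
cloud_ms = end_to_end_latency_ms(inference_ms=5, payload_kb=500, bandwidth_mbps=20, rtt_ms=60)
```

Under these assumptions the cloud path spends 200 ms just uploading the frame, so the edge device wins despite its slower silicon.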

Some examples of edge AI applications include: 

  • Facial recognition
  • Object detection
  • Pose estimation
  • Autonomous AI video surveillance software

 

- 5G and Edge Computing

5G edge computing creates huge opportunities for every industry. It brings compute and data storage closer to where the data is generated, enabling better data control, cost reduction, faster insights and actions, and continuous operations.

5G and edge computing are two very different technologies with very complementary properties. 5G increases data transmission speeds, while edge computing reduces the back and forth between the data center and the cloud, thereby reducing unnecessary traffic on the network. 

The combination of the two enhances the digital experience, improves network performance, and opens up opportunities for a new generation of applications in nearly every industry. As more and more devices connect to the cloud, 5G and edge computing will become critical to improving application performance and processing large amounts of real-time data.

It’s too early to predict exactly what new applications and use cases 5G and edge computing will enable in the future. Still, one thing is certain: without 5G and edge computing, those applications will be vulnerable to latency and lag.

 


- Edge AI vs. Cloud AI

Edge AI and cloud AI are both types of AI that involve storing, managing, and processing data. However, they are used in different ways and have some key differences. 

Here are some differences between edge AI and cloud AI:

  • Location of processing: Edge AI processes data locally, closer to the source, while cloud AI processes data on remote servers.
  • Bandwidth: Edge AI requires lower bandwidth because it processes data locally on the device. Cloud AI involves data transmission to distant servers, which requires higher network bandwidth.
  • Latency: Edge AI offers low latency due to proximity to data sources. Cloud AI may have higher latency due to data transmission.
  • Use cases: Edge AI is used for real-time processing, data privacy, and reduced bandwidth usage. Cloud AI is used for complex computations, large-scale data analysis, and applications where latency is less of a concern.
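These trade-offs can be condensed into a toy placement rule. This is illustrative only: the 50 ms cutoff is an arbitrary assumption, and real deployments weigh many more factors (cost, power, model size, regulation):

```python
def choose_deployment(latency_budget_ms, data_is_sensitive, needs_large_model):
    """Toy rule of thumb following the trade-offs above: tight latency or
    privacy constraints favor the edge; very large models favor the cloud."""
    if latency_budget_ms < 50 or data_is_sensitive:
        return "edge"
    if needs_large_model:
        return "cloud"
    return "edge"
```

For example, a real-time robot controller with a 10 ms budget lands on the edge even if its model is large, while relaxed-latency, large-scale analytics lands in the cloud.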

Edge AI is designed to monitor specific devices, analyze a limited amount of data, and predict the behavior of specific systems. Cloud AI can ingest and analyze vast amounts of data from many sources and provide a holistic view of an entire process or ecosystem. 

Edge AI hardware can be vulnerable to attacks by threat actors because it lives at the edge, outside the security perimeter. Attackers may exploit vulnerabilities in the hardware to gain unauthorized access, disrupt operations, or extract sensitive information.

Cloud service providers implement appropriate security measures to prevent data theft, leaks, or security breaches. However, when using third-party cloud services, data protection and privacy policies depend on the cloud provider.

 
 

[More to come ...]


