
Distributed Cloud Edge Computing

(Jungfrau, Switzerland - Alvin Wei-Cheng Wong)


From Data Center, To Cloud, To Edge 

- Moving Beyond Cloud Computing To Edge Computing

Edge computing is the next big wave of technology architecture in information management, moving processing power away from centralized and cloud data centers and closer to the origins of physical data. Edge computing allows processing to be performed locally at multiple decision points for the purpose of reducing network traffic.

The goal of moving closer to the edge - that is, within miles of the customer premises - is to boost the performance of the network, enhance the reliability of services, and reduce the cost of moving data to distant servers for computation, thereby mitigating bandwidth and latency issues. Next-generation applications focused on machine-to-machine interaction, built on concepts like the Internet of Things (IoT), AI, and machine learning, will shift the focus to edge computing.

Organizations that rely heavily on data are increasingly likely to use cloud, fog, and edge computing infrastructures. These architectures allow organizations to take advantage of a variety of computing and data storage resources, including IoT. Forecasts put the number of IoT devices worldwide at 30 billion by 2020 and more than 75 billion connected things by 2025. All these devices will produce huge amounts of data that will have to be processed quickly and in a sustainable way. To meet the growing demand for IoT solutions, fog/edge computing comes into play alongside cloud computing.


- The Definitions for Edge Computing, Mobile Edge Computing, and Multi-access Edge Computing

Edge computing refers to computing happening at the edge of a network. Various access points define the network edge, hence the name for its architectural standard, Multi-access Edge Computing (MEC). Edge access points include cell phone towers, routers, WiFi, and local data centers. Edge computing in telecom, often referred to as Mobile Edge Computing (or Multi-access Edge Computing), provides execution resources (compute and storage) for applications with networking close to the end users, typically within or at the boundary of operator networks.

The acronym MEC is used interchangeably to stand for Mobile Edge Computing or Multi-access Edge Computing. In September 2017, the European Telecommunications Standards Institute (ETSI) Industry Specification Group (ISG) officially changed its name from "Mobile Edge Computing" to "Multi-access Edge Computing". MEC characteristics include: proximity, ultra-low latency, high bandwidth, and virtualization.


(Distributed Cloud Edge Computing - CableLabs)

- The Need For Edge Computing

The growth of the wireless industry and new technology implementations over the past two decades have driven a rapid migration from premises data centers to cloud servers. However, with the increasing number of Internet of Things (IoT) applications and devices, performing computation at either data centers or cloud servers may not be an efficient approach.

Cloud computing requires significant bandwidth to move data from the customer premises to the cloud and back, further increasing latency. With stringent latency requirements for IoT applications and devices requiring real-time computation, computing capabilities need to be at the edge - closer to the source of data generation.

Edge devices can include many different things: an IoT sensor, an employee's notebook computer, the latest smartphone, a security camera, or even the Internet-connected microwave oven in the office break room. Edge gateways themselves are considered edge devices within an edge-computing infrastructure.

The edge gateway is the core element in edge/fog computing. As the name suggests, it provides gateway functions: it connects sensors/nodes at one end, performs one or more local functions, and extends bi-directional communications to the cloud.
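The gateway role described above can be sketched in a few lines of Python. Everything here - the class name, the batch size, the summary fields, and the upstream callback - is an illustrative assumption rather than a specific product API:

```python
import statistics
from typing import Callable

class EdgeGateway:
    """Minimal edge-gateway sketch: buffers sensor readings locally,
    applies a local function (summarization), and forwards compact
    summaries upstream to the cloud."""

    def __init__(self, upstream: Callable[[dict], None], batch_size: int = 10):
        self.upstream = upstream        # hypothetical cloud-side callback
        self.batch_size = batch_size
        self.buffer = []

    def ingest(self, sensor_id: str, value: float) -> None:
        # Local processing: raw readings stay at the edge.
        self.buffer.append(value)
        if len(self.buffer) >= self.batch_size:
            self.flush(sensor_id)

    def flush(self, sensor_id: str) -> None:
        # Bi-directional link to the cloud: only a compact summary travels.
        summary = {
            "sensor": sensor_id,
            "count": len(self.buffer),
            "mean": statistics.mean(self.buffer),
            "max": max(self.buffer),
        }
        self.upstream(summary)
        self.buffer.clear()

received = []
gw = EdgeGateway(upstream=received.append, batch_size=5)
for v in [20.1, 20.3, 20.2, 25.0, 20.4]:
    gw.ingest("temp-1", v)
# One small summary dict reaches the cloud instead of 5 raw readings.
```

The design choice is the essence of the gateway: raw data is consumed at the network edge, and only the result of the local function crosses the wide-area link.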


- The Benefits of Edge Computing

Edge computing can be placed at enterprise premises, for example inside factory buildings, in homes, and in vehicles, including trains, planes and private cars. The edge infrastructure can be managed or hosted by communication service providers or other types of service providers. Several use cases require various applications to be deployed at different sites. In such scenarios, a distributed cloud is useful: it can be seen as an execution environment for applications spanning multiple sites, with connectivity managed as one solution.

With edge computing, each intelligent device - including smartphones, drones, sensors, robots and autonomous cars - shifts some of the data processing from the cloud to the edge. The cloud will continue to be used to manage IoT devices and to analyze large datasets in use cases where immediate action is not imperative.

The benefits of edge computing manifest in these areas:

  • Latency: moving data computing to the edge reduces latency.
  • Bandwidth: pushing processing to edge devices, instead of streaming data to the cloud for processing, decreases the need for high bandwidth while improving response times.
  • Security: from a certain perspective, edge computing provides better security because data does not traverse a network, instead staying close to the edge devices where it is generated.
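A back-of-envelope calculation illustrates the bandwidth benefit above. All figures (fleet size, payload sizes, reporting interval) are illustrative assumptions, not measurements:

```python
# Compare streaming raw sensor data to the cloud versus sending
# edge-computed summaries. Figures are assumptions for illustration.

SENSORS = 1_000                  # assumed fleet size
RAW_BYTES_PER_SEC = 200          # assumed raw payload per sensor per second
SUMMARY_BYTES = 64               # assumed size of one edge summary
SUMMARY_INTERVAL_SEC = 60        # edge devices report once a minute

raw_bps = SENSORS * RAW_BYTES_PER_SEC                      # cloud-only model
edge_bps = SENSORS * SUMMARY_BYTES / SUMMARY_INTERVAL_SEC  # edge model

print(f"raw streaming : {raw_bps:,.0f} B/s")
print(f"edge summaries: {edge_bps:,.0f} B/s")
print(f"reduction     : {raw_bps / edge_bps:.1f}x")
```

Even with these modest assumptions, summarizing at the edge cuts upstream traffic by two orders of magnitude, which is the core of the bandwidth argument.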


- Multicloud, Fog, and Edge Computing Architectures

Today, a multicloud strategy - in which enterprises use public clouds, on-premises private clouds, and hybrid models - has become the most assured path to cloud success. Multicloud also applies to high-bandwidth applications and devices, which will increasingly benefit from edge computing architectures. The growth of new technologies such as 5G wireless necessitates multicloud approaches, including edge computing architectures.

Edge computing brings cloud resources - compute, storage and networking - closer to applications, devices and users. It does so by using small-cell stations that enable data to travel at high speeds without having to travel long distances to a cloud or data center. With an edge computing architecture, complex event processing happens in the device or in a system close to the device, which eliminates round-trip issues and enables actions to happen more quickly.

The trend in edge computing is to bring machine learning, artificial intelligence, Internet of Things (IoT) data processing, the ability to run containers, and even the ability to run full virtual machines directly into a wide range of devices. These devices may be as small as a camera or as large as full compute racks for complex processing. Regardless of the size and capabilities of the device, the software on these devices is connected to the cloud in some form.  


(How Edge Computing Works - Networkworld)

- Edge Computing Will Augment Cloud Computing

However, we are not looking at a complete shift. In the same way that cloud computing has not and will not fully replace centralized data centers, edge computing will augment rather than replace cloud computing. The new paradigm of “processing anywhere” means that data will be processed where it originates and ingested into workflows aligned with business requirements.

Edge computing will forever alter how businesses interact with the physical world. Whether you consider it revolutionary or evolutionary, it is well on its way to mainstream adoption. Edge computing provides compute and storage resources with adequate connectivity (networking) close to the devices generating traffic.

Due to the increased data collected, both the physical environment of the edge (i.e., processing power in devices) and its virtual capacities (i.e., software-partitioned computing machines deployed within purpose-built edge hardware such as routers, device servers, terminal servers, and gateways) will evolve.


Going Beyond Edge Computing with Distributed Cloud


- Distributed Cloud

Distributed cloud computing refers to having computation, storage, and networking in a micro-cloud located outside the centralized cloud. It generalizes the cloud computing model to position, process, and serve data and applications from geographically distributed sites to meet requirements for performance, redundancy and regulations. Examples of a distributed cloud include both fog computing and edge computing. Establishing a distributed cloud situates computing closer to the end user, providing decreased latency and opportunities for increased security. 
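One way to picture a distributed cloud's placement decision is a simple latency-and-capacity lookup across geographically dispersed sites. The site names, latencies, and capacities below are illustrative assumptions, not a real orchestration API:

```python
# Sketch: a distributed cloud picks the serving site per workload based
# on measured latency, falling back to the central cloud when edge
# sites are full. All values are illustrative assumptions.

sites = {
    "edge-zurich":   {"latency_ms": 4,  "capacity": 10},
    "edge-geneva":   {"latency_ms": 9,  "capacity": 0},    # currently full
    "central-cloud": {"latency_ms": 55, "capacity": 500},
}

def place_workload(sites: dict) -> str:
    """Choose the lowest-latency site that still has free capacity."""
    candidates = [(s["latency_ms"], name)
                  for name, s in sites.items() if s["capacity"] > 0]
    return min(candidates)[1]

print(place_workload(sites))  # edge-zurich
```

A real distributed cloud adds many more constraints (data-residency rules, cost, redundancy), but the principle is the same: the platform, not the application, decides which geographic site serves each request.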

Edge computing is a solution where data is processed as close as possible to the place where it is generated. Applications that can benefit from edge computing are those where low latency and high throughput are critical, or where it is too expensive to send the data back to a distant cloud for processing. Other ways edge computing offers benefits includes cases where the transport network is bandwidth constrained or unreliable, or the data is too sensitive to be sent over public networks, even if encrypted. 

Therefore, edge computing is not a different computing paradigm but an extension of distributed cloud computing. The two models can be reconciled by considering edge computing resources as a “micro” cloud data center, with the edge storage and computing resources connected to larger cloud data centers for big data analysis and bulk storage. 


- Creating the Next-Generation Edge-Cloud Ecosystem

Distributed cloud computing expands the traditional, large data center-based cloud model to a set of distributed cloud infrastructure components that are geographically dispersed.
Distributed cloud computing continues to offer on-demand scaling of computing and storage while moving it closer to where these are needed for improved performance. Edge computing is a complementary aspect of distributed cloud computing, and represents the farthest end of a distributed cloud architecture.

Edge computing has great potential to help communication service providers improve content delivery, enable extreme low-latency use cases and meet stringent legal requirements on data security and privacy. To succeed, they need to deliver solutions that can host different kinds of platforms and provide a high level of flexibility for application developers.




 [More to come ...]





