What is Edge Computing?

The practice of placing processing power near the “edge” of a network, so that data can be processed locally before entering a Wide Area Network.

Overview

What it is: In Edge Computing, processing power is placed near the “edge” of a network, so that data can be processed locally rather than being passed across a Wide Area Network. This is different from typical cloud computing, in which data is routed to a centralized location for processing. By keeping data local, Edge Computing can reduce data transfers, improve response times, and potentially improve security and resilience.

What it does: Edge Computing places computing power close to where it is needed, whether in the IoT devices themselves or in servers located near the edge of the network. This reduces latency, which is vital in use cases such as autonomous vehicles and which can improve the consumer experience. It also avoids the data traffic jams that can occur on Wide Area Networks, as well as exposure to outages at centralized computing locations.
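To make this concrete, here is a minimal sketch of edge-side preprocessing: an edge node aggregates raw sensor readings locally and sends only a compact summary across the WAN. The gateway URL, payload shape, and function names are illustrative assumptions, not part of any specific product.

```python
# Minimal sketch: an edge node summarizes raw sensor readings locally and
# forwards only the summary upstream, cutting WAN traffic and round trips.
import json
import statistics
import urllib.request

def summarize(readings: list[float]) -> dict:
    """Reduce a raw batch of readings to a small summary payload."""
    return {
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "min": min(readings),
        "max": max(readings),
    }

def forward_summary(summary: dict, gateway_url: str) -> None:
    """Send only the summary across the WAN to the central service."""
    req = urllib.request.Request(
        gateway_url,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()

if __name__ == "__main__":
    # Raw readings never leave the edge node; only a few numbers cross the WAN.
    raw_batch = [21.3, 21.5, 22.0, 21.8, 21.7]
    print(summarize(raw_batch))
    # forward_summary(summarize(raw_batch), "https://central.example.com/ingest")
```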

Why it matters: In the era of the cloud, it’s easier than ever for companies to scale their computing power. However, as that power becomes more concentrated, it places growing stress on networking capacity and creates architectural bottlenecks. Meanwhile, smart buildings, smart vehicles, and many other innovations of the IoT era rely on fast, reliable bi-directional transmission of information. Edge Computing enables this.

What to do about it: If your products, whether applications or devices, collect and process large quantities of data near the edge, consider Edge Computing solutions as a way of lowering latency. Note that services such as Amazon CloudFront make it easy to integrate edge servers with centralized compute. Keep in mind that Edge Computing can introduce security risks, since decentralizing computing power creates a larger attack surface.
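As a rough illustration of the kind of logic that can run at the edge, below is a hedged sketch of a Lambda@Edge-style viewer-request handler in Python. It answers a hypothetical /healthz path directly at the CloudFront edge location instead of forwarding the request to the centralized origin; the path and response body are illustrative assumptions.

```python
# Hedged sketch of a Lambda@Edge-style viewer-request handler (Python runtime).
# It answers a hypothetical /healthz path at the edge, so the request never
# crosses the WAN to the origin; all other requests pass through unchanged.
import json

def handler(event, context):
    request = event["Records"][0]["cf"]["request"]

    if request["uri"] == "/healthz":
        # Respond at the edge: no round trip to the centralized origin.
        return {
            "status": "200",
            "statusDescription": "OK",
            "headers": {
                "content-type": [
                    {"key": "Content-Type", "value": "application/json"}
                ]
            },
            "body": json.dumps({"status": "healthy"}),
        }

    # Everything else continues to the origin as usual.
    return request
```

Logic like this keeps trivial requests off the WAN entirely while leaving the rest of the traffic to flow through to centralized compute.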
