More computing is happening on edge devices to handle the information overload.
By 2020, an average internet user will use 1.5GB of traffic a day, and daily video traffic will reach 1PB, Intel predicts. A huge amount of data will be generated by autonomous vehicles, mobile devices, and internet-of-things devices.
Every day, more information is being collected and sent to faster servers in mega data centers, which analyze and make sense of it. That analysis has helped improve image and speech recognition and is making autonomous cars a reality.
Emerging superfast data networks like 5G — a melting pot of wireless technologies — will carry even more of that gathered data, which could stress data centers. Servers are already being redesigned to handle more data, and throughput technologies like Gen-Z and fiber optics will reduce latency.
At Mobile World Congress this week, computing on the edge was a big topic among infrastructure providers. Edge computing involves light processing on intermediary servers at the edge of the network. This type of computing can lead to faster responses for mobile services, without putting stress on servers in the core network.
Edge computing can also provide instant analytics, ensuring junk data is discarded and only useful information reaches servers on the core network. Edge computing is also emerging as a platform for virtualization, which slices large data loads into smaller packages that are rerouted to the right servers to be handled.
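The filtering role described above can be sketched as a simple gate running on an edge node: in-range sensor readings are treated as junk and dropped, while anomalies are forwarded to the core. The field names and thresholds here are hypothetical, for illustration only — they are not from any product shown at MWC.

```python
# Hypothetical edge-side filter: IoT sensor readings are inspected at the
# edge, and only out-of-range (interesting) ones are forwarded to the core.
# Thresholds and record fields are illustrative assumptions.

def filter_at_edge(readings, low=10.0, high=90.0):
    """Discard in-range 'junk' readings; forward only anomalies."""
    forwarded = [r for r in readings if r["value"] < low or r["value"] > high]
    discarded = len(readings) - len(forwarded)
    return forwarded, discarded

readings = [
    {"sensor": "temp-1", "value": 22.5},   # normal -> discarded at the edge
    {"sensor": "temp-2", "value": 95.1},   # anomaly -> forwarded to the core
    {"sensor": "temp-3", "value": 45.0},   # normal -> discarded at the edge
]
forwarded, discarded = filter_at_edge(readings)
```

Even this toy policy cuts the traffic reaching the core by two-thirds, which is the point of doing analytics at the point of collection.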
At MWC, many companies showed products and shared new ideas for computing on the edge. Most vendors had a common goal: to better control the flow of data and make better sense of it, especially as more of it is collected by IoT devices.
Hewlett Packard Enterprise showed off its latest generation of IoT servers, called Edgeline, which resemble slimmed-down versions of the company’s ProLiant servers. The servers sit on the edge and can analyze data before sending it on to core servers in data centers. The servers can also virtualize data packets on the edge, which makes better use of computing resources.
For example, the Edgeline servers can handle mobile computing tasks, like responding to Facebook posts or search requests, on the edge. More intense data tasks, like image recognition, can be routed back to the centralized data center network, which handles machine learning tasks.
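The split described above amounts to a routing policy: latency-sensitive, lightweight requests are served at the edge, while compute-heavy jobs go back to the data center. A minimal sketch, with hypothetical task names that are not part of any HPE API:

```python
# Illustrative edge-vs-core routing policy. The task categories below are
# assumptions drawn from the article's examples, not a real product API.

EDGE_TASKS = {"social_post", "search_request"}          # light, latency-sensitive
CORE_TASKS = {"image_recognition", "machine_learning"}  # compute-heavy

def route(task):
    """Decide where a request should be handled."""
    if task in EDGE_TASKS:
        return "edge"
    if task in CORE_TASKS:
        return "core"
    return "core"  # unknown work defaults to the data center
```

The default-to-core branch reflects the architecture in the article: the edge handles only what it is explicitly equipped for, and everything else falls through to the centralized network.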
Nearby, Dell was showing off its Edge Gateway 3000 servers for multiple IoT applications. The servers can analyze data at the point of collection and dispatch it to the data center. These servers use Intel’s latest Atom chips, which have up to 16 cores.
The Model 3001 is targeted toward industrial automation and energy management; the Model 3002 is aimed at transportation and logistics and fits into large vehicles like trucks; and the Model 3003 is designed for video streaming and point-of-sale applications.
The servers use new Xeon-D or Atom chips from Intel, which the chipmaker showed on the show floor. The chips are targeted at edge servers and networking gear, and they are not as powerful as the Xeon E5 chips used in mainstream servers in data centers.
Ubuntu showed off some Open Compute Project network router designs with edge processing. The routers ran Ubuntu Snappy Core and can run software for edge processing. These routers are servers in their own right and reduce the load on high-end servers. Ubuntu “snaps” — which are like apps — can be tuned for specific edge processing tasks. More information about new edge router designs will be shared at the Open Compute Project Summit next week.
Another interesting demo at the Ubuntu booth showed an ARM server doing edge processing for 5G networks. The ARM servers had 64-bit Cavium ThunderX2 chips that are already being used in networking gear but also could double up as low-power server chips for more powerful edge processing.
Powerful CPUs aren’t needed for edge processing, but chips need to be able to collate large datasets and ensure they are virtualized and redirected properly. ARM server chips can virtualize data with the ARMv8 64-bit architecture, which has been adopted by chipmakers like Cavium.
Edge processing will be important for autonomous cars as well. The ultimate vision is that autonomous cars will be able to connect to superfast servers in the cloud to recognize images, draw up maps, and make better driving decisions. But those connections may not always be possible, which is where edge processing on the autonomous car will step in.
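The fallback described above can be sketched as a simple dispatch: prefer the cloud model when the link is up, and drop to onboard processing when it isn’t. Everything here — the function names and the stub models — is hypothetical, standing in for whatever recognition stack a real vehicle would run.

```python
# Sketch of cloud-first inference with an onboard fallback for an
# autonomous car. All names and models are illustrative assumptions.

def cloud_infer(frame):
    return "pedestrian (cloud model)"     # stub for a large cloud model

def onboard_infer(frame):
    return "pedestrian (onboard model)"   # stub for a smaller local model

def classify(frame, cloud_available):
    """Use cloud recognition when reachable; otherwise process onboard."""
    if cloud_available:
        return cloud_infer(frame), "cloud"
    return onboard_infer(frame), "onboard"

label, where = classify("camera-frame", cloud_available=False)
```

The key property is that the car always gets an answer: connectivity changes only *where* the work runs, never *whether* it runs.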
Intel and Qualcomm are providing distinct circuitry for handling edge processing in autonomous vehicles. Intel is pushing FPGAs (field-programmable gate arrays) into autonomous vehicles to handle specific tasks, while Qualcomm is relying on digital signal processors. FPGAs and other such chips are key components for autonomous cars and a flexible 5G infrastructure, said Venkata Renduchintala, president of the Intel Client and Internet of Things businesses and the Systems Architecture Group.
Edge processing is also becoming a priority for telecom providers, which are looking for ways to add flexibility to the services they provide to customers. For example, SK Telecom — which is testing 5G rollouts in South Korea — found edge processing valuable in virtualizing its data and radio networks.