As 5G becomes a reality – following the launch of next-generation networks in the UK, the US and Spain, and with more companies adopting internet of things (IoT) services – an important supplementary technology that is receiving growing attention is edge computing.
One company that looks poised to capture some of the early opportunities associated with edge computing is Nvidia. Traditionally a specialist in graphics processing units (GPUs) for the gaming industry, Nvidia is approaching edge computing by partnering with multiple technology vendors and integrating with the leading public cloud providers. This should help to establish it as a leading player in this emerging technology sector, and it is notable that it already has more than 40 early adopters of its new edge computing solutions, including major multinationals such as BMW Group and Foxconn.
Edge computing refers to a set of initiatives for bringing computing power closer to places where data is collected and where digital content and applications are consumed. Recent months have seen a growing number of information and communications technology (ICT) companies target the opportunities associated with edge computing.
They include telecoms network operators such as Deutsche Telekom and AT&T, public cloud service providers such as Amazon Web Services and Microsoft, and IT infrastructure vendors such as Dell EMC and Hewlett Packard Enterprise.
In recent months Nvidia has also extended its focus to edge computing, with specific initiatives aimed at developing edge technologies that combine artificial intelligence (AI) capabilities with software for managing distributed IT footprints.
Last month, Nvidia announced a new server that uses AI to analyse and process data generated by IoT sensors close to the points of data collection. With ever larger volumes of data expected to be generated by IoT sensors and 5G mobile networks, it will become essential for cost and performance reasons to process this data close to where it is generated, instead of transporting it over long distances to central data centres. The benefits of using AI at the edge include the ability to make more intelligent, real-time decisions about which data to keep and how to use it.
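The edge-filtering idea described above can be illustrated with a minimal sketch. This is not Nvidia's actual software: the function names, threshold logic and sample values are all hypothetical, and the simple deviation check stands in for the AI inference that would run on an edge server in practice.

```python
# Illustrative sketch: filtering IoT sensor data at the edge so that only
# significant readings are forwarded to a central data centre.
# The "model" here is a stand-in threshold check; a real edge deployment
# would run an AI inference engine on the local server instead.

def is_significant(reading, baseline=20.0, tolerance=2.5):
    """Stand-in for an edge AI model: flag readings that deviate
    noticeably from an expected baseline."""
    return abs(reading - baseline) > tolerance

def filter_at_edge(readings):
    """Keep only the readings worth transmitting upstream, discarding
    routine data close to where it was generated."""
    return [r for r in readings if is_significant(r)]

# Hypothetical batch of temperature readings from local sensors.
sensor_batch = [19.8, 20.1, 27.4, 20.3, 12.9, 20.0]
to_transmit = filter_at_edge(sensor_batch)
print(to_transmit)  # only the anomalous readings leave the edge
```

The point of the sketch is the shape of the decision, not the model: most readings are discarded locally, so only a small fraction of the raw data ever crosses the network to a central data centre.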
Nvidia’s EGX server combines server, storage, networking and security technologies from various technology partners, with a choice of Nvidia AI processors. Its AI processors range from high-performance T4 products that are designed for real-time speech recognition and other real-time AI operations, through to the 70x45mm Jetson Nano, which was announced in March and is designed to power small, low-power AI systems such as those found in embedded IoT applications.
The new EGX server will also make use of a new software platform called Nvidia Edge Stack. This allows users to manage complex distributed IT footprints comprising multiple edge servers and IoT sensor locations. The platform also connects to major cloud IoT services such as AWS IoT Greengrass and Microsoft Azure IoT Edge. This ensures that AI applications developed in the cloud can run on Nvidia EGX and vice versa.
Nevertheless, despite this strong start in providing tailored AI-powered solutions for emerging edge computing requirements, it is likely that Nvidia and other edge technology providers will have to help potential customers address a whole range of adjacent concerns related to edge computing and distributed IT. These include all-important questions about security, architecture design, and the placement and management of diverse sets of workloads.