IEEE Anoints OpenFog for Cloud Services And IoT

The IEEE Standards Association (IEEE-SA) has adopted the OpenFog Consortium’s reference architecture for fog computing. This may prove significant to broadcasters in the OTT era.

Fog computing, a term coined by Cisco, refers to a model for distributed IT-based services originating in the cloud, designed to optimize performance, cost and scalability. For broadcasters, the significance is that this standard will underpin future IP-based infrastructures delivering increasingly online services, where low latency and real-time analytics are key.

Fog computing is a horizontal architecture that distributes resources and services of computing, storage, control and networking across a whole cloud from centralized server farms out to edge devices. It has been designed to support multiple industry vertical sectors and application domains including broadcasting, enabling services and applications to be distributed closer to the data-producing sources.

This does not automatically mean pushing everything out to the network edge. Indeed, fog is distinct from edge computing in that resources can be deployed anywhere along the service distribution line according to need, balancing cost against performance and scale. After all, the principle of cloud computing is to exploit centralized and virtualized commodity hardware where possible to optimize performance and scale against cost. The fog concept modifies this by recognizing that many applications, especially in broadcasting and around the Internet of Things (IoT), require low latency, both to deliver the primary service and to analyze data generated by end devices in real time to make decisions such as content recommendations.
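The placement logic described above can be pictured as a simple decision on where along the cloud-to-things continuum a workload should run, driven by its latency budget. The following sketch is purely illustrative and not part of the standard; the tier names, thresholds and example workloads are hypothetical:

```python
# Illustrative sketch only: placing a workload along the cloud-to-things
# continuum based on its latency budget. Thresholds are hypothetical.

def place_workload(latency_budget_ms: float) -> str:
    """Return a placement tier for a workload given its latency budget."""
    if latency_budget_ms < 10:
        return "edge"   # e.g. real-time device control
    if latency_budget_ms < 100:
        return "fog"    # e.g. live analytics, content recommendations
    return "cloud"      # e.g. batch transcoding, archiving

print(place_workload(5))    # edge
print(place_workload(50))   # fog
print(place_workload(500))  # cloud
```

The point of the fog model is precisely that this decision is made per application, rather than forcing everything to either the central cloud or the extreme edge.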

The new standard, known as IEEE 1934, relies on the OpenFog Consortium’s reference architecture as a universal technical framework designed to underpin the data-intensive requirements of the IoT, 5G and AI applications in different sectors including broadcasting. It enables services and applications to be distributed closer to the data-producing sources, and extends from the things, over the network edges, through the cloud and across multiple protocol layers.

“We now have an industry backed and supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” said Helder Antunes, chairman of the OpenFog Consortium and senior director, Cisco. “This is a significant milestone for OpenFog and a monumental inflection point for those companies and industries that will benefit from the ensuing innovation and market growth made possible by the standard.”

The OpenFog Consortium was founded in 2016 to accelerate adoption of fog computing through an open, interoperable architecture. “The reference architecture provided a solid, high-level foundation for the development of fog computing standards,” said John Zao, Chair, IEEE Standards Working Group on Fog Computing & Networking Architecture Framework, which was sponsored by the IEEE Communications Society’s Edge, Fog, and Cloud Communications Standards Committee. “The OpenFog technical committee and the IEEE standards committee worked closely during this process and benefited from the collaboration and synergies that developed. We’re very pleased with the results of this standards effort.”

The OpenFog Reference Architecture, released in February 2017, is based on 10 technical principles representing key attributes a system needs to embrace to be defined as “OpenFog.” These are security, scalability, openness, autonomy, reliability, availability, serviceability, agility, hierarchy and programmability. The reference architecture, now the IEEE standard, addresses the need for an interoperable end-to-end data connectivity solution along the cloud-to-things continuum.
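As a quick illustration of the ten pillars listed above, they can be treated as a compliance checklist that a candidate system is measured against. This sketch is not drawn from the standard itself; the checklist function and its logic are purely hypothetical:

```python
# Illustrative only: the ten OpenFog pillars named in the article, modelled
# as a simple checklist. The evaluation logic is hypothetical.

OPENFOG_PILLARS = (
    "security", "scalability", "openness", "autonomy", "reliability",
    "availability", "serviceability", "agility", "hierarchy",
    "programmability",
)

def missing_pillars(supported: set[str]) -> list[str]:
    """Return the pillars a candidate system does not yet address."""
    return [p for p in OPENFOG_PILLARS if p not in supported]

print(missing_pillars({"security", "scalability", "openness"}))
```

A system would be considered “OpenFog” in the article’s sense only when no pillar is missing.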

The IEEE is headquartered in Piscataway, New Jersey, US.

“As a consortium, we developed the OpenFog Reference Architecture with the intention that it would serve as the framework for a standards development organization,” Antunes added. “We’re pleased to have worked so closely with the IEEE in this effort as the result is a standardized computing and communication platform that will serve as a catalyst to the next digital revolution.”

Adoption of the new standard by IEEE comes at a time when fog computing in general is ready for lift-off, according to a report just out from market research firm Million Insights. Its Fog Computing Market Size & Forecast Report, 2016 – 2025, valued the global fog computing market at just $9.33 million in 2016 but predicted it would reach $617.3 million by 2025. That number looks conservative and could be greatly exceeded if the IoT market takes off at the rate some analysts have predicted, although growth so far has fallen short of many expectations.

Million Insights identified the primary drivers for the fog computing market as an expected wave of IoT devices with parallel IoT connectivity, combined with subsequent growth in cloud computing and machine-to-machine (M2M) connectivity. Presumably fog as defined for this survey includes its derivative, mist computing, a lightweight version for some IoT and especially M2M applications involving small, low-powered devices.
