IEEE Anoints OpenFog for Cloud Services And IoT

The IEEE Standards Association (IEEE-SA) has adopted the OpenFog Consortium’s reference architecture for fog computing. This may prove significant to broadcasters in the OTT era.

Fog computing, a term coined by Cisco, refers to a model for distributed IT-based services originating in the cloud, designed to optimize performance, cost and scalability. For broadcasters, the significance is that this standard will underpin future IP-based infrastructures delivering increasingly online services, where low latency and real-time analytics are key.

Fog computing is a horizontal architecture that distributes resources and services of computing, storage, control and networking across a whole cloud from centralized server farms out to edge devices. It has been designed to support multiple industry vertical sectors and application domains including broadcasting, enabling services and applications to be distributed closer to the data-producing sources.

This does not automatically mean pushing everything out to the network edge. Indeed, fog is distinct from edge computing in that resources can be deployed anywhere along the service distribution line according to need, with the idea being to balance cost against performance and scale. After all, the idea of cloud computing is to exploit centralized and virtualized commodity hardware where possible to optimize performance and scale against cost. The fog concept modifies this idea by recognizing that many applications, especially in broadcasting and around the Internet of Things (IoT), require low latency, both to deliver the primary service and to analyze data generated by end devices in real time to make decisions such as content recommendations.
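The trade-off described above can be sketched as a toy placement decision. This is purely illustrative and not part of IEEE 1934; the tier names, latency figures and relative costs are invented assumptions chosen only to show the principle of placing a workload at the cheapest point on the cloud-to-edge continuum that still meets its latency requirement.

```python
# Toy illustration (not from the standard): pick the cheapest tier along
# the cloud-to-edge continuum that still meets a workload's latency budget.
# All figures below are invented for the example.

TIERS = [
    # (name, round-trip latency in ms, relative cost per unit of compute)
    ("edge", 5, 3.0),    # closest to devices: lowest latency, highest cost
    ("fog", 25, 1.5),    # regional fog node: intermediate on both counts
    ("cloud", 100, 1.0), # centralized data centre: cheapest, slowest
]

def place_workload(latency_budget_ms):
    """Return the cheapest tier whose latency fits within the budget."""
    candidates = [t for t in TIERS if t[1] <= latency_budget_ms]
    if not candidates:
        raise ValueError("no tier can meet this latency budget")
    return min(candidates, key=lambda t: t[2])[0]

# A real-time analytics job needing sub-10 ms responses lands at the edge;
# a batch transcode that tolerates 200 ms goes to the cheaper cloud tier.
```

Under this simple model, only latency-critical work is pushed outward, which mirrors the article's point that fog does not mean moving everything to the network edge.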

The new standard, known as IEEE 1934, relies on the OpenFog Consortium’s reference architecture as a universal technical framework designed to underpin the data-intensive requirements of the IoT, 5G and AI applications in different sectors including broadcasting. It enables services and applications to be distributed closer to the data-producing sources, and extends from the things, over the network edges, through the cloud and across multiple protocol layers.

“We now have an industry backed and supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” said Helder Antunes, chairman of the OpenFog Consortium and senior director, Cisco. “This is a significant milestone for OpenFog and a monumental inflection point for those companies and industries that will benefit from the ensuing innovation and market growth made possible by the standard.”

The OpenFog Consortium was founded in 2016 to accelerate adoption of fog computing through an open, interoperable architecture. “The reference architecture provided a solid, high-level foundation for the development of fog computing standards,” said John Zao, Chair, IEEE Standards Working Group on Fog Computing & Networking Architecture Framework, which was sponsored by the IEEE Communications Society’s Edge, Fog, and Cloud Communications Standards Committee. “The OpenFog technical committee and the IEEE standards committee worked closely during this process and benefited from the collaboration and synergies that developed. We’re very pleased with the results of this standards effort.”

The OpenFog Reference Architecture, released in February 2017, is based on 10 technical principles representing key attributes a system needs to embrace to be defined as “OpenFog.” These are security, scalability, openness, autonomy, reliability, availability, serviceability, agility, hierarchy and programmability. The reference architecture, now the IEEE standard, addresses the need for an interoperable end-to-end data connectivity solution along the cloud-to-things continuum.

The IEEE is headquartered in Piscataway, New Jersey, US.

“As a consortium, we developed the OpenFog Reference Architecture with the intention that it would serve as the framework for a standards development organization,” Antunes added. “We’re pleased to have worked so closely with the IEEE in this effort as the result is a standardized computing and communication platform that will serve as a catalyst to the next digital revolution.”

Adoption of the new standard by the IEEE comes at a time when fog computing in general is ready for lift-off, according to a recent report from market research firm Million Insights. Its Fog Computing Market Size & Forecast Report, 2016 – 2025, valued the global fog computing market at just $9.33 million in 2016 but projected it to reach $617.3 million by 2025. That figure looks conservative and could be greatly exceeded if the IoT market takes off at the rate some analysts have predicted, although growth so far has fallen short of many expectations.

Million Insights identified the primary drivers of the fog computing market as an expected wave of IoT devices with parallel IoT connectivity, combined with subsequent growth in cloud computing and machine-to-machine (M2M) connectivity. Presumably fog as defined for this survey includes its derivative, mist computing, a lightweight version for some IoT and especially M2M applications involving small, low-powered devices.
