IEEE Anoints OpenFog for Cloud Services And IoT

The IEEE Standards Association (IEEE-SA) has adopted the OpenFog Consortium’s reference architecture for fog computing. This may prove significant to broadcasters in the OTT era.

Fog computing, a term coined by Cisco, refers to a model for distributed IT-based services originating in the cloud, designed to optimize performance, cost and scalability. For broadcasters the significance is that this standard will underpin future IP-based infrastructures delivering increasingly online services, where low latency and real-time analytics are key.

Fog computing is a horizontal architecture that distributes the resources and services of computing, storage, control and networking across the whole cloud continuum, from centralized server farms out to edge devices. It has been designed to support multiple industry vertical sectors and application domains, including broadcasting, enabling services and applications to be distributed closer to the data-producing sources.

This does not automatically mean pushing everything out to the network edge. Indeed, fog is distinct from edge computing in that resources can be deployed anywhere along the service distribution line according to need, with the idea being to balance cost against performance and scale. After all, the idea of cloud computing is to exploit centralized and virtualized commodity hardware where possible to optimize performance and scale against cost. The fog concept modifies this idea by recognizing that many applications, especially in broadcasting and around the Internet of Things (IoT), require low latency, both to deliver the primary service and to analyze data generated by end devices in real time to make decisions such as content recommendations.

The new standard, known as IEEE 1934, relies on the OpenFog Consortium’s reference architecture as a universal technical framework designed to underpin the data-intensive requirements of the IoT, 5G and AI applications in different sectors including broadcasting. It enables services and applications to be distributed closer to the data-producing sources, and extends from the things, over the network edges, through the cloud and across multiple protocol layers.

“We now have an industry backed and supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” said Helder Antunes, chairman of the OpenFog Consortium and senior director, Cisco. “This is a significant milestone for OpenFog and a monumental inflection point for those companies and industries that will benefit from the ensuing innovation and market growth made possible by the standard.”

The OpenFog Consortium was founded in 2016 to accelerate adoption of fog computing through an open, interoperable architecture. “The reference architecture provided a solid, high-level foundation for the development of fog computing standards,” said John Zao, Chair, IEEE Standards Working Group on Fog Computing & Networking Architecture Framework, which was sponsored by the IEEE Communications Society’s Edge, Fog, and Cloud Communications Standards Committee. “The OpenFog technical committee and the IEEE standards committee worked closely during this process and benefited from the collaboration and synergies that developed. We’re very pleased with the results of this standards effort.”

The OpenFog Reference Architecture, released in February 2017, is based on 10 technical principles representing key attributes a system needs to embrace to be defined as “OpenFog.” These are security, scalability, openness, autonomy, reliability, availability, serviceability, agility, hierarchy and programmability. The reference architecture, now the IEEE standard, addresses the need for an interoperable end-to-end data connectivity solution along the cloud-to-things continuum.
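The ten pillars listed above lend themselves to a simple checklist. The sketch below is merely an illustrative way to enumerate them; the pillar names come from the article, while the helper function and its use are hypothetical.

```python
# The ten OpenFog technical pillars named in the reference
# architecture, as a simple checklist. The helper below is a
# hypothetical illustration, not part of the standard.
OPENFOG_PILLARS = {
    "security", "scalability", "openness", "autonomy", "reliability",
    "availability", "serviceability", "agility", "hierarchy",
    "programmability",
}

def missing_pillars(declared: set) -> set:
    """Return the pillars a candidate system has not yet addressed."""
    return OPENFOG_PILLARS - declared

# A system that only addresses three pillars still has seven to go.
print(sorted(missing_pillars({"security", "scalability", "openness"})))
```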

The IEEE is headquartered in Piscataway, New Jersey, US.

“As a consortium, we developed the OpenFog Reference Architecture with the intention that it would serve as the framework for a standards development organization,” Antunes added. “We’re pleased to have worked so closely with the IEEE in this effort as the result is a standardized computing and communication platform that will serve as a catalyst to the next digital revolution.”

Adoption of the new standard by IEEE comes at a time when fog computing in general is ready for lift off, according to a report just out from market research firm Million Insights. Its Fog Computing Market Size & Forecast Report, 2016 – 2025, rated the global fog computing market at just $9.33 million in 2016 but set to reach $617.3 million by 2025. That forecast could be greatly exceeded if the IoT market takes off at the rate some analysts have predicted, although growth so far has fallen short of many expectations.

Million Insights identified the primary drivers of the fog computing market as an expected wave of IoT devices with parallel IoT connectivity, combined with subsequent growth in cloud computing and machine-to-machine (M2M) connectivity. Presumably fog as defined for this survey includes its derivative, mist computing, a lightweight version for some IoT and especially M2M applications involving small, low-powered devices.
