IEEE Anoints OpenFog for Cloud Services And IoT

The IEEE Standards Association (IEEE-SA) has adopted the OpenFog Consortium’s reference architecture for fog computing. This may prove significant to broadcasters in the OTT era.

Fog computing, a term coined by Cisco, refers to a model for distributed IT-based services originating in the cloud, designed to optimize performance, cost and scalability. For broadcasters the significance is that this standard will underpin future IP-based infrastructures delivering increasingly online services where low latency and real-time analytics are key.

Fog computing is a horizontal architecture that distributes resources and services of computing, storage, control and networking across a whole cloud from centralized server farms out to edge devices. It has been designed to support multiple industry vertical sectors and application domains including broadcasting, enabling services and applications to be distributed closer to the data-producing sources.

This does not automatically mean pushing everything out to the network edge. Indeed, fog is distinct from edge computing in that resources can be deployed anywhere along the service distribution line according to need, with the idea being to balance cost against performance and scale. After all, the idea of cloud computing is to exploit centralized and virtualized commodity hardware where possible to optimize performance and scale against cost. The fog concept modifies this idea by recognizing that many applications, especially in broadcasting and around the Internet of Things (IoT), require low latency, both to deliver the primary service and to analyze data generated by end devices in real time to make decisions such as content recommendations.
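The placement idea described above can be sketched in a few lines of code: pick the cheapest point along the cloud-to-edge continuum that still meets a workload's latency requirement. The tier names, latency and cost figures below are illustrative assumptions for the sketch, not values from the IEEE 1934 standard.

```python
# Hypothetical sketch of fog-style workload placement: choose the
# cheapest tier that still satisfies the latency requirement.
# Tiers, latencies and costs are assumed for illustration only.

TIERS = [
    # (name, typical round-trip latency in ms, relative cost per unit compute)
    ("edge",  5,   3.0),   # closest to devices: lowest latency, highest cost
    ("fog",   20,  1.5),   # regional fog node: mid latency, mid cost
    ("cloud", 100, 1.0),   # central data centre: highest latency, cheapest
]

def place(max_latency_ms):
    """Return the cheapest tier whose latency meets the requirement."""
    candidates = [(cost, name) for name, lat, cost in TIERS
                  if lat <= max_latency_ms]
    if not candidates:
        raise ValueError(f"no tier satisfies {max_latency_ms} ms")
    return min(candidates)[1]

print(place(10))    # real-time analytics lands at the edge
print(place(50))    # recommendation logic can sit in a fog node
print(place(500))   # batch workloads stay in the central cloud
```

With these assumed numbers, a 10 ms budget forces the workload to the edge, a 50 ms budget allows a cheaper fog node, and a relaxed 500 ms budget keeps it in the central cloud — the cost/latency trade-off the fog model formalizes.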

The new standard, known as IEEE 1934, relies on the OpenFog Consortium’s reference architecture as a universal technical framework designed to underpin the data-intensive requirements of the IoT, 5G and AI applications in different sectors including broadcasting. It enables services and applications to be distributed closer to the data-producing sources, and extends from the things, over the network edges, through the cloud and across multiple protocol layers.

“We now have an industry backed and supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” said Helder Antunes, chairman of the OpenFog Consortium and senior director, Cisco. “This is a significant milestone for OpenFog and a monumental inflection point for those companies and industries that will benefit from the ensuing innovation and market growth made possible by the standard.”

The OpenFog Consortium was founded in 2016 to accelerate adoption of fog computing through an open, interoperable architecture. “The reference architecture provided a solid, high-level foundation for the development of fog computing standards,” said John Zao, Chair, IEEE Standards Working Group on Fog Computing & Networking Architecture Framework, which was sponsored by the IEEE Communications Society’s Edge, Fog, and Cloud Communications Standards Committee. “The OpenFog technical committee and the IEEE standards committee worked closely during this process and benefited from the collaboration and synergies that developed. We’re very pleased with the results of this standards effort.”

The OpenFog Reference Architecture, released in February 2017, is based on 10 technical principles representing key attributes a system needs to embrace to be defined as “OpenFog.” These are security, scalability, openness, autonomy, reliability, availability, serviceability, agility, hierarchy and programmability. The reference architecture, now the IEEE standard, addresses the need for an interoperable end-to-end data connectivity solution along the cloud-to-things continuum.

The IEEE is headquartered in Piscataway, New Jersey, US.

“As a consortium, we developed the OpenFog Reference Architecture with the intention that it would serve as the framework for a standards development organization,” Antunes added. “We’re pleased to have worked so closely with the IEEE in this effort as the result is a standardized computing and communication platform that will serve as a catalyst to the next digital revolution.”

Adoption of the new standard by IEEE comes at a time when fog computing in general is ready for lift off, according to a report just out from market research firm Million Insights. Its Fog Computing Market Size & Forecast Report, 2016 – 2025, rated the global fog computing market at just $9.33 million in 2016 but set to reach $617.3 million by 2025. That number looks conservative and could be greatly exceeded if the IoT market takes off at the rate some analysts have predicted, although growth so far has fallen short of many expectations.

Million Insights identified the primary drivers of the fog computing market as an expected wave of IoT devices with parallel growth in IoT connectivity, combined with continued expansion of cloud computing and machine-to-machine (M2M) connectivity. Presumably fog as defined for this survey includes its derivative, mist computing, a lightweight version for some IoT and especially M2M applications involving small, low-powered devices.
