IEEE Anoints OpenFog for Cloud Services And IoT

The IEEE Standards Association (IEEE-SA) has adopted the OpenFog Consortium’s reference architecture for fog computing. This may prove significant to broadcasters in the OTT era.

Fog computing, a term coined by Cisco, refers to a model for distributed IT-based services originating in the cloud, designed to optimize performance, cost and scalability. For broadcasters the significance is that this standard will underpin future IP-based infrastructures delivering increasingly online services, where low latency and real-time analytics are key.

Fog computing is a horizontal architecture that distributes computing, storage, control and networking resources and services across the whole cloud continuum, from centralized server farms out to edge devices. It has been designed to support multiple industry vertical sectors and application domains, including broadcasting, enabling services and applications to be distributed closer to the data-producing sources.

This does not automatically mean pushing everything out to the network edge. Indeed, fog is distinct from edge computing in that resources can be deployed anywhere along the service delivery path according to need, the idea being to balance cost against performance and scale. After all, the point of cloud computing is to exploit centralized, virtualized commodity hardware where possible to optimize performance and scale against cost. The fog concept modifies this by recognizing that many applications, especially in broadcasting and around the Internet of Things (IoT), require low latency, both to deliver the primary service and to analyze data generated by end devices in real time to make decisions such as content recommendations.
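To make that cost-versus-latency trade-off concrete, here is a minimal sketch assuming a simple three-tier deployment; the tier names, latency figures and costs are illustrative assumptions and not part of the OpenFog architecture or IEEE 1934.

```python
# Hypothetical sketch: choosing where to place a workload along the
# cloud-to-edge continuum by balancing a latency budget against cost.
# Tier names, latencies and costs are illustrative only.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    round_trip_ms: float   # typical round-trip latency to end devices
    cost_per_hour: float   # relative compute cost

TIERS = [
    Tier("central-cloud", round_trip_ms=80.0, cost_per_hour=1.0),
    Tier("regional-fog",  round_trip_ms=20.0, cost_per_hour=2.5),
    Tier("edge-node",     round_trip_ms=5.0,  cost_per_hour=6.0),
]

def place_workload(latency_budget_ms: float) -> Tier:
    """Pick the cheapest tier that still meets the latency budget."""
    candidates = [t for t in TIERS if t.round_trip_ms <= latency_budget_ms]
    if not candidates:
        raise ValueError("No tier satisfies the latency budget")
    return min(candidates, key=lambda t: t.cost_per_hour)

print(place_workload(10).name)    # edge-node
print(place_workload(100).name)   # central-cloud
```

In this toy model a real-time analytics task, such as generating live recommendations, lands on an edge node, while a job with a relaxed latency budget stays in the cheaper central cloud.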

The new standard, known as IEEE 1934, is based on the OpenFog Consortium’s reference architecture as a universal technical framework designed to underpin the data-intensive requirements of IoT, 5G and AI applications across sectors including broadcasting. It extends from the things, over the network edges, through the cloud and across multiple protocol layers.

“We now have an industry-backed and supported blueprint that will supercharge the development of new applications and business models made possible through fog computing,” said Helder Antunes, chairman of the OpenFog Consortium and senior director, Cisco. “This is a significant milestone for OpenFog and a monumental inflection point for those companies and industries that will benefit from the ensuing innovation and market growth made possible by the standard.”

The OpenFog Consortium was founded in 2016 to accelerate adoption of fog computing through an open, interoperable architecture. “The reference architecture provided a solid, high-level foundation for the development of fog computing standards,” said John Zao, Chair, IEEE Standards Working Group on Fog Computing & Networking Architecture Framework, which was sponsored by the IEEE Communications Society’s Edge, Fog, and Cloud Communications Standards Committee. “The OpenFog technical committee and the IEEE standards committee worked closely during this process and benefited from the collaboration and synergies that developed. We’re very pleased with the results of this standards effort.”

The OpenFog Reference Architecture, released in February 2017, is based on 10 technical principles representing key attributes a system needs to embrace to be defined as “OpenFog.” These are security, scalability, openness, autonomy, reliability, availability, serviceability, agility, hierarchy and programmability. The reference architecture, now the IEEE standard, addresses the need for an interoperable end-to-end data connectivity solution along the cloud-to-things continuum.
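As a rough illustration of how those ten pillars might be treated in practice, here is a minimal sketch that expresses them as a checklist; the helper function and example capabilities are assumptions for illustration only, not a conformance test defined by IEEE 1934.

```python
# Illustrative sketch: the ten OpenFog pillars as a simple checklist.
OPENFOG_PILLARS = frozenset({
    "security", "scalability", "openness", "autonomy", "reliability",
    "availability", "serviceability", "agility", "hierarchy", "programmability",
})

def missing_pillars(declared_capabilities: set[str]) -> set[str]:
    """Return the pillars a candidate system has not yet addressed."""
    return set(OPENFOG_PILLARS - declared_capabilities)

# Example: a prototype edge deployment covering only part of the list.
prototype = {"security", "scalability", "reliability", "availability"}
print(sorted(missing_pillars(prototype)))
```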

The IEEE is headquartered in Piscataway, New Jersey, US.

“As a consortium, we developed the OpenFog Reference Architecture with the intention that it would serve as the framework for a standards development organization,” Antunes added. “We’re pleased to have worked so closely with the IEEE in this effort as the result is a standardized computing and communication platform that will serve as a catalyst to the next digital revolution.”

Adoption of the new standard by IEEE comes at a time when fog computing in general is ready for lift-off, according to a report just out from market research firm Million Insights. Its Fog Computing Market Size & Forecast Report, 2016 – 2025, valued the global fog computing market at just $9.33 million in 2016 but forecast it to reach $617.3 million by 2025. That figure looks conservative and could be greatly exceeded if the IoT market takes off at the rate some analysts have predicted, although growth so far has fallen short of many expectations.

Million Insights identified the primary drivers of the fog computing market as an expected wave of IoT devices and the parallel expansion of IoT connectivity, combined with continuing growth in cloud computing and machine-to-machine (M2M) connectivity. Presumably fog as defined for this survey includes its derivative, mist computing, a lightweight version aimed at some IoT and especially M2M applications involving small, low-powered devices.
