The Super Bowl isn’t just a Sunday afternoon world championship game. It’s a week-long event with massive TV coverage. Cameras and TV crews from around the globe converged on this year’s Super Bowl city, Houston, TX, where the game was played at NRG Stadium.
The broadcast and news crews were visible on the streets and in venues including the NFL Experience at the George R. Brown Convention Center, NFL On Location, Media Row and the team headquarters. The question is: how did all those broadcast teams get their signals back to their stations and networks live?
With an estimated 3,000 frequencies in use by more than 10,000 radios at the week-long event, clear communication and program links were top of mind for a small but highly trained coordination crew. These experts were charged with making sure all the backhaul and comms channels worked in a super-charged RF atmosphere.
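Some rough arithmetic puts those figures in perspective: 10,000 radios sharing roughly 3,000 frequencies averages out to more than three radios per assigned frequency, which suggests many comms channels had to be shared and carefully coordinated to keep users from stepping on one another.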
Contribution networks weren’t the only RF hogs at Houston’s NRG Stadium. The 72,000 fans eager to share the experience with friends over the internet and social media could quickly overload normal cellular services. Companies including Sprint, Verizon and AT&T installed COWs (Cells On Wheels, not the milking kind) around key venues to handle the extra load.
Under-seat enclosures at last year's Super Bowl at Levi's Stadium provided Wi-Fi access for fans. This year's installation at NRG Stadium is similar.
Once inside the stadium, Wi-Fi was key to fan satisfaction. To serve those customers, an additional 1,260 Wi-Fi access points were installed in the stadium complex.
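A quick back-of-envelope check shows why that density matters: 72,000 fans spread across 1,260 access points works out to roughly 57 potential users per access point if every seat holder connected at once.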
Want to know more about RF, cell, and streaming technology used at the 2017 Super Bowl? The exciting details are just ahead.