Is it a Camera, a Phone, a 2nd Screen or a Superhero Device?
This year's NAB convention provided many examples of new technology. Which should you choose?
There are many options when it comes to selecting new technological solutions, but not every new technology will move you forward. Remember 3D? Engineers and managers must carefully choose from among the best options.
This chapter in the series of articles looks at the different devices and systems that capture and record content. Acquisition has gone through one of the more profound changes in the move to IP. The term "professional camera" no longer has a precise meaning; many non-traditional devices now capture video suitable for broadcast.
At the same time, live production is still challenged by how studio and field cameras make the conversion to IP: the production switcher needs to transition (clean switch between sources) and layer (key) IP video as seamlessly as it does SDI. Content needs to become smarter as it moves through the media chain, beginning with acquisition.
Down memory lane
While it may be questionable whether to include a bit of history, it's worth a little reflection just to set the stage for the current dramatic change in content acquisition. There have always been a number of different technologies for capturing content. There were different film formats: 8, 16, 35 and 70mm, plus anamorphic and widescreen. Then came IMAX.
When video was introduced there were only large studio cameras, until the ENG or handheld camera arrived. Recording was done on videotape in various formats, but all VTRs used the ubiquitous BNC video connector, which made interfacing easy.
When the CCD or chip replaced the tube as the imaging device, resolution entered the conversation, as chip size and the number of chips (one or three) made more of a difference than they had with tubes.
As technology moved from analog to digital, the one common thread that remained was the BNC connector. In the beginning there were multiple compression formats (MPEG-1 and Motion JPEG) and tape wars (D1, D2, D3, D5, M and DigiBeta). There was both component and composite digital, plus serial and parallel interfaces. The industry ultimately settled on the Serial Digital Interface, SMPTE 259M (SDI), as the interoperable transport standard, while compression converged on MPEG-2 (ISO/IEC 13818-2, typically Main Profile).
Software takes center stage
Well, all that has changed. We now have MPEG-2, MPEG-4 and MPEG-4/HEVC, cameras come in all shapes and sizes, and imaging devices are being embedded in everything from glasses to articles of clothing, or, in the case of the Volvo Ocean Racing Challenge, even in the construction of the boats.
Volvo racing fleet stopover in Auckland, Photo: David Austerberry
Camera control used to mean a CCU; now it's an app. Even the cables and connectors to the cameras have changed: first there was copper, then fiber and hybrid camera cable. There are many video formats and connectors (both copper and fiber optic), with multiple resolutions covering a broad spectrum from phones to 4K.
One of the larger challenges in acquisition is tagging content with metadata from all these devices. Cameras now include attached storage (e.g. SSD, HDD, SD and optical), and can live stream to a laptop or output directly to an encoder, with the stream then transported over an IP link to a server and storage device in a geographically separate location.
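As a sketch of what camera-side tagging can look like, the snippet below writes a JSON metadata "sidecar" for a clip. The field names and values are illustrative assumptions, not any camera vendor's actual schema:

```python
import json

# Hypothetical sidecar metadata for a captured clip; field names are
# illustrative only, not a real vendor schema.
clip_metadata = {
    "clip_id": "A003_C012",
    "camera": "handheld-1",
    "codec": "H.264",
    "resolution": "1920x1080",
    "frame_rate": 59.94,
    "timecode_start": "14:32:05:12",
    "location": {"lat": 40.7608, "lon": -111.8910},
}

# Writing the sidecar next to the media file lets downstream systems
# (ingest servers, asset management) index the clip without parsing video.
with open("A003_C012.json", "w") as f:
    json.dump(clip_metadata, f, indent=2)
```

In practice the sidecar would be generated by the camera or ingest software, but the principle is the same: metadata travels with the essence from the moment of acquisition.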
The new JVC GY-HM200 camera supports RTMP streaming connectivity.
First, the "conventional" camera has changed. There are large-format cameras for studio production and major remotes like sports and theatrical events (awards, concerts, etc.). Event-style productions are using Point-of-View (POV) and robotic cameras to add new and interesting elements.
Now let's add HD-enabled DSLRs, small POV cameras (e.g. GoPro), phones, tablets and PCs/Macs (FaceTime and Skype) to the mix. Content is captured locally to the camera on removable media, or transported over a copper, fiber or wireless network to an encoder or server and then a storage network.
Today, Skype content is often used in live programs.
VidiGo Toolbox, like similar products shown at NAB 2015, supports Skype, Google-Hangout and Facebook video and social services.
GoPro has changed the landscape of POV capture, and others quickly followed with small-form-factor cameras that have both onboard storage and streaming capabilities. These cameras can be fully controlled wirelessly from a phone, tablet or computer. The NHL recently announced it will put GoPro cameras on some players and in the goal posts to broadcast live POV shots during games.
At the same time, the professional sports leagues and broadcasters are introducing 4K cameras, production and transport. What's interesting is that 4K programming is being delivered as over-the-top (OTT) content until the over-the-air, cable and satellite channels allocate enough bandwidth to carry it.
There is still a gap in IP technology for acquisition and production. Only one manufacturer offers a camera with an IP output, and it is not a mainstream production product. The primary large-format camera products do not output content in an IP format, and many servers are not configured to capture an IP stream and create a file. Importantly, few production switchers can accept source inputs as IP streams or mix video in the IP domain.
Salt Lake City station KTVX is using six JVC GY-HM890 ProHD shoulder-mount camcorders in conjunction with the JVC BR-800 ProHD Broadcaster server (powered by Zixi technology) to produce live ENG reports from the field.
While contribution is a key component of acquisition, the content still needs to get from the camera to a storage device. One of the interesting conversations taking place around IP acquisition is how many uncompressed HD video feeds a single IP transport link can handle. JVC has an MPEG-4/HEVC streaming engine in its cameras. Other technologies (bonded cellular, 802.11n/ac) rely on wireless networks (Wi-Fi and LTE) to send live content directly to a server, or to the cloud where it can be retrieved into a production environment located anywhere. The use of dedicated video circuits is being superseded by WAN topologies such as MPLS, SONET and mesh networks.
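To put rough numbers on that question, the back-of-the-envelope calculation below divides common link speeds by the 1.485 Gb/s rate of an uncompressed HD-SDI feed. It deliberately ignores IP encapsulation overhead, so treat the results as upper bounds rather than engineered capacities:

```python
# Rough capacity estimate: uncompressed HD-SDI runs at 1.485 Gb/s per feed.
# Real IP mappings add encapsulation overhead, so these are upper bounds.
HD_FEED_GBPS = 1.485

def max_feeds(link_gbps):
    """Whole uncompressed HD feeds that fit on a link of the given speed."""
    return int(link_gbps // HD_FEED_GBPS)

for link in (1, 10, 40, 100):
    print(f"{link:>3} Gb/s link: {max_feeds(link)} uncompressed HD feeds")
# A 1 Gb/s link carries zero uncompressed HD feeds; a 10 Gb/s link carries 6.
```

The arithmetic makes the dilemma concrete: a standard 1 Gb/s network connection cannot carry even one uncompressed HD feed, which is why compression or higher-speed links are unavoidable for IP contribution.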
Dejero’s LIVE+ GoBox is a professional-grade mobile transmitter for ENG and video content contributors who need to transmit from mobile locations.
The carriers and circuit providers face an interesting dilemma. A video circuit runs at 270Mb/s for ASI or 1.485Gb/s for HD-SDI, while a typical dedicated network connection is 1Gb/s. Speeds of 10Gb/s are now possible, and we are moving toward 40Gb/s and 100Gb/s. The same vendors make it difficult to get network bandwidth but easy to get a "video" circuit, which is the same type of bandwidth, probably over the same fibers. There are technologies for multiplexing multiple live signals over a single path; at this time they are based on different compression schemes and bitrates (e.g. JPEG 2000, MPEG-TS).
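By contrast, once feeds are compressed, a single circuit can carry many of them. The sketch below assumes illustrative bitrates (roughly 100 Mb/s for a mezzanine JPEG 2000 feed, 20 Mb/s for an MPEG-2 transport stream); actual rates vary widely by encoder and quality target:

```python
# Illustrative multiplexing capacity for compressed contribution feeds.
# The bitrates below are assumptions for the sketch, not vendor specs.
LINK_MBPS = 1000  # a 1 Gb/s circuit

compressed_rates_mbps = {
    "JPEG 2000 mezzanine": 100,
    "MPEG-2 transport stream": 20,
}

for codec, rate in compressed_rates_mbps.items():
    feeds = LINK_MBPS // rate
    print(f"{codec}: ~{feeds} feeds on a {LINK_MBPS} Mb/s circuit")
```

Even at conservative mezzanine rates, compression turns a circuit that cannot carry one uncompressed HD feed into one that multiplexes ten or more, which is the economic case behind these multiplexing technologies.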
Many of the American sports leagues are using these technologies for their new replay systems. Another component of contribution is delivering program files from remote productions to the broadcast center. In sports, once the event is over, files are either posted via FTP or moved with acceleration technologies like Aspera to quickly get content into and out of the cloud.
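As a minimal sketch of the FTP leg of that workflow, Python's standard ftplib can post a finished program file to a broadcast center. The host name, credentials, and file names here are placeholders, not a real facility's configuration:

```python
from ftplib import FTP

# Minimal FTP delivery sketch; host, credentials, and paths are placeholders.
def deliver_file(local_path, remote_name,
                 host="ftp.example-broadcaster.com",
                 user="remote_prod", password="secret"):
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as f:
            # STOR uploads the file in binary mode to the broadcast center.
            ftp.storbinary(f"STOR {remote_name}", f)

# Example call once the event wraps:
# deliver_file("game_highlights.mxf", "game_highlights.mxf")
```

Acceleration tools like Aspera replace this TCP-based transfer with UDP-based protocols to overcome latency on long WAN paths, but the basic push-to-center pattern is the same.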
One thing that has not changed is that "the last mile is the hardest mile." This is true for IP as well. Even so, consumer-focused companies are offering solutions: AT&T, Verizon, Sprint and even Google can provide high-speed links to venues.
For broadcasters and content producers, the key is to select the most appropriate technology for the task. This series will look at all aspects of the media life cycle and offer some suggestions on how to evaluate the options available. No one wants to make the mistake of backing the wrong horse. Remember 3D?
This article is part of Olson’s continuing series “Smoothing the Rocky Road to IP”. Other articles include:
The Anatomy of the IP Network, Part 1
The Anatomy of the IP Network, Part 2
The Anatomy of the IP Network, Part 3