EditShare Flow 2019 Extension 3 Supports The Newsroom

Connect your NRCS.
MOS integration streamlines news production workflows, while new multicam editing features bring advanced editing on the road.
“EditShare news customers can link their NRCS to the production and storage systems via Flow 2019 Extension 3, providing journalists better access to content with tools to quickly package a news story at their fingertips,” comments Matt Sandford, Flow Product Manager, EditShare. “The integration across the workflow allows news teams to stay connected to all aspects of the newsroom via Flow 2019, increasing the overall efficiency of creating and distributing news stories and highlights.”
With support for the MOS protocol in Flow 2019 Extension 3, journalists can create placeholders, craft stories, and send them for approval and distribution without jumping from one application to another. Based on assignments from the NRCS, news clips are automatically created in Flow 2019 as sequence placeholders; the news content is then edited in the Flow Story module and pushed back to the NRCS when final. Flow 2019 automatically alerts the NRCS that the news clip is ready for distribution.
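The MOS protocol that carries these notifications is XML over a socket connection between the media object server and the NRCS. As a rough illustration of the "clip is ready" handshake described above, the sketch below builds a minimal MOS-style object-status message. The element names follow the general shape of MOS messages (mos / mosID / ncsID / mosObj), but the exact fields, values, and IDs here are illustrative assumptions, not a schema-complete MOS implementation or EditShare's actual code.

```python
import xml.etree.ElementTree as ET


def build_mos_status(mos_id: str, ncs_id: str, obj_id: str, status: str) -> str:
    """Build a minimal MOS-style object-status message as an XML string.

    Illustrative sketch only: real MOS messages carry many more
    fields and are exchanged over dedicated TCP ports.
    """
    root = ET.Element("mos")
    ET.SubElement(root, "mosID").text = mos_id   # the media object server (e.g. Flow)
    ET.SubElement(root, "ncsID").text = ncs_id   # the newsroom computer system
    obj = ET.SubElement(root, "mosObj")
    ET.SubElement(obj, "objID").text = obj_id    # placeholder/clip identifier
    ET.SubElement(obj, "status").text = status   # e.g. "READY" once the edit is final
    return ET.tostring(root, encoding="unicode")


# Example: announce that an edited story placeholder is ready for distribution.
# (All IDs are made-up placeholders.)
msg = build_mos_status("flow.example.mos", "ncs.example", "STORY-0042", "READY")
print(msg)
```

In a workflow like the one described, the NRCS would parse this notification and mark the corresponding rundown item as ready for air.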
Sandford adds, “Flow 2019 Extension 3 makes it easy for journalists to see the entire picture. Placeholder sequences can be created in bins named for a particular news bulletin. This means staff can have a view of all the news bulletins and know exactly which ones need to be assembled.”
In addition to the newsroom support, Flow 2019 Extension 3 expands its editing functionality to include the multicam technology developed for Lightworks.
Flow 2019 Extension 3's new multicam editing capabilities allow multiple clips to be synchronized using timecode or audio waveform analysis, then played concurrently for convenient reviewing and logging. Sequences can be built conventionally or by making a ‘live’ edit from the playing sources, and sequence segments can be instantly swapped for alternate angles at any time.
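Waveform-based sync of the kind described above is typically done by cross-correlating the audio tracks of the two cameras and taking the lag at the correlation peak. The sketch below shows that basic idea on toy data; it is an assumption about the general technique, not EditShare's implementation, and the function and variable names are made up for illustration.

```python
import numpy as np


def audio_offset(ref: np.ndarray, other: np.ndarray) -> int:
    """Estimate how many samples later `other` starts than `ref`
    by cross-correlating their audio waveforms.

    A positive return value means `other` is delayed relative to `ref`.
    Illustrative sketch only; real tools work on resampled envelopes
    or spectral fingerprints for speed and robustness.
    """
    corr = np.correlate(ref, other, mode="full")
    # In "full" mode, index len(other)-1 corresponds to zero lag,
    # so subtract the peak index from it to recover the delay.
    return (len(other) - 1) - int(np.argmax(corr))


# Two toy "camera recordings" of the same audio burst,
# with camera B started 100 samples after camera A.
rng = np.random.default_rng(0)
burst = rng.standard_normal(1000)
cam_a = np.concatenate([burst, np.zeros(300)])
cam_b = np.concatenate([np.zeros(100), burst, np.zeros(200)])

print(audio_offset(cam_a, cam_b))  # → 100
```

Once the offset is known, an editor (or the multicam tool) can shift one clip by that many samples so both angles play in sync.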
Sandford says, “Flow Story also has the flexibility to allow sequences to be used in a multicam group, making it simple for editors to deal with footage from cameras that were stopped and started during a recording. Multiple clips from a single camera can be automatically stitched back together into a single sequence ready to use without the need for rendering or any time-consuming manual alignment.”