Red5 Brings Distance Independence To the Revolution in Live Production

The tech-driven revolution unfolding in sports, news, real-time auctions, and other live program productions has set the stage for even greater gains in efficiency and revenue-generating power once distance is no longer an impediment to optimizing results. This blog explores how Red5’s TrueTime Studio and XDN architecture enable distance-independent live production, removing workflow bottlenecks caused by streaming latencies. Learn what the live production revolution means for broadcasters, content creators, and media companies, from REMI enhancements to the role of AI-driven indexing and virtual production tools. Discover how Red5 is shaping the future of live content with real-time multi-view streaming and low-latency collaboration.

What is the Live Production Revolution?

The live production industry is undergoing a tech-driven revolution. Producers have eagerly embraced the cost-saving efficiencies unleashed by the workflow convergence and studio virtualization enabled by applications of AI, video game rendering technology, a new generation of cameras, and much else. But it’s clear the full potential of the live production revolution can’t be reached unless producers free themselves and the creator community from the operational restrictions imposed by workflow streaming latencies.

Goals commonly articulated by producers of everything from sports, esports, news, and weather to live video coverage of concerts, award shows, political gatherings, church services, trade shows, and other events include the ability to:

  • avoid sending big production vans and crews to venues,
  • engage participants in production workflows from dispersed locations,
  • tap commentary from announcers, influencers, and reporters wherever they happen to be,
  • pull video clips and assemble stat-rendering graphics from scattered repositories in sync with live output,
  • embellish core content with feature-rich applications delivered on the fly from third-party sources.

All of these goals can be met, and at an accelerating pace are being met, in real-world operations by producers and creators who put Red5’s suite of TrueTime Studio™ tools to work with the real-time multidirectional connectivity enabled by Experience Delivery Network (XDN) infrastructure. Red5 and partners like AWS, Zixi, Nomad Media, Blackbird Video, Osprey Video, Videon Labs, Magnifi, and others have confirmed this can be done through multiple demonstrations at recent IBC and NAB conventions.

Now several more displays of Red5’s support for what’s coming next in live production are scheduled for the 2025 NAB Show. For example, AWS, with Red5’s help, is staging what promises to be one of the show’s highlights by providing video coverage of NAB attendees’ engagements with a cluster of Formula 1 simulators. AWS will also feature real-time commentary from dispersed observers of NHL game clips, enabled by Red5’s recently acquired Dale platform.

The impact that the fusion of advanced live production technologies in TrueTime Studio-enabled workflows will have on live content production, and on the roles played by creators, is mind-boggling. Beyond trade show demos, Red5 is working with these and many other partners to help producers of commercially available services put the capabilities described here to work delivering an unprecedented range of next-generation user experiences. For a comprehensive overview of Red5’s partnership ecosystem, see this whitepaper.

Part 1: REMI on Steroids – The Live Production Revolution in a Nutshell

In what amounts to REMI (remote integration model) on steroids, producers can augment the action and elements comprising the on-premises studio scenario with any mix of inputs from remote camera feeds, dispersed commentators, and archived files of ancillary content, unimpeded by latencies incurred over long distances. To comprehend what’s about to unfold with the introduction of real-time streaming across networks linking all points in live production, a good place to start is a quick review of the advances in production technology that have brought us to this inflection point.

Triggering the Revolution with the Transition to IP and the Cloud

While AI has generated much of the buzz at industry trade shows over the past two years, the revolution really began with the television industry’s transition to IP technology in production workflows. This was enabled by the ST-2110 suite of standards developed by the Society of Motion Picture and Television Engineers (SMPTE) to serve as the bridge to IP from the legacy Serial Digital Interface (SDI) mode of operations.

Emerging in parallel with ST-2110, the proprietary IP production connectivity protocol stack known as Network Device Interface (NDI), offered free through the NDI subsidiary owned by broadcast production vendor Vizrt, gained traction among smaller providers of streaming services. It is now widely supported by many suppliers of production software and video cameras that are also integrated with ST-2110.

Both protocols have facilitated use of cloud-based processing in production workflows. But ST-2110 has been the primary factor behind the shift to cloud production among studios, broadcasters and streaming services, including the slower-paced cloud transition pursued by major producers of sports and other live programming.

As Steve Reynolds, CEO of TV playout technology supplier Imagine Communications, put it in an interview with TV Technology Magazine last year, “The whole explosion around cloud production wouldn’t have happened without 2110.” With pervasive use of ST-2110 driving costs down to where “2110 has become cheaper than SDI on a per port basis,” the cloud-based operations convergence has begun to blur the boundaries between TV and streaming and across all OTT service categories as well.

Cloud production-related solutions are enabling a streamlined, cost-saving approach to launching linear TV channels that combine any mix of live and file-based content for any distribution model, whether it be free ad-supported streaming TV (FAST), pay TV, SVOD, or AVOD. Now, as Reynolds says, “FAST is just TV.”

A New Generation of Content Management Systems

The market is awash in new software platforms designed to converge production processing of programming for TV and streaming distribution, touching on everything from encoding and packaging to feature enhancement, captioning, formatting for ad support, and use of stored assets.

New content management systems (CMSs) have made it possible to orchestrate the preparation of content for delivery into contribution and distribution channels. Workflow orchestration tools support all major media asset management (MAM) and production asset management (PAM) systems, along with the automation systems used with these and more workflow-specific systems.

This ensures that storage-related management tasks can be brought into play with any workflows interacting with production-, post-production-, and contribution-related processes executed on in-house or third-party platforms. The new CMSs facilitate automatic transfer and indexing of files for insertion into dispersed storage grids while making it possible to pull whatever assets a live production needs from wherever they reside.

Hot Storage and AI-Assisted Indexing and File Searches

As a result, all users and business systems in any given operational environment can work with a single portal providing access to all repositories comprising that operation’s storage ecosystem. Rapid search and discovery across all tape, disk, optical, and cloud storage systems, no matter what type of asset is needed at any moment, is facilitated by SMPTE’s AXF open-standard object-container format, which supports the bundling of files and any type of related metadata into content-aware packages regardless of the underlying operating system.

Of course, to be useful in live productions, stored content must be accessible from digital storage systems. That access is facilitated by wide adoption of “hot” object-based storage technology, which treats all data elements related to a specific content file, including audio, video, metadata, graphics, captioning, and other components, as a single object with a unique identifier. Massive digital conversion initiatives undertaken by producers with archives of content predating the digital era are making sports and news from the distant past readily available for use in live programming.
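
To make the object model concrete, here is a minimal TypeScript sketch of how such a content-aware bundle might be represented, with one identifier keying every component of an asset. The field names are illustrative, not any vendor’s actual schema.

```typescript
// Hypothetical model of a content-aware storage object: every component of a
// media asset shares one identifier, so a single lookup retrieves the bundle.
interface MediaObject {
  id: string;                        // unique identifier for the whole bundle
  video: string;                     // storage URI of the video essence
  audio: string[];                   // one URI per audio track
  captions?: string;                 // sidecar captions, if present
  graphics: string[];                // associated graphics and overlays
  metadata: Record<string, string>;  // searchable descriptive metadata
}

// One lookup by ID replaces per-component searches across tape, disk, and cloud.
function fetchBundle(store: Map<string, MediaObject>, id: string): MediaObject {
  const obj = store.get(id);
  if (!obj) throw new Error(`no media object with id ${id}`);
  return obj;
}
```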

AI has been a big contributor to automating both the indexing process, often expanding the metadata that can be used to identify content, and the search for assets relevant to any given live content flow. Employing image, audio, and text recognition, AI systems used in live productions facilitate real-time aggregation of clips, graphics, and other data from any and all digital repositories that producers have access to.
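
One way to picture the search side of that pipeline: tags emitted by recognition models for the live feed can be scored against tags stored at index time. The sketch below is an illustrative ranking step under that assumption, not a description of any specific product’s matching logic.

```typescript
// Illustrative ranking step: score archived assets by how many of the live
// feed's recognition tags they share, and surface the best matches.
interface IndexedAsset {
  id: string;
  tags: Set<string>; // tags produced at index time via image/audio/text recognition
}

function findRelevantClips(
  liveTags: Set<string>,
  archive: IndexedAsset[],
  limit = 5,
): IndexedAsset[] {
  return archive
    .map(asset => ({
      asset,
      score: [...liveTags].filter(tag => asset.tags.has(tag)).length,
    }))
    .filter(entry => entry.score > 0)   // drop assets with no overlap
    .sort((a, b) => b.score - a.score)  // highest overlap first
    .slice(0, limit)
    .map(entry => entry.asset);
}
```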

Hybrid Approaches to Virtual Production

Meanwhile, on another front, virtualization of on-set elements and backgrounds has taken off across the live production sector. This is a hybrid form of virtual production (VP) that merges what were once separate approaches to virtualized production: 3D REMI contributions to live broadcast settings, and the in-camera visual effects (ICVFX) displayed on LED walls in movie, episodic TV, and ad production.

Hybrid VP has materialized too fast to have been factored into researchers’ projections. But even before the new paradigm took hold, researchers were predicting big things for VP. For example, Research and Markets projects the global VP market will grow at a 19.98% compound annual growth rate (CAGR) from $1.99 billion in 2022 to $7.13 billion by the end of 2029. According to Grand View Research, the VP spend in the U.S. over that timeframe will increase at a 15.9% CAGR, reaching $1.09 billion in 2029.

VP in all its permutations is now seen as an essential tool that can contribute time- and cost-saving efficiency to just about any project. Distributed workflow systems allowing live content producers to configure graphic elements for pixel-accurate renderings across multiple display environments in tandem with events have created a multi-use studio environment that’s saving broadcasters a lot of money.

Instead of relying on separate studios for news, sports, and weather, they’re creating flexible multi-use workspaces. At any point in the broadcast schedule, a barebones set and broadcast desk with a display wall can become a fully featured weather reporting studio with the kind of green-screen functionality weathercasters are accustomed to. Then, during a station break, it can be turned into a news or sports reporting venue, with LED wall support for rendering the background and 3D REMI support for virtual placement of graphics and their display components.

The Use of Multiplayer Video Game Engines in Live Production

Multiplayer high-action video game production software is playing a major role in managing the mix of data feeds delivering A/V content to all virtual display elements. Leveraging high-end hardware acceleration, such solutions, most often Epic Games’ Unreal Engine but on occasion Unity, are widely used in all types of M&E productions to create hybrid blends of virtual and real elements that can be made to look entirely real to viewers, or, in the case of ads and movies, as phantasmagoric as creators desire.

These game engines are applying groundbreaking innovations like real-time ray tracing used to deliver realistic variations in shadows and light, densification of micro-polygon geometries, streamlined aggregation of visual effects, batch rendering of multiple camera inputs, and accelerated encoding. They combine, compress, and parse out the moment-to-moment changes in virtualized elements from all sources in frame-accurate synchronicity to provide audiences a convincing rendering of the hybrid scene.

The Remote Video Versatility Enabled by PTZ Cameras

Advances in cameras, too, are playing a big role in the production revolution, especially when it comes to the versatility and remote-control capabilities enabled by professional-grade PTZ (pan, tilt, zoom) cameras. They can be used for just about any purpose anywhere, from playing a complementary role with traditional cameras in live sports and other big event coverage to serving as the primary video sources in news reporting and in-studio camera work.

High-quality PTZ cameras can be controlled from a central location, eliminating the need for on-site camera operators and reducing travel and staffing costs. This allows production teams to cover multiple locations or events simultaneously, increasing efficiency and productivity.
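
As a flavor of what that remote control looks like at the protocol level, the sketch below sends a single pan command to a camera that accepts raw VISCA commands over UDP. The address and port are placeholders, and vendors differ on transport framing, so treat this as an illustration rather than a universal recipe.

```typescript
// Minimal sketch of remote PTZ control, assuming a camera that accepts raw
// VISCA commands over UDP (ports and framing vary by vendor and model).
import { createSocket } from "node:dgram";

const CAMERA_HOST = "192.0.2.10"; // placeholder camera address
const CAMERA_PORT = 1259;         // placeholder; check the camera's manual

// VISCA Pan-tiltDrive: 8x 01 06 01 <panSpeed> <tiltSpeed> <pan> <tilt> FF
// pan byte: 0x01 left, 0x02 right, 0x03 stop; tilt byte: 0x01 up, 0x02 down, 0x03 stop
function panRight(speed: number): Buffer {
  return Buffer.from([0x81, 0x01, 0x06, 0x01, speed, 0x01, 0x02, 0x03, 0xff]);
}

const socket = createSocket("udp4");
socket.send(panRight(0x08), CAMERA_PORT, CAMERA_HOST, (err) => {
  if (err) console.error("VISCA send failed:", err);
  socket.close();
});
```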

Moreover, the integration of PTZ cameras into virtual set environments enables the creation of immersive and interactive experiences for viewers. With precise camera movements and seamless integration with virtual set technology, these cameras can be used to create dynamic and engaging content that would be difficult or impossible to achieve with traditional camera setups.

A spate of recent integrations of NDI as a conduit for delivering PTZ camera feeds from remote locations to production centers has pushed NDI into greater prominence among mainstream broadcasters. It’s another sign that the live production market wants to take the revolution to the next level.

Part 2: Consummating the Revolution with Red5’s TrueTime Studio

To achieve the full potential of live production free of distance restrictions, producers and creators must be able to rely on an all-encompassing multidirectional real-time networking capability that can connect all points without proprietary encumbrances requiring plug-ins or specialized appliances. As noted earlier, Red5 and its partners are letting everyone with a stake in live production know that this vital next step in the revolution is now at hand.

The possibilities are endless when remote collaboration on video production is supported by virtually latency-free transfers of video, graphics, and other assets. No matter how far apart participants in the production and post-production workflows might be, or how many are involved, they all work in the same real-time temporal space, where shared experiences can be synchronized with access to transferred assets from any source at end-to-end latencies no greater than 250 ms, and often lower.

Best-of-Breed Solution Versatility

All aspects of what’s needed to activate the many production advances described above, in real time at any distance, are readily available for use in distributed live production operations through providers of holistic CMSs, gaming engines, AI-assisted solutions, encoding systems, video cameras, and other components that have been fully integrated with Red5’s XDN architecture and the TrueTime Studio toolset. And producers can onboard any other suppliers they choose to work with through plug-and-play integrations enabled by industry APIs embodied in the SDKs comprising the Red5 Pro development toolset.
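
For a sense of how lightweight such an integration can be on the contribution side, here is a minimal browser-side publishing sketch written against the Red5 Pro WebRTC SDK’s RTCPublisher API as we understand it; the host, app, and stream name are placeholders, so confirm the configuration fields against the current SDK documentation.

```typescript
// Minimal browser-side publish sketch using the Red5 Pro WebRTC SDK
// (connection values below are placeholders, not real endpoints).
import { RTCPublisher } from "red5pro-webrtc-sdk";

async function publishCameraFeed(): Promise<void> {
  const publisher = new RTCPublisher();
  await publisher.init({
    protocol: "wss",
    host: "xdn.example.com",   // hypothetical XDN ingest endpoint
    port: 443,
    app: "live",
    streamName: "remote-camera-1",
    mediaConstraints: { audio: true, video: true },
  });
  await publisher.publish(); // the camera feed now enters the XDN in real time
}

publishCameraFeed().catch(console.error);
```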

Moreover, VP and any other application Red5 customers want to support that involves use of the leading game engines can be created through Red5 SDKs specifically dedicated to Unreal Engine and Unity. In the VP environment, these plugins make it easy to incorporate the types of augmented-reality and even holographic elements that are increasingly employed in live broadcast studios.

Through these and other supplier integrations, real-time collaboration workflows supported by Red5 TrueTime Studio make all the essential production tools available for simultaneous usage in remotely staffed locations.

Instant Switching Across Multiple Stream Views

For example, our live production demo hosted by Zixi at last year’s IBC show featured a collaboration with encoding technology supplier Osprey Video and editing tool supplier Blackbird. It showed how producers can leverage TrueTime Studio tie-ins with Red5’s TrueTime Multiview for Production™ tools to give everyone on the shared workflow simultaneous multi-view access to an unlimited volume of externally and internally generated content feeds.

Anyone connected to the distributed production workflows can instantly switch the full-screen rendering from one feed to another across any array of thumbnail displays. As a result, producers can collaborate over any distance in real time, with great latitude in determining what end users see moment to moment, ranging from the content of a single A/V feed to split-screen displays to composites of multiple video streams or single image captures.
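
The client-side mechanics of that instant switch can be as simple as the following browser sketch: each thumbnail video element already holds a live MediaStream, so promoting one to the main view just reassigns the stream, with no renegotiation or restart. The element selectors here are hypothetical.

```typescript
// Illustrative multi-view switcher: clicking a thumbnail promotes its live
// MediaStream to the full-screen element without interrupting playback.
function wireMultiview(thumbnails: HTMLVideoElement[], mainView: HTMLVideoElement): void {
  for (const thumb of thumbnails) {
    thumb.addEventListener("click", () => {
      // Reusing the already-flowing MediaStream makes the switch instant.
      mainView.srcObject = thumb.srcObject;
      void mainView.play();
    });
  }
}

// Hypothetical page layout: thumbnails carry the class "thumb".
wireMultiview(
  [...document.querySelectorAll<HTMLVideoElement>(".thumb")],
  document.querySelector<HTMLVideoElement>("#main-view")!,
);
```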

Moreover, as was demonstrated at the 2024 NAB Show, when XDN infrastructure is used to stream output to audiences in real time, producers can put Red5 Multiview technology to use to deliver long-sought multiscreen viewing experiences that are not attainable with conventional HTTP-based streaming. At NAB, Red5 and Zixi, with Red5 partners Nomad Media and Videon Labs, highlighted the unparalleled multiscreen viewing scalability enabled by Red5’s TrueTime Multiview for Fans™. The demo featured real-time delivery of a 16-camera thumbnail viewing matrix that allowed viewers to jump from one feed to another, activating full-screen displays without any interruption in the flow. The viewing options consisted of channels aggregated in a Nomad Media asset management workflow and encoded by Videon for ingress onto the XDN platform in the cloud, from which they were delivered through the show Wi-Fi system to TV screens at the Zixi and Nomad booths.

Bringing Remote Commentators & Influencers into the Live Production Mix

More recently, as mentioned earlier, Red5 has been preparing to demonstrate another major step in real-time collaboration: extending XDN connectivity to remotely stationed commentators and influencers. Red5 has made such connections easy to activate and manage in TrueTime Studio through its acquisition of Dale, a Brazil-based standalone platform that had already been integrated with XDN to give dispersed creators a visual presence and real-time views of events on the field in conjunction with their coverage of soccer games and other sports events.

The integration of Dale with TrueTime Studio brings multiviewing and streamlined production controls into play with remote creators. And it provides support for real-time visual interactions between influencers and audiences when the multidirectional real-time streaming infrastructure is used in distribution.

The New Realm of Possibilities for Live Content Producers

The capabilities we’ve been discussing here are just the beginning of what can be done to move the needle in live production. The combination of XDN-enabled backend live production and frontend distribution applications offers producers and creators an unprecedented opportunity to take their services to new heights with a multitude of user experiences.

They can support applications like:

  • Watch parties and other forms of video-rich social engagement in 2D and 3D immersive extended reality environments with no scaling limitations,
  • Sports and esports micro-betting activated in-screen with video rather than through data feeds on second screens,
  • Ads and online purchasing venues featuring interactive video engagement between marketing personalities and end users,
  • and a virtually unlimited array of feature enhancements that can be delivered on a market-wide or personalized basis as overlays precisely paired frame by frame with the primary content.

XDN Architecture

The sophisticated multi-purpose XDN architecture makes all this possible in live productions that employ TrueTime Studio on the backend while using XDN infrastructure to connect end users numbering into the millions in real time. In either domain, the platform can be implemented with point-and-click configurability through the Red5 Cloud service or in customer-tailored configurations with the aid of Red5 Pro SDKs and toolsets.

XDN architecture leverages automatically orchestrated hierarchies of Origin, Relay and Edge Nodes operating in one or more private or public cloud clusters. One or more Origin Nodes in a cluster serve to ingest and stream encoded content out to Relay Nodes, each of which serves an array of Edge Nodes that deliver live unicast streams to end points in their assigned service areas. In cases like live production that may encompass a small geographic area or very few end points, content is streamed directly to Edge Nodes without the use of Relay Nodes.
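
As a mental model of that hierarchy, the sketch below encodes the Origin/Relay/Edge relationships and the relay-skipping case in a few TypeScript types; the names and fields are illustrative, not Red5’s actual configuration schema.

```typescript
// Illustrative model of the XDN node hierarchy (not Red5's actual schema).
interface EdgeNode { id: string; region: string; }
interface RelayNode { id: string; edges: EdgeNode[]; }
interface OriginNode { id: string; relays: RelayNode[]; directEdges: EdgeNode[]; }

// Small clusters can skip the relay tier and serve edges straight from origin.
function edgeFor(origin: OriginNode, region: string): EdgeNode | undefined {
  const direct = origin.directEdges.find(edge => edge.region === region);
  if (direct) return direct;
  for (const relay of origin.relays) {
    const edge = relay.edges.find(e => e.region === region);
    if (edge) return edge;
  }
  return undefined; // no edge assigned to this service area
}
```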

The platform relies on the real-time communications capabilities of the Real-time Transport Protocol (RTP), which underlies IP-based voice communications and is the foundation for both WebRTC (Web Real-Time Communication), originally developed for peer-to-peer video communications, and RTSP (Real-Time Streaming Protocol), a one-to-many video streaming alternative to HTTP widely used in IP camera outputs and for mobile video transmissions. WebRTC is the primary transport option; by virtue of its support in all the major browsers, it eliminates the need for device plug-ins.
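
To illustrate the plug-in-free playback path, the following sketch subscribes to a stream using nothing but the browser’s built-in WebRTC APIs plus an HTTP-based WHEP-style handshake (POST an SDP offer, receive an SDP answer). The endpoint URL is a placeholder, and whether a given deployment exposes WHEP is an assumption here.

```typescript
// Browser playback over plain WebRTC with a WHEP-style handshake: no plug-in,
// just built-in APIs. The endpoint URL below is a placeholder.
async function playStream(video: HTMLVideoElement): Promise<void> {
  const pc = new RTCPeerConnection();
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });
  pc.ontrack = (event) => { video.srcObject = event.streams[0]; };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // WHEP handshake: POST the SDP offer, receive the SDP answer.
  const response = await fetch("https://xdn.example.com/whep/live/stream1", {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp!,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
}
```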

Along with ingesting any content delivered via WebRTC or RTSP, XDNs can ingest video formatted to all the other leading protocols used in video production, including Real-Time Messaging Protocol (RTMP), Secure Reliable Transport (SRT), Zixi’s Software-Defined Video Platform (SDVP), and MPEG Transport Stream (MPEG-TS).

It’s also important to note that Red5’s integrations with Zixi and SRT facilitate live production output to distribution affiliates over those commonly used contribution protocols when XDN architecture isn’t in play on the distribution side. Zixi’s hosting of Red5 trade show demos highlighting XDN usage in live productions is a testament to how important real-time connectivity is to freeing producers from reliance on costly venue-based production vans.

Conclusion

These are just some of the high points contributing to the multi-cloud capabilities of XDN architecture. From the production perspective, the automated approach to capitalizing on XDN architecture is a much-needed breakthrough for sports and other live production scenarios.

Without a cost-effective way to utilize real-time connectivity between cameras capturing on-field action and studios or other remote production locations, it’s been impossible for event producers to eliminate the need for big on-site production set-ups. Now, with Red5 Cloud in play, that’s no longer the case. The migration to cloud production in sports is getting underway with a top-tier professional league we can’t name, starting with activation of XDN architecture to support real-time sharing of archived video in multiple use cases tied to internal operations.

There’s much more to come. Meanwhile, to learn more about what TrueTime Studio means to the live production revolution, contact info@red5.net or schedule a call.