Client News


Greenland’s KNR chooses nxtedition as its primary production system for another five years

Broadcast microservices specialist nxtedition has inked a five-year deal with KNR (Kalaallit Nunaata Radioa), the Greenlandic Broadcasting Corporation, which has chosen to retain nxtedition’s end-to-end system as its primary broadcast solution. KNR first installed nxtedition as part of its new production infrastructure in 2017 and relies on it to deliver a futureproof workflow that streamlines all of its broadcast, web, and social media production.

Using nxtedition’s story-centric workflow, KNR journalists can run their programs themselves, without having to rely on a lot of technical support. The system encompasses pre-production, ingest, media management, script writing, graphics, prompting, live studio automation and social media management. Everything is elegantly controlled from within a single user interface.

Karl-Henrik Simonsen, CEO of KNR, said: “We have seen just how reliable the nxtedition technology is, and the way it streamlines the process of creating and delivering content to digital and terrestrial platforms makes it the best choice for us. We have no hesitation in extending our collaboration; nxtedition has great support and delivers what it promises. We look forward to our continued cooperation for many years to come.”

Since the initial installation, KNR has also placed nxtedition technology at the heart of its new radio production system.

Ola Malmgren, CEO, nxtedition explained: “Great storytelling is what audiences come to television for and too often production workflows put the story last. At nxtedition we have changed the game; by allowing journalists to take control of their stories, we are empowering broadcasters to give audiences what they really want – a fantastic viewing experience with an engaging story.”

Sky Deutschland Upgrades to Next-Generation UHD Rohde & Schwarz VENICE and SpycerNode for Sport Production

Broadcaster gains flexibility, scale to support ingest, transcoding and playout for premium football coverage

Rohde & Schwarz, a global leader in media technologies, has supplied the latest version of its R&S®VENICE media server and R&S®SpycerNode intelligent storage solution to Sky Deutschland. The German broadcaster will now benefit from a highly robust workflow, with full built-in redundancy and no single point of failure, for its coverage of premium sports content, such as the Bundesliga and Champions League.

The broadcaster has invested in six new VENICE units, along with a new SpycerNode, delivering a total of 24 channels of scalable centralised and external storage capability as well as support for UHDp50. The upgraded workflow handles ingest to VENICE and transfers files directly into SpycerNode, allowing the Sky Deutschland team to access and edit growing files, pull out clips for highlights packages and playout directly. VENICE also ensures correct flagging and handling of UHD and HDR material, eliminating errors in the delivery chain without the need for external third-party systems.

“Consistency and quality are the bedrock of our brand and two things we pride ourselves on. As we meet our viewers’ demands for higher volumes of richer content, having a rock-solid 24/7 channel playout workflow that can meet our evolving needs is a must have,” said Christian Barth, Director of Production Platforms & Playout at Sky Deutschland. “Rohde & Schwarz has a clear understanding of our requirements and with this new generation of VENICE and SpycerNode, we have all the scalability we need now, and in the future. This upgrade also gives us the redundancy necessary in our playout environment.”

VENICE is specifically designed to support mission-critical broadcast applications, such as group ingest and playout to studio and master control, managing complex signal processing and storage requirements. The system provides scheduled recording, clip transforms and playout, integrating with Rohde & Schwarz shared storage, SpycerNode as well as certified third-party systems. This solution offers support for 4K Ultra HD signals, using either single link 12G SDI or quad link 3G SDI (incl. 2SI option), and is also capable of SMPTE ST-2110.

SpycerNode leverages High Performance Computing (HPC) file systems for high scalability and full redundancy with even the smallest unit. Easily configurable to meet broadcasters’ capacity and bandwidth requirements – including during operation – this solution offers shared storage to support fast turnaround workflows.

In addition, Rohde & Schwarz VSA (Virtual Storage Access) technology is being deployed at Sky Deutschland and provides maximum fail safety and seamless redundancy for all R&S applications for ingest and playout. This software-based solution ensures maximum flexibility for further deployments and system expansion.

“Today broadcasters need efficient ways of working in high-pressure live environments. They also need solutions that deliver an unmatched level of reliability and the scalability to ensure their operations can keep pace with the growing demand for high-quality content,” said Andreas Loges, Vice President of Media Solutions at Rohde & Schwarz. “At Rohde & Schwarz, we are focused on developing solutions that meet our customers’ real-world challenges today but will also grow with them in future. We are extremely proud that Sky Deutschland has chosen to implement the latest iterations of our VENICE and SpycerNode solutions and extend our long-standing partnership.”

iSIZE BitSave Wins 2021 VideoTech Innovation Award for Sustainability

iSIZE today announces that it has won the 2021 Digital TV Europe (DTVE) VideoTech Innovation Award for Sustainability with its BitSave pre-processor for video.

With global internet traffic set to reach 4.8 zettabytes a year in 2022, and 80% of this resulting from video, there is a significant environmental cost. The internet today is estimated to use 10% of the world’s total energy consumption, putting its carbon footprint on a par with air travel. Anything that reduces internet traffic therefore reduces that environmental impact. The challenge is working with an infrastructure for distributing online and mobile video that is already well established and largely unchangeable. Upgrading codecs takes time and deployment risks are high. Enter BitSave.

iSIZE’s BitSave pre-processes the input video prior to encoding, removing imperceptible detail that is costly for any existing video encoder to compress. Because it is encoder agnostic, BitSave benefits all video coding standards, such as AVC/H.264, HEVC/H.265, VVC/H.266, VP9 and AV1. No changes are required to encoding, stream packaging, streaming or decoding.
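
To make that encoder-agnostic placement concrete, here is a minimal sketch of where such a pre-processor sits in a pipeline. The preprocess function below is a hypothetical stand-in (BitSave’s own interface is not described here), and the example assumes a Unix-like system with ffmpeg and x264 installed; everything downstream of the pre-processing step is a completely standard encode.

import subprocess

def preprocess(src: str, dst: str) -> None:
    """Hypothetical stand-in for a perceptual pre-processor such as BitSave.

    Here it simply copies the file; a real pre-processor would rewrite the
    frames, attenuating detail viewers cannot perceive, before encoding.
    """
    subprocess.run(["cp", src, dst], check=True)

def encode(src: str, dst: str, crf: int = 23) -> None:
    """Encode with a standard, unmodified encoder (x264 via ffmpeg)."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-crf", str(crf), dst],
        check=True,
    )

# The pre-processing step sits entirely in front of the encoder, so packaging,
# streaming and decoding further down the chain are untouched.
preprocess("input.mp4", "preprocessed.mp4")
encode("preprocessed.mp4", "output.mp4")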

Using iSIZE’s deep neural network solution, developed specifically for media content, delivers valuable bandwidth reduction without compromising quality. To create BitSave, the iSIZE team undertook extensive scientific research on visual perception and visual quality scoring, producing a video pre-processing solution that preserves all visually salient characteristics of each input frame while attenuating details that are imperceptible to viewers yet incur significant cost when encoding with standard encoders.

Sergio Grce, CEO at iSIZE commented, “We are delighted to have been recognised for our contribution to making video streaming more sustainable. Instead of the traditional approach of using simple signal-to-noise analysis or broad-brush bitrate reduction, we have deployed our rich artificial intelligence and machine learning to process individual video streams dynamically. While it is a highly sophisticated solution, BitSave runs on standard workstation hardware, making it simple and easy to implement. The consumer benefits from a more reliable service even in marginal conditions, because of the reduced bandwidth.”

TV2 Nord delivers dynamic election day coverage with nxtedition

Broadcast microservices specialist nxtedition played a leading role in customer TV2 Nord’s coverage of the Danish local elections on November 16. The election covered the country’s 98 municipal councils and five regional councils, and the broadcaster relied on its nxtedition system to handle video wall and graphics automation throughout.

nxtedition sent data to CasparCG’s dynamic HTML5 graphics templates and delivered live links and data for transmission and the studio’s video walls. The system was also used to pull in social media interactions, harvesting comments from the station’s Facebook page and delivering them into graphics templates. The studio presenters controlled data and graphics delivery to the video wall on the fly from their tablet shotboxes.
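
For readers unfamiliar with how external data reaches CasparCG templates, the sketch below shows the general shape of the exchange: a client pushes template data with an AMCP CG ADD command over TCP (default port 5250). The host, channel, layer, template name and data field are illustrative assumptions, not TV2 Nord’s actual configuration; nxtedition’s own integration is internal to its platform.

import socket

# Illustrative values only: channel 1, layer 20, a template called
# "lowerthird" and a single "name" field; adjust to the server's setup.
HOST, PORT = "127.0.0.1", 5250                     # default AMCP port
data = '{\\"name\\": \\"Election night 2021\\"}'   # inner quotes escaped per AMCP
command = f'CG 1-20 ADD 1 "lowerthird" 1 "{data}"\r\n'

with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(command.encode("utf-8"))
    print(sock.recv(4096).decode("utf-8"))         # expect a 2xx acknowledgement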

TV2 Nord’s nxtedition system is its platform of choice for news production and was already integrated into its production workflow, meaning the studio presenters and staff were very familiar with it.

TV2 Nord CTO, Peter Zanchetta said: “Working with nxtedition on the election project made the operation so much smoother. It’s easy to customise the system and the content, which is a huge advantage. And we didn’t need to integrate another technology platform to handle social media like we used to in the past, which again simplifies the whole process. nxtedition has a track record for reliability, which was vital to this project, and its team is always on hand 24/7 whenever we ask for support.”

Live debates in each municipality or region being contested are always a main part of TV2 Nord’s election coverage. Normally, this means having to set up an OB truck and send journalists and technical crews out on location. This year, they took a different approach, with live debates hosted in their studio. Voters were able to engage and submit questions, in realtime, via the broadcaster’s Facebook channel.

Ola Malmgren, CEO, nxtedition added: “Election coverage means fast response times and the ability to pull up graphics with updating data in an instant. This is where nxtedition shows its power – one platform can handle everything seamlessly, including social media integration.”

nxtedition introduces nxt|cloud

Migrating nxtedition’s ‘best of breed’ microservices to the public cloud

Broadcast microservices specialist nxtedition has further enhanced its consolidated production and playout platform with nxt|cloud, a complete deployment of nxtedition that runs in the public cloud.

Previously, playout in nxtedition utilised the switching, layering and real-time rendering power of CasparCG to achieve high production values using COTS hardware. This latest development has seen nxtedition develop a fully containerised, Linux version of CasparCG, providing the same playout functionality, flexibility and quality as a scalable, elastic and secure microservice in the cloud.

The architecture of nxtedition is unique in being entirely built in JavaScript, ensuring the platform is positioned to take advantage of all developments in web technology. The developers of nxtedition had previously been deeply involved with CasparCG, the open-source broadcast graphics platform now widely used around the world.

The nxtedition solution contains all the elements required for broadcast, from ingest and transcode through asset management and archiving to delivery to multiple platforms, automatically repackaging news stories for social media. The fully virtualised architecture means systems can be built to precisely match individual workflow requirements, with the appropriate level of resilience and the large reduction in complexity that microservices bring. Centralising content in this way lets users work faster and more productively, repurposing material for broadcast, OTT, digital, social, podcasts and radio.

“With nxt|cloud we can offer an identical experience in the cloud: the same quality, the same functionality, the same user experience, the same responsiveness,” said Adam Leah, creative director at nxtedition. “That includes sophisticated added-value features like localisation: we can, for example, take in a single live sports feed over SRT and split it into, say, eight identical CasparCG channels, sending each channel a separate commentary audio and graphics feed in a different language – all driven by the timestamped metadata authoring and the layering in nxtedition.

“Private cloud on premise is the pragmatic choice for most broadcasters when it comes to production. But while that gives them control over their content, it also gives them a concern over disaster recovery. A widespread power failure, for example, could take them off air.”

We designed nxt|cloud to also provide new and existing on-prem clients with a hybrid cloud solution for disaster recovery. By using seamless replication, not only are the playout channels mirrored from the ground, but the scripts and media are mirrored too. If the client has an emergency, then the entire team switches to the cloud and carries on where they left off. The UI is the same, everything is the same – from ground to sky.

Mo-Sys and APG Media Join Forces to Provide End-to-End Virtual Production Solutions 

Mo-Sys™ Engineering (www.mo-sys.com), a leader in virtual production and image robotics, and APG Media (www.apgmedia.com), a leading LED volume rental provider and distributor, have partnered to give customers access to complete end-to-end LED virtual production set-ups, combining Mo-Sys’ StarTracker and VP Pro XR with APG Media’s HyperPixel customised LED wall solutions.

The Mo-Sys StarTracker camera/lens tracking system has become the technology of choice for leading-edge virtual productions. The advanced tools in VP Pro XR content server include the unique ability to pull focus seamlessly between real and virtual objects. By partnering with APG Media, Mo-Sys can now offer custom engineered LED tiles and a comprehensive package for tailored LED volume, multi-camera production systems.

“The appetite for high production values means virtual production is no longer exclusive to big budget movies,” said Michael Geissler, CEO of Mo-Sys. “In cinematography, the quality of LED walls has to be as high as possible to deliver the best results and HyperPixel meets this requirement. Through this collaboration with APG Media we are removing the technical complexity from the equation and freeing up production teams to express their full creativity with an immaculate end result.”

David Weatherhead, CEO at APG Media added “Virtual production is taking off as cinematographers and content producers recognise the impact this model can have on their final output - if they use the right technology. In a segment that is already at the cutting edge, Mo-Sys is a pioneer and a market leader, and this partnership brings huge value to both our customers. By combining our offerings, we can now open the doors to the best technology and help cinematographers to create the stunning content that audiences demand.”

The new Mo-Sys Refinery in Los Angeles will feature a HyperPixel high resolution LED wall, featuring ultra-tight seams for the ultimate in immersive visual experiences. APG Media will add a Mo-Sys VP Pro XR and StarTracker to the offering within its rapidly growing specialist virtual production division. The two companies will cooperate on marketing and exhibition, as well as providing support for sales channels.

High Frame Rate Mo-Sys StarTracker and NearTime for LED boost ICVFX workflows

Mo-Sys Engineering (www.mo-sys.com), world leader in image robotics and virtual production solutions, today reveals new solutions for LED virtual production with a high frame rate (HFR) StarTracker camera tracking system and the award-winning NearTime® workflow extended to LED volumes.

NearTime for LED is a smart solution to multiple challenges of shooting in-camera visual effects (ICVFX) in an LED volume. At its most basic level it is an automated background re-rendering service for improving the quality and/or resolution of Unreal-based virtual production scenes, and it runs alongside a real-time ICVFX shoot. NearTime solves one of the key challenges of LED ICVFX shoots: balancing Unreal image quality against maintaining real-time frame rates. Currently, every Unreal scene created for ICVFX has to be reduced in quality to guarantee real-time playback.

Using NearTime with an LED ‘green frustum’, the same Unreal scene can be automatically re-rendered at higher quality or resolution, and this version can replace the original background Unreal scene. Although the re-render takes longer, it incurs none of the cost or time of traditional post-production, and moiré issues can be avoided completely.

Mo-Sys CEO Michael Geissler said, “We patented NearTime almost 8 years ago knowing that real-time graphics quality versus playback frame rate was always going to be an issue with ICVFX shoots. At that time we were focussing on green/blue screen use, but today LED volumes have exactly the same challenge.”

Mo-Sys is also announcing a special StarTracker for LED that provides camera and lens tracking up to 240fps for slo-mo shots, as would be typically used for fight scenes in high impact action films. In this scenario, rather than drive the LED wall at high frame rates to match the camera, which requires significant hardware processing and comes with an Unreal image quality impact, NearTime for LED is used to post-render the Unreal scene at HFR following the HFR ‘green frustum’ shoot of the talent performing. During the slo-mo shot review, the re-rendered Unreal background and the talent foreground are re-combined, delivering a significantly higher quality composited shot.

“Proxies were a smart solution to computer processing limits in the early days of digital compositing and colour grading,” Geissler commented. “Similarly, HFR StarTracker and NearTime for LED are a smart solution to shooting slo-mo shots in an LED volume.”

The HFR StarTracker for LED has been designed in anticipation of future GPU processing improvements enabling higher frame rate Unreal scene playback. In conjunction with Mo-Sys’ VP Pro XR content server, the HFR StarTracker’s programmable phase shift feature can utilise Brompton Technology’s Frame Remapping or Megapixel VR’s GhostFrame capability for simultaneous multi-camera shoots in an LED volume.

NearTime for LED will be available in Q4 to Mo-Sys customers with VP Pro and VP Pro XR. The HFR StarTracker for LED will be available early Q1 2022.

Aurora relies on GB Labs storage solutions to support live production of four flagship motorsport events 

Places GB Labs storage at the heart of its workflow for Goodwood’s 2021 Festival of Speed and Revival, Extreme E and Formula E.

GB Labs today announces that sports production specialist Aurora Media Worldwide has built live production workflows for four flagship motorsport events around its storage solutions. Aurora deployed the GB Labs FastNAS platform for the 2021 Goodwood Festival of Speed and is using the system for the inaugural Extreme E season, which kicked off in April 2021. In addition, GB Labs SPACE intelligent storage has been central to the production workflow for the 2020/2021 Formula E season. These projects see the continuation of a long-established partnership between Aurora, ccktech and GB Labs.

Aurora has been using GB Labs storage since 2015/2016, having invested in a Midi SPACE 3RU along with an additional SPACE system that was later extended and then upgraded in 2019 to the powerful HYPERSPACE intelligent storage system to gain greater capability and disk capacity. During a live race event, Aurora deploys two identical SPACE units trackside in the bespoke “POD” based OB facility with one acting as a fully redundant backup for the other. A GB Labs FastNAS storage platform was added to the workflow in January 2020 to support Extreme E’s remote production workflows.

Based on its robust and reliable performance, Aurora invested in a second dedicated FastNAS for the 2021 Goodwood Festival of Speed on 8-11 July and the Revival in August, to help support the sheer scale of live hours per day that the event requires. FastNAS, which delivers a highly cost-effective way to leverage huge performance and capacity, is also playing a central role during live production of the 2021 Extreme E and Formula E seasons. ccktech has been integral to bringing these projects together for Aurora over the last four years.

“Through our long relationships with GB Labs and ccktech, we are confident that they can meet all our needs, from performance and robustness to reliability,” said Lee Flay, Technical Director at Aurora Media Worldwide. “GB Labs gives us the ability to support all our post production, media management and live playout workflows in the field in a very compact footprint. Everything fits into a single flight case, which means less equipment to transport and a lower carbon footprint, which ties in nicely with the growing environmental consciousness of our industry and events like Extreme E.”

Dan Deadman, Head of Solutions at ccktech, explains: “We’ve been asked by Aurora to provide shared storage solutions both for on-location and in-house needs on many occasions, and we just keep going back to GB Labs. Their ease of use and configuration, for editors and tech admins alike, makes them a no-brainer for reliability in the high-intensity, fast-turnaround environment that Aurora’s productions demand.”

SPACE packs a punch in a small footprint, supporting the highest resolutions and providing fast efficient data transfers across 1Gb, 10Gb, 25Gb, 40Gb and 100Gb connections. The solution offers easy installation, maintenance, and upgrades, while IP connectivity ensures remote dial-in support.

Peter Walshe, UK Sales Manager at GB Labs commented, “Live sports production is one of the most fast-paced environments with zero room for error – and nothing comes close in terms of speed and the pressure for rapid turnaround of content as motorsports. We are extremely proud of the significant role our solutions play in helping Aurora deliver their world-beating service for some of the world’s most avidly followed motorsports events as well as the other premium productions they are involved in.”

Cinegy Air 21.9 and Multiviewer 21.11 Among Updated Solutions Being Showcased at IBC 2021

Cinegy will highlight a range of solutions at IBC 2021 (Hall 7 Stand A01), including Air 21.9 and Multiviewer 21.11, giving customers and partners an opportunity to learn about new functionality and features first hand. Mike Jacobs, Head of Professional Services at Cinegy will also participate in a panel discussion at the show; the session entitled ‘Cloud-based Workflows – what IP can do for production of content' takes place on Saturday 4 December.

Cinegy Managing Director Daniella Weigner commented: “We are excited to re-connect with our customers, industry friends and colleagues face-to-face and have provided a safe meeting space in which to demonstrate all the innovations and updates we have been busy working on since we last had the opportunity to gather together. IBC 2021 will give our customers the opportunity to find out how we can help them make the transition from traditional workflows to IP and cloud more straightforward.”

Products with recent updates being highlighted at the Cinegy stand include:

Cinegy Air 21.9 - Simplifying the increasingly complex process of playout, automation and file delivery, Cinegy Air 21.9 is an integrated software suite that acts as a broadcast automation front-end and a real-time video server for SD, HD, Ultra HD (4K) and/or 8K playout. For the ultimate flexibility, users can now play compatible-format and mixed-resolution content, as well as un-rendered edit sequences, straight to air. The system delivers EAS, Nielsen watermarking and Cinegy Titler channel branding in a single software solution.

Cinegy Multiviewer 21.11 - Customers today must deal with a rapidly growing number of streams from satellite, camera feeds and playout devices. Cinegy Multiviewer streamlines the process, displaying all these signals before analyzing them and raising alerts for any detected signal problems. The latest version offers significant improvements to video scaling performance and support for 8K formats for input devices. Running as a service on COTS hardware, video streams can be received over IP via Ethernet or using standard SDI cards.

Cinegy Capture 21.9 - Revolutionizing the acquisition and transcode process, Cinegy Capture now offers a range of updates and new features, such as cloud-ready architecture, full ingest control via a standard internet connection, support for audio input and WDM input devices, and added RS422 timecode source for SDI boards.

Cinegy Titler 21.9 - A straightforward way to add multiple layers of automation controlled, template-based titles, logos, animated graphics, and more. From simple ticker tapes and lower thirds to multi-layer character animations, Cinegy Titler is packed with advanced effects and features. The solution gives production teams an easy way to make changes on the fly as well as alter elements in a pre-created template. The Cinegy Titler template builder and title designer makes the task of building creative templates quick and easy.

The ‘Cloud-based Workflows – what IP can do for production of content’ session is scheduled for Saturday 4 December, 15:15 - 16:00, on the Production Stage. Panellists will discuss their vision for the future of remote and live production and the role that cloud will play, along with the benefits it delivers, including cost-efficient scalability, faster capture speeds and reduced investment in proprietary technologies.

Intinor joins RIST Forum to boost streaming interoperability

Intinor, Sweden’s leading developer of products and solutions for high-quality video over Internet, has joined The RIST Forum as an associate member. RIST – Reliable Internet Stream Transport – is the proponent of an interoperable, global specification for transporting live video over unmanaged networks.

The RIST Forum draws its membership from across the industry, and its expert group has poured hundreds of years of real-world experience into developing a robust solution, built on existing industry standards. The goal is to achieve consistent quality over the public internet, even when bandwidth is limited.

Detailed enough to ensure interoperability between systems, the RIST specification is still fluid enough to allow for innovation. This is important as it allows Intinor to continue to develop its very high performance systems while being certain they can interwork with other components.

“Interoperability between vendors, and so between production companies and broadcasters, is a vital consideration for us,” said Roland Axelsson, CEO of Intinor. “The RIST transport protocol is rapidly gaining support, and for us to add it was a no-brainer. Our membership of the RIST Forum is a clear signal of our commitment to the causes of seamless interworking.”

Suzana Brady, chair of The RIST Forum, added “Our project depends upon widespread adoption and commitment from both vendors and broadcasters. We are pleased to welcome Intinor as an associate member, and look forward to adding its undoubted experience and expertise to our collective understanding.”

RIST is designed for the reliable transport of professional video over the internet. Use cases include news and sports contribution, remote production, and distribution services. RIST provides a core set of functionality and behaviours across all implementations to ensure interoperability, while giving vendors the freedom to add their own advanced functionality.
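
As a rough illustration of how a RIST contribution link can be exercised with commodity tools, the sketch below drives ffmpeg from Python, assuming an ffmpeg build with librist enabled. The hostname, port and the ‘@’ listener notation follow librist’s URL conventions and are illustrative assumptions rather than an Intinor workflow.

import subprocess

# Sender: contribute a local transport stream to a remote RIST receiver.
send = [
    "ffmpeg", "-re", "-i", "input.ts", "-c", "copy", "-f", "mpegts",
    "rist://receiver.example.com:6000",
]

# Receiver: listen on port 6000 ('@' marks a listening endpoint in librist
# URL notation) and write the recovered stream to disk.
recv = [
    "ffmpeg", "-i", "rist://@0.0.0.0:6000", "-c", "copy", "output.ts",
]

subprocess.run(send, check=True)   # run recv on the receiving machine instead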

