Client News


Mo-Sys To Show Spectacular Realtime Augmented Reality At NAB

Mo-Sys Engineering, world leader in virtual and remote production, will showcase more of its virtual production technology stack at NAB Show 2022 (Las Vegas Convention Center, 24 – 27 April), where this year it is co-exhibiting with APG and Fujifilm on stand C6127.

Mo-Sys, using Fujifilm lenses and a state-of-the-art 1.5mm pixel pitch LED wall from APG, will show the following technology:

LED virtual production – using its VP Pro XR LED content server and StarTracker camera tracking technology, Mo-Sys will show its end-to-end LED production workflow, highlighting the benefits of designing a solution specifically for cinematic and broadcast virtual production. In addition, the team will show the latest multi-camera switching feature for VP Pro XR, along with Cinematic XR Focus for pulling focus between real and virtual elements, managed by a TeradekRT wireless lens controller, as part of a new collaboration between Vitec and Mo-Sys.

Solving real-time VFX graphics quality – NearTime® is Mo-Sys’ patent-pending, HPA Engineering Excellence award-winning answer to the image quality limitations of real-time VFX in virtual production. The solution combines a cloud-based auto-re-rendering system with Smart Green, a method of separating talent from the LED background without introducing copious green spill; together they deliver higher quality real-time VFX content. NearTime also removes moiré patterning completely and enables the use of lower cost LED panels while delivering an image quality that’s far closer to post-production compositing.

AR for sports – the new heavy-duty Mo-Sys U50 remote head will be shown with Fujifilm’s latest box lens and will be controlled remotely from the Vitec stand using a Vinten 750i remote head with pan bars. Visitors to the stand will be able to experience the smooth, precise motion of the U50 combined with the immediate response of the V750i, providing operators with the best possible experience.

In addition, a new Mo-Sys camera plate for the Vinten 750 head, containing Mo-Sys’ precision encoders for camera and lens tracking, will be shown for the first time. This new camera plate simplifies adding camera tracking capability to a static sports/event camera position for delivering precision blended AR graphics.

Robotics for virtual production – Mo-Sys will also show its new G30 gyro-stabilized head, offering a unique combination of tech-less setup, rapid, accelerated movement of the heaviest camera payloads, and a smooth stabilized image. Additionally, Mo-Sys will demonstrate its industry-standard L40 cinematic remote head. Both remote heads can be equipped with encoded outputs on all three axes.

Hitomi Showcases New Latency Feature For Its Flagship MatchBox Solution At NAB 2022

Hitomi Broadcast will showcase its new latency feature for MatchBox, the industry’s premier audio-video alignment toolbox, at NAB 2022, delivering complete end-to-end timing quality control assurance. This latest enhancement to MatchBox’s capabilities measures the actual time of flight of video signals from the front of multiple cameras, or at various points through the broadcast chain, with millisecond accuracy. With applications in remote and virtual production, this further strengthens Hitomi’s product offering.

Broadcast equipment adds delay, the amount of which can vary each time it is used. Delays need to be synchronised between different paths for a seamless viewer experience. Determining those offsets theoretically can take up a lot of engineering time but takes just seconds with MatchBox Latency.

A measurement is taken simply by holding an iPhone up in shot running the MatchBox Glass app, which is free to download. A signal is then sent back to the MatchBox Analyser located in an OB truck or MCR. Fast, easy to use and accurate, this solution simplifies the task of measuring latency and gives a lip-sync reading as well.
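The arithmetic behind the alignment itself is simple once each path has been measured. The short Python sketch below, using entirely hypothetical path names and values rather than Hitomi data or APIs, illustrates how measured per-path latencies translate into the delay each feed needs so that every path lines up with the slowest one:

```python
# Illustrative only: hypothetical paths and measured latencies, not Hitomi data.
measured_latency_ms = {
    "studio_camera_1": 80,
    "remote_contribution": 420,
    "graphics_engine": 150,
}

# Every feed is padded up to the slowest path so that all outputs stay in step.
slowest = max(measured_latency_ms.values())

for path, latency in measured_latency_ms.items():
    pad = slowest - latency
    print(f"{path}: measured {latency} ms, add {pad} ms of delay to align")
```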

Hitomi’s Broadcast Director, Russell Johnson, said, “When you watch someone being interviewed from a remote location there is often a pause in continuity with the handover from the studio and return. Latency measures how long that pause needs to be for each link in order to achieve perfect timing. It can ascertain if the pause is comfortable for a two-way conversation or if it is too long to be able to interact in a live situation and all questions have to be asked upfront.”

For measuring latency within the broadcast chain, MatchBox Generator can be deployed as a test signal with the Analyser picking up the results downstream. It can be thought of as a multi-meter for signal timing with a probe at each end of the section of interest.

“Much is talked about the need for ‘low latency’, but it is rarely quantified,” Johnson continued. “Now it can be. MatchBox is already deployed worldwide with major name broadcasters so widespread adoption of this new technology can be rapid. It will bring changes, but they will be measured changes.”

This product is designed to help broadcasters prepare for live transmissions as well as virtual studios and other applications where timing needs to be known, not guessed. It can form part of a toolset helping broadcasters move to more eco-friendly workflows whilst still retaining quality.

Find out more about the MatchBox solutions for lip-sync and latency in this video: https://youtu.be/LeHiUzhRn8s

Mo-Sys Partners With Assimilate To Integrate StarTracker With Live FX

Mo-Sys Engineering, a world leader in image robotics, virtual production and remote production, today announces that its precision camera tracking solution, StarTracker, is now fully supported by Assimilate’s Live FX software. The integration between StarTracker and Live FX offers a faster way to create high-end content with green-screen and LED wall-based virtual productions.

Both companies take a similar approach, focused on allowing filmmakers and artists to work in a way that is familiar to them rather than adapting to the environment of Unreal Engine (or other game engines). Live FX enables real-time, live compositing for green-screen and LED wall-based virtual productions on set; combined with Mo-Sys StarTracker, it delivers a one-stop-shop software solution for all kinds of virtual production workflows. Creatives gain the added benefit of a workflow that speaks their language and is much easier to learn.

“We are delighted to give customers access to a complete solution to accurately deliver high quality tracking data and compositing assets directly to VFX/post workflows,” said Michael Geissler, CEO of Mo-Sys. “With demand for virtual production rising rapidly, filmmakers and cinematographers can deliver the best results without the need to learn a totally new way of doing things.”

The integrated solution delivered by Mo-Sys and Assimilate allows users to efficiently and accurately composite 2D and 3D elements into the live camera signal, control (and color grade) LED walls, control stage lighting based on image content, use Live FX as a simple keyer in green-screen scenarios, and link it up with Unreal Engine if needed.

Jeff Edson, CEO at Assimilate commented, "For Assimilate, it has always been important to create streamlined workflows at the very high end that our users can rely on. We are proud to offer the best possible camera tracking in Live FX with the Mo-Sys StarTracker; together they create an unbeatable, highly accurate live-compositing system with a streamlined connection to post. The support provided by Mo-Sys to help with the implementation on our side was especially valuable to us.”

GB Labs Grows And Strengthens Team To Meet Rising Demand For Its Intelligent Storage Solutions

GB Labs today announces that it has appointed Tom Sheldon to the role of Chief Product Officer (CPO) and David Martin Bautista to Head of Development. Both Sheldon and Martin Bautista have worked their way up through the company, having started out as part of the development team. To meet demand for its intelligent storage solutions, GB Labs has also expanded the number of support staff, and grown its development and production teams, resulting in 20% growth in personnel.

“I am delighted that Tom and David are taking on these new leadership roles within the company and playing an active part in taking GB Labs through the next phase of our growth and evolution,” commented Dominic Harland, CEO and CTO, GB Labs. “I’m extremely proud of the way we have brought both of them up through the ranks and seen them grow in parallel with our company. I know that they will play a key role in bringing our vision for the future to fruition.”

Tom Sheldon has been with GB Labs since November 2010, progressing quickly to Head of Development, reporting directly to CEO and CTO Dominic Harland. He was responsible for growing the department and driving backend development. As CPO, he will be based at GB Labs’ Aldermaston HQ, taking the lead on the company’s product strategy and driving product development. Sheldon will also play an active role in maintaining strong synergy between the development and R&D teams and ensuring the GB Labs product roadmap is commercially solid.

GB Labs Chief Product Officer Tom Sheldon said, “At GB Labs we are passionate about continuously improving functionality and listening to customers to ensure that our products deliver what they need, while keeping the user experience as straightforward as we can. I am delighted to take on this new role and help drive our product roadmap in a direction that remains tightly aligned with our customers’ businesses."

David Martin Bautista joined GB Labs in February 2013 as a developer, taking on responsibility as lead frontend developer as GB Labs grew. As Head of Development, he will be responsible for managing the in-house development team, driving software updates and future innovation – including the UI. Martin Bautista will also act as a link between the development and support teams to ensure seamless communication between the two.

“When I started with GB Labs, the company appealed to me because it offered opportunities for growth and the chance to work with interesting technology. That remains true to this day,” said David Martin Bautista, Head of Development at GB Labs. “I have grown with the company, and I believe we are only just getting started; I’m excited to see where we go from here.”

First end-to-end live 5G Broadcast streaming to smartphones at MWC Barcelona 2022 with Qualcomm and Rohde & Schwarz

Rohde & Schwarz, a global leader in broadcast transmitter and media technologies, and Qualcomm Technologies, Inc., the driving force behind the development, launch and expansion of 5G, have joined forces to showcase 5G Broadcast with a full end-to-end live streaming demonstration at Mobile World Congress 2022.

In a live demonstration delivering content to smartphone devices, and showcasing Broadcast/Multicast capabilities over 5G, content provided by Cellnex Telecom will be re-transmitted using a 5G Broadcast signal over-the-air in Barcelona, giving show attendees a firsthand look at an advanced live mobile experience.

5G Broadcast offers network operators and broadcasters opportunities to create exciting consumer experiences across a range of new and existing business areas, all while enabling high spectral efficiency and reduced costs.

To bring this live demonstration together, Rohde & Schwarz provided its end-to-end 3GPP compliant solution, comprising a 5G Broadcast enabled R&S TLU9 transmitter, supported by a Spinner filter, and the Broadcast Service and Control Center (BSCC2.0) acting as a core network.

During the demonstration, a live signal will be transmitted over the air inside the Fira Gran Via in Barcelona from the Rohde & Schwarz booth, using sectorized antenna systems supplied by Cellnex, to a smartphone form-factor test device from Qualcomm Technologies.

The 5G Broadcast solution is built on the 3GPP Rel-16 feature-set, operating in a Receive-Only Mode (ROM), Free-To-Air (FTA) and without the need for a SIM card (SIM-free reception). The 5G Broadcast dedicated mode will be demonstrated with a standalone broadcast High Power High Tower (HPHT) infrastructure while operating within the UHF band.

“Having the right network will be key to developing 5G Broadcast services,” said Ramon Salat, Commercial Broadcast Director of Cellnex Spain. “Cellnex has the right combination of high and low towers to provide good coverage for population and territory,” added Salat.

“5G Broadcast opens up an exciting new world for the mobile communication ecosystem, bringing with it unrivaled user experience, new revenue opportunities and innovative service models,” commented Manfred Reitmeier, Vice President Broadcasting and Amplifier Systems at Rohde & Schwarz. “We are proud to collaborate with Qualcomm Technologies and Cellnex to bring a live experience that gives MWC2022 attendees a taste of what is now possible – and we are only just scratching the surface of the potential with 5G.”

“This is a unique showcase of the delivery of digital TV content over 3GPP standardized technology,” said Lorenzo Casaccia, Vice President of Technical Standards & Intellectual Property, Qualcomm Technologies, Inc. “We are proud to have collaborated with Cellnex and Rohde & Schwarz teams to bring this demo to life. Our new 5G R&D technology demonstration at Mobile World Congress proves this isn’t just possible, it’s here today on the show floor for all to experience and, perhaps most importantly, without the need for additional chipsets.”

The content included in the 5G Broadcast transmission comprises the Spanish public broadcaster Radio Televisión Española’s news channel “Canal 24h,” the main RTVE channel “La 1” and the Radio Nacional de España regional radio channel “Radio 4,” encoded using encoders provided by the Spanish manufacturer Cires21.

Broadcast/Multicast over 5G is not restricted to linear and live content distribution. For network operators and media content providers, it offers a completely new range of business models for delivering content or data to large numbers of consumers without affecting the regular 5G cellular mobile network. The venue and automotive sectors are particularly well suited to new consumer applications, while the high-power, high-tower, free-to-air/no-SIM mode offers emergency services and national authorities a more secure way to deliver public messages during natural disasters or emergencies.

Cinegy Showcases Latest Product Updates At IBC 2021

Cinegy will highlight a range of solutions at IBC 2021 (Hall 7 Stand A01), including Air 21.9 and Multiviewer 21.11, giving customers and partners an opportunity to learn about new functionality and features first-hand. Mike Jacobs, Head of Professional Services at Cinegy, will also participate in a panel discussion at the show; the session, entitled ‘Cloud-based Workflows – what IP can do for production of content’, takes place on Saturday 4 December.

Cinegy Managing Director Daniella Weigner commented: “We are excited to re-connect with our customers, industry friends and colleagues face-to-face and have provided a safe meeting space in which to demonstrate all the innovations and updates we have been busy working on since we last had the opportunity to gather together. IBC 2021 will give our customers the opportunity to find out how we can help them make the transition from traditional workflows to IP and cloud more straightforward.”

Products with recent updates being highlighted at the Cinegy stand include:

Cinegy Air 21.9 - Simplifying the increasingly complex process of playout, automation and file delivery, Cinegy Air 21.9 is an integrated suite of software that acts as a broadcast automation front-end and a real-time video server for SD, HD, Ultra HD (4K), and/or 8K playout. For the ultimate flexibility, users can now play compatible-format and mixed-resolution content, as well as un-rendered edit sequences, straight to air. The system delivers EAS, Nielsen watermarking and Cinegy Titler channel branding in a single software solution.

Cinegy Multiviewer 21.11 - Customers today must deal with a rapidly growing number of streams from satellite, camera feeds and playout devices. Cinegy Multiviewer streamlines the process, displaying all these signals, analyzing them and raising alerts for any detected signal problems. The latest version offers significant improvements to video scaling performance and support for 8K input formats. Running as a service on COTS hardware, Cinegy Multiviewer can receive video streams over IP via Ethernet or using standard SDI cards.

Cinegy Capture 21.9 - Revolutionizing the acquisition and transcode process, Cinegy Capture now offers a range of updates and new features, such as cloud-ready architecture, full ingest control via a standard internet connection, support for audio input and WDM input devices, and added RS422 timecode source for SDI boards.

Cinegy Titler 21.9 - A straightforward way to add multiple layers of automation-controlled, template-based titles, logos, animated graphics, and more. From simple ticker tapes and lower thirds to multi-layer character animations, Cinegy Titler is packed with advanced effects and features. The solution gives production teams an easy way to make changes on the fly as well as alter elements in a pre-created template. The Cinegy Titler template builder and title designer make the task of building creative templates quick and easy.

The ‘Cloud-based Workflows – what IP can do for production of content’ session is scheduled for Saturday 4 December, 15:15 - 16:00, on the Production Stage. Panellists will discuss their future vision of remote and live production and the role that cloud will play, along with the benefits it delivers, including cost-efficient scalability, faster capture speeds and the reduction of investment in proprietary technologies.

Custom Media Solutions Equips Its New 40-Foot Broadcast Production Truck with FOR-A HVS-2000 Video Switcher

FOR-A Corporation today announced that Custom Media Solutions (CMS), a live event video production company based in Alpharetta, Ga., has anchored its new 40-foot broadcast production truck with a FOR-A HVS-2000 video switcher. The truck has been in use since mid-January.

Marc Shroyer, president of CMS, said the company took delivery of the truck in early November. “We got it to replace our 24-foot production trailer,” he explained. “It was great, but you could only put eight people in it, and there was not a great space for audio. When everything started going virtual, we really needed to create some space for broadcast audio.”

Originally, the truck had been designed to use CRT monitors, so CMS modernised the interior with an eye on flexibility. The infrastructure was designed to support 12G with all I/O via fibre, so the truck can be upgraded by swapping only a few components. Plus, audio is handled through Dante digital audio networking.

Shroyer reported that CMS recently added five new 4K broadcast cameras that produce footage in a variety of frame rates, including 24p, which is a feature that clients had been requesting. The FOR-A switcher supports 24p, as well as a number of HD and SD formats.

CMS currently owns 10 FOR-A switchers, including four HVS-2000s. “The HVS-2000 can do everything we need, and its price point is significantly less than the switchers many companies would expect to find in a truck of this size,” said Shroyer. “If you want to talk about bang for the buck, we were doing things on this switcher that you normally need a much more expensive switcher to do. I can do anything the client wants to do. It’s our secret sauce.”

“It’s easy to add a panel. You just plug in a network cable and you’re ready to go,” Shroyer added. “We have plenty of I/O, plus plenty of multi-view and keyers and DVEs – and that creates a lot of flexibility.”

Compatible with four different control panels, the HVS-2000 2 M/E video switcher offers 24 inputs and 18 outputs as standard, and can be expanded up to 48/18 or 40/22 with optional I/O cards. The switcher also includes MELite™ technology that previews output from an AUX bus when applying transitions or keying for expanded M/E performance, as well as FLEXaKEY architecture for flexible reassignment of keyers separate from the standard keyers of full M/E buses.

FOR-A provides complete production kit for prestigious Emirates Draw broadcasts

FOR-A has delivered a complete flyaway production kit to Media and Art Production (MAP), based in Fujairah, United Arab Emirates. The kit was designed to provide the technology platform for Emirates Draw, one of the most watched television programmes in the UAE.

MAP provides complete production services, from creative concepts to delivery. Emirates Draw is a live draw and entertainment programme, broadcast each Sunday in prime time. Now, with MAP’s flyaway production system, it broadcasts from a different location each week.

MAP was already familiar with the FOR-A range of production switchers, but once negotiations started it became clear that the company could provide a large part of the technology platform in readily integrated equipment. The complete solution delivered to MAP included the HVS-490 production switcher, MFR-3000 routing switcher with multiviewer, Insight servers and Envivo replay management, ClassX graphics platforms, and a multi-channel signal processor.

Emirates Draw is currently produced in HD, but a major benefit of the FOR-A architecture is that it can be upgraded to 4K Ultra HD at the flick of a switch. The system is built on SDI (with quad SDI for 4K when required), which allows rapid configuration at each location and the ability to use existing infrastructure when the programme uses fixed facilities at Dubai Studio City.

“Our business is in delivering live events and television to the highest possible standards,” said Imad Bassil, owner and GM of MAP. “The new production system we developed built on FOR-A kit allows our team to focus on creating exciting, engaging television, knowing that the underlying technology will deliver what we expect, every time, whatever configuration we need for that particular show.”

Mohammed Abu Ziyadeh of FOR-A’s Dubai centre added: “This was a great showpiece project for us. We demonstrated that FOR-A could deliver all the quality, reliability and ease of use they demanded; that we could provide technical and operational support to MAP’s in-house engineering team; and that we could complete the project to their tight timescales.” The system was first used for a special programme on the UAE National Day, 2 December 2021. Since then, it has been used for the regular Sunday evening Emirates Draw shows.

iSIZE Joins AWS Partner Network and Completes Foundational Technical Review

iSIZE, a specialist in deep learning for video delivery, today announces that it has joined the AWS Partner Network (APN), the global community of businesses using Amazon Web Services (AWS) to build solutions and services for customers.

In addition to joining the APN, iSIZE has completed an AWS Foundational Technical Review (FTR) of its award-winning, patented BitSave deep psychovisual pre-processing technology. BitSave pre-processes the input video prior to encoding, removing imperceptible information that is costly for any existing video encoder to compress. The result is greater efficiency than encoding the non-preprocessed content: typically 15%-25% bitrate and energy savings for the encoder used, verified by standard quality metrics (VMAF, SSIM, etc.) that are widely used in the industry. The FTR enables AWS Partners to identify and remediate risks in their products or solutions. It is led by an AWS Partner Solutions Architect (PSA), who reviews AWS Partner products and solutions against a specific set of requirements based on the Security, Reliability, and Operational Excellence pillars of the AWS Well-Architected Framework.

iSIZE CEO Sergio Grce commented, “We are delighted to join the AWS Partner Network. By being a part of the APN, we can enable our customers to maximise existing technology investments, benefit from greater efficiencies when processing video streams, and provide better video streaming experiences to millions of users worldwide.”

Using iSIZE’s technology prior to conventional encoding makes the encoding significantly more efficient, saving bitrate at the same or improved visual quality. An encoder-agnostic solution, BitSave also allows the encoder to be used with simpler encoding recipes, making it faster, cutting data centre costs and significantly improving the sustainability of video streaming. It requires no changes to encoding, delivery or decoding devices.
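As a rough illustration of this "pre-process, then encode as usual" pattern, the Python sketch below runs the same stock encoder over an untouched source and over a pre-processed copy, then compares the resulting file sizes. The pre-processing step here is just an ffmpeg denoise filter standing in for the proprietary stage, purely to keep the example runnable; it is not iSIZE's BitSave, and the file names are hypothetical.

```python
# Sketch of a pre-process-then-encode comparison. The denoise filter below is a
# crude stand-in for a perceptual pre-processor; it is NOT BitSave.
import subprocess
from pathlib import Path

def encode(src: Path, dst: Path, crf: int = 23) -> None:
    """Encode video-only with a stock, unchanged encoder (libx264 via ffmpeg)."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src), "-an",
         "-c:v", "libx264", "-crf", str(crf), str(dst)],
        check=True,
    )

def preprocess(src: Path, dst: Path) -> None:
    """Stand-in pre-processing stage: light denoise to a lossless intermediate."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src), "-an",
         "-vf", "hqdn3d", "-c:v", "ffv1", str(dst)],
        check=True,
    )

source = Path("mezzanine.mov")            # hypothetical input file
baseline = Path("baseline.mp4")           # reference encode, no pre-processing
intermediate = Path("preprocessed.mkv")   # lossless intermediate
optimized = Path("optimized.mp4")         # pre-processed, then same encoder settings

encode(source, baseline)
preprocess(source, intermediate)
encode(intermediate, optimized)

saving = 1 - optimized.stat().st_size / baseline.stat().st_size
print(f"Bitrate saving at identical encoder settings: {saving:.1%}")
```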

Mo-Sys To Demonstrate Ground-Breaking Virtual Production Workflow At HPA Tech Retreat  

Mo-Sys Engineering, a world leader in image robotics, virtual production and remote production, will join forces with Moxion, QTAKE and OVIDE to show a ground-breaking collaborative virtual production workflow at the HPA Tech Retreat 2022. As a Gold Sponsor of the in-person event, Mo-Sys will take part in the Supersession entitled “Immerse Yourself in Virtual Production”.

Located in the Innovation Zone, Mo-Sys, Moxion and QTAKE will have a combined stand featuring a Sony Crystal LED wall along with a Sony Venice camera mounted on a curved rail. The camera will be equipped with a Mo-Sys StarTracker camera tracking system, and Mo-Sys’ VP Pro XR LED content server will drive the LED wall.

Takes will be captured by QTAKE, the industry’s preferred on-set capture tool, whilst VP Pro XR drives the LED wall and captures the camera and lens tracking data. The new Mo-Sys NearTime® service, a 2021 HPA Engineering Excellence award winner, will be used to solve one of the key challenges of LED in-camera VFX (ICVFX) production: the trade-off between increasing real-time Unreal image quality and maintaining real-time frame rates. Higher quality re-renders from NearTime, which utilizes cloud processing from AWS, will then be automatically delivered back to the Moxion Immediates solution, winner of the 2020 HPA Engineering Excellence Award, for review and signoff.

Using NearTime in an LED volume, with a Mo-Sys enabled ‘halo green’ frustum separating talent from the LED content, the background Unreal scenes can be automatically re-rendered at higher quality or resolution using the captured camera and lens tracking data, and then used to replace the original lower quality background Unreal scenes. This process avoids the need for large quantities of on-set rendering nodes, minimizes post-production costs, and eliminates moiré effects. This unique approach enables far more efficient workflows than those that exist today.
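As a purely conceptual sketch of that "record the tracking on set, re-render the background later at higher quality" pattern, the Python below batches takes into offline re-render jobs keyed on their recorded camera and lens tracking files. Every file name, field and job format here is an illustrative assumption, not Mo-Sys NearTime's actual interface.

```python
# Conceptual only: hypothetical take metadata and job format, not a real NearTime API.
from dataclasses import dataclass
from pathlib import Path

@dataclass
class Take:
    take_id: str
    tracking_file: Path   # per-frame camera and lens tracking recorded on set
    live_render: Path     # background as rendered in real time during the shoot

def build_rerender_job(take: Take, quality_preset: str = "cinematic") -> dict:
    """Describe an offline job: same scene and camera path, higher quality settings."""
    return {
        "take": take.take_id,
        "tracking_data": str(take.tracking_file),
        "preset": quality_preset,          # more samples / higher resolution than real time
        "replaces": str(take.live_render), # deliverable swaps out the live background
    }

takes = [Take("T012", Path("tracking/T012_track.json"), Path("renders/T012_live.mov"))]
for job in (build_rerender_job(t) for t in takes):
    print("submit to cloud render farm:", job)
```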

“We believe there is a better way to bring virtual productions to life cost-effectively and without compromising on image quality,” commented Michael Geissler, CEO of Mo-Sys. “We are excited to meet with customers face-to-face once more at the HPA Tech Retreat and to show them workflows that can take their virtual productions to the next level while allowing them to work with other tools that they are already familiar with. We are proud to partner with some of the leading names in the industry to show what is possible with virtual production today.”

The VP Supersession, taking place Tuesday, February 22, will walk attendees through virtual production from conception to delivery, with a special emphasis on the planning and preparation that are critical to a successful outcome. Three LED walls, from AOTO, Planar and Sony, will be showcased, two of which will feature the Mo-Sys StarTracker camera and lens tracking system. One of the LED walls will be driven by Mo-Sys’ VP Pro XR LED content server, which will also be showing its Cinematic XR Focus feature for pulling focus between real and virtual objects positioned virtually ‘behind’ the wall.

At HPA Tech Retreat 2022, Mo-Sys will also host a series of breakfast roundtables around the theme “Post-production workflows for LED volumes”, aimed at helping cinematographers and creative professionals elevate virtual production.


Contact us


Please contact us with any questions.

Manor Marketing, 4 Olympus House, Calleva Park, Aldermaston, RG7 8SA