Mo-Sys VP Pro 4.27 to launch simultaneously with Epic Games’ release of Unreal Engine 4.27

Mo-Sys™ Engineering (www.mo-sys.com), a world leader in virtual production and image robotics, today announces the launch of Mo-Sys VP Pro 4.27 with a raft of new features that support Epic Games’ Unreal Engine 4.27. During the preview period of Unreal Engine 4.27, Mo-Sys used the new update to complete a full multi-camera shoot with Amazon. The update was also put through a comprehensive multi-camera technical rehearsal for a major Netflix production.

Mo-Sys has been expanding the feature set and capabilities of VP Pro, which runs inside the Unreal Engine Editor, for some time, and will launch the new version on the same day as Epic Games’ Unreal Engine 4.27. The VP Pro 4.27 upgrade brings four key new features: an improved compositor, NearTime® rendering, an online lens distortion library, and remote control capability.

Michael Geissler, CEO Mo-Sys Engineering Ltd., said, “Virtual production is seeing a real surge, but it does bring challenges. Typically, directors and cinematographers must make continuous calculations to make the real and virtual elements match in post-production. This is often a costly and repetitive process. Mo-Sys is a pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher quality virtual productions. With VP Pro 4.27 we have made it even easier to produce seamless, high-end productions and we are also unique in making our 4.27 update available on the same day as Epic Games launch Unreal Engine 4.27.”

Mo-Sys has completely overhauled its internal compositing system, changing the way compositing is handled across all modes. The latest update of VP Pro now provides improved support for high-end graphics features, such as refraction, and delivers a 15% performance improvement. Other features include:

· Support for reflection and refraction of video in CG objects

· Improved support for advanced ray-tracing features

· Support for fur and groom

· Advanced controls for CG shadows falling onto video

The updated VP Pro 4.27 also widens the Beta program for the new NearTime® rendering workflow, giving more users access. NearTime is an automated, cloud-based re-render that dramatically increases and homogenises visual quality and allows higher resolutions (UHD, 4K, 8K) without the performance restrictions of real time. The moment a shot is complete, a high-quality, high-resolution NearTime render starts in the cloud, with no compromise on performance or visual quality. With NearTime, users can “turn up all the dials” for quality unrestricted by real-time constraints, at a fraction of the cost of completing the process in post-production.

As part of the 4.27 update, Mo-Sys is also launching its online lens distortion library, giving users access to a wide selection of calibration tools and allowing them to tweak their lenses on set in a highly cost-effective way.

New for VP Pro 4.27, the VP Remote iPad interface is fully customizable and supports control of multiple engines, and camera chains, from a single control panel. The Beta version was trialled at the technical rehearsal with Netflix.

C.S. Innovation Technology Deploys Cinegy Air Playout Software to Get Summer Games Action to Viewers Across Thailand

Bangkok-based C.S. Innovation Technology (CSIT) has deployed Cinegy technology to support production and playout for its client Plan B Media at the Summer Games in Tokyo. A long-time Cinegy partner, C.S. Innovation Technology will rely on Cinegy Air PRO, a comprehensive, IP-based software suite for automated SD, HD and/or Ultra HD (4K) playout, to handle multi-channel playout of live feeds from the event effortlessly and inexpensively. Plan B Media, the exclusive Thai rightsholder for the Tokyo Games, is delivering live feeds of all the action to giant LED screens across Thailand.

C.S. Innovation Technology’s Managing Director Chinnarong Ooragool said, “We have complete confidence that the Cinegy system will match our needs during the live events in Tokyo. This excellent software easily allows us to support full HD online playout and GPU encoding on NVIDIA hardware, making the workflow faster and more cost-effective.”

Cinegy Air PRO supports various encoding formats, EAS, watermarking and Cinegy Titler channel branding in a single software platform. The solution also enables high frame rate Ultra HD formats (50/60p), includes integrated HEVC stream encoding, and allows users to offload HEVC and H.264 stream encoding to an NVIDIA GPU. Air PRO is also IP-enabled and works in fully virtualized or cloud environments, making it easy for users to control multiple channels, insert regionalized commercials and add graphics and channel branding using these next generation broadcast tools.
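
To make the GPU offload concrete, the sketch below shows what HEVC encoding on an NVIDIA GPU can look like at code level, using FFmpeg’s libavcodec and its “hevc_nvenc” encoder as a generic stand-in. This is an illustrative assumption rather than Cinegy’s implementation, and the resolution, frame rate and bitrate values are arbitrary.

```c
/* Hypothetical sketch of GPU-offloaded HEVC encoding using FFmpeg's
   libavcodec with the NVIDIA NVENC encoder ("hevc_nvenc").
   A generic illustration, not Cinegy's playout software. */
#include <string.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    const AVCodec *codec = avcodec_find_encoder_by_name("hevc_nvenc");
    if (!codec) return 1;                       /* no NVENC-capable build/GPU */

    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    ctx->width     = 1920;
    ctx->height    = 1080;
    ctx->time_base = (AVRational){1, 50};       /* 50p, as mentioned above */
    ctx->framerate = (AVRational){50, 1};
    ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
    ctx->bit_rate  = 8 * 1000 * 1000;
    if (avcodec_open2(ctx, codec, NULL) < 0) return 1;

    AVFrame *frame = av_frame_alloc();
    frame->format = ctx->pix_fmt;
    frame->width  = ctx->width;
    frame->height = ctx->height;
    av_frame_get_buffer(frame, 0);

    AVPacket *pkt = av_packet_alloc();

    /* Encode 50 black frames; a playout engine would feed live video here. */
    for (int i = 0; i < 50; i++) {
        av_frame_make_writable(frame);
        memset(frame->data[0], 16,  (size_t)frame->linesize[0] * ctx->height);
        memset(frame->data[1], 128, (size_t)frame->linesize[1] * ctx->height / 2);
        memset(frame->data[2], 128, (size_t)frame->linesize[2] * ctx->height / 2);
        frame->pts = i;

        avcodec_send_frame(ctx, frame);
        while (avcodec_receive_packet(ctx, pkt) == 0)
            av_packet_unref(pkt);               /* packets would go to the muxer / IP output */
    }

    avcodec_send_frame(ctx, NULL);              /* flush the encoder */
    while (avcodec_receive_packet(ctx, pkt) == 0)
        av_packet_unref(pkt);

    av_packet_free(&pkt);
    av_frame_free(&frame);
    avcodec_free_context(&ctx);
    return 0;
}
```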

Cinegy Managing Director Daniella Weigner added, “We are delighted that our Cinegy Air PRO Bundle will play such a key role in delivering the live action from the games to sports fans across Thailand. Our solution gives CSIT the power to get content on air instantly, eliminating the need for multiple complex processes and interconnecting hardware components. As a result, playout can be handled at a fraction of what it would cost by conventional means.”

Using Cinegy Air PRO, the CSIT team can very easily control multiple channels, insert regionalized commercials and add graphics and channel branding.

Brazilian IT media services company MDotti Tecnologia has created a dedicated workflow around Cinegy Capture PRO to support ingest for São Paulo-based television network Rede Bandeirantes’ coverage of the Tokyo Games. Cinegy Capture PRO simultaneously delivers 12 channels of ingest directly to a ZBoox shared storage system, allowing the production team to access and edit clips via Adobe Premiere Pro workstations while live feeds are still being ingested.

MDotti Tecnologia created a temporary, parallel infrastructure to handle the Games, removing the need for Rede Bandeirantes to redirect any of its existing infrastructure, which is already at capacity, to support the event.

“We researched many solutions and Cinegy Capture PRO was the one that met all our needs, giving us the reliable ingest, multi-destination and collaborative capability that are essential when managing a project of this magnitude. The licence rental model that Cinegy offers also fits our needs, allowing us to spin up capability for events as needed,” commented Lucas Maia, Operations Director at MDotti Tecnologia. “Supporting the live event from Tokyo would not have been possible with Rede Bandeirantes’ existing capability, so renting the workflow for the event allowed us to deliver an effective but low-cost solution.”

Cinegy Capture PRO delivers cost-effective, centralized ingest and is built from the ground up as a totally independent ingest tool. It combines many of the industry-leading components developed for other Cinegy products to unify the process of ingesting material and generating edit or web proxies. The solution can be used as an application by any user on the network thanks to its simple yet powerful cross-platform control client. Cinegy Capture PRO runs on a standard PC with the addition of one or more SDI cards.

Using the system, the Rede Bandeirantes team is able to record to multiple storage systems simultaneously and ingest live streams in real time as XDCAM, with edit-while-ingest capability. Cinegy Capture PRO is also hardware agnostic, allowing it to work seamlessly with MDotti’s regular workstation of choice.

Cinegy Managing Director Daniella Weigner said, “We are very proud of the key role Cinegy Capture PRO is playing in bringing the exciting live action from Tokyo to viewers in Brazil. In any large live sports production environment, a fast turnaround is critical as audiences don’t want to see something on social media before they see it on the big screen. Our Capture solution turns the traditional acquisition and transcode process on its head to deliver a fast, easy, collaborative workflow that is cost effective and straightforward to deploy.”

Rede Bandeirantes plans to turn to the same workflow, supporting five channels of ingest, for a new live show in the coming months.

GB Labs and Archiware today announce integration between the GB Labs storage platforms and the Archiware P5 data management solution to deliver maximum security for ongoing and completed productions. Customers now have the flexibility to choose from different storage devices for backup and archive such as disk, LTO tape and cloud storage.

GB Labs storage platforms, such as SPACE, Echo and FastNAS, allow content creation teams to collaborate on files and projects for increased productivity and creative freedom. These products give Mac, Linux and Windows users simultaneous access to projects using 1 to 100Gb network connections. They are all designed from the ground up to be easy to install, maintain and upgrade, and the system allows easy configuration of access rights, storage quotas and permitted bandwidth for each user. Moving data between tiers of GB Labs units is optimised by the CORE OS intelligence to increase workflow efficiency.

With P5 now able to run natively on GB Labs devices, the latest CORE.4 OS allows users to configure the P5 client via the integrations tab to access files on GB Labs storage products for backup and archive. The integration provides an efficient way to protect production and provide business continuity. The P5 platform offers enormous flexibility in configuration, setup, storage and policies. The synergy between the two systems also means production is protected in multiple ways.

P5 Backup protects ongoing production against accidental deletion, file corruption and other mishaps; scheduled, automatic backups are the best way to keep files safe. The optimised restore process hands any file back identically (including extended attributes, ACLs, etc.) so production can continue. P5 Backup works with disk, tape and cloud storage to provide maximum flexibility and fulfil any requirement, and encryption is available for both transfer and storage.

P5 Archive migrates finished projects and their assets to disk, tape or cloud to preserve them for the long term. Finding files at a later date is easy with its MAM-like features, customisable metadata fields, thumbnails for still images and proxy clips for videos. Combined search and visual browsing functionality helps to locate files when they are needed for re-use, reference and monetisation. Full LTFS integration (ISO/IEC) provides import, export and archiving on LTFS tapes, and the system makes it easy to catalogue and include existing third-party LTFS tapes in the P5 Archive.

Howard Twine, Chief Product Officer for GB Labs, said, “We are delighted to bring this latest CORE OS update to our users, allowing them to easily configure the Archiware P5 client so that it can access files for either backup or archive (or both). This gives our customers the flexibility to choose different storage devices for backup or archive, such as LTO tape from other vendors, offering a ‘best of breed’ approach.”

Dr. Marc Batshkus, Director of Marketing and Business Development at Archiware, confirmed, "We are proud to have GB Labs as an integration partner for P5. Our shared strengths, such as the focus on customer experience, help us offer solutions that are powerful, cost-effective and easy to use. Especially in media production, the accessibility of any solution is key to improving productivity."

Both P5 Backup and P5 Archive support LTO tape drives and tape libraries from all vendors, ensuring the highest durability, a shelf life of decades and the lowest TCO of all professional storage media, starting from 10 €/USD/GBP per TB. When using multiple drives, throughput can grow (and almost multiply) with each drive added, through P5's drive parallelisation feature. For maximum security and offsite storage, P5 offers tape cloning to create two identical tape sets with two LTO drives.

Mo-Sys™ Engineering recently launched its new Onset VP Services office, based in Los Angeles, to provide an outsourced solution for production companies new to virtual production. This has enabled US creative and production specialist Papertown to quickly and accurately create an ambitious virtual production project featuring a passenger plane in a hangar.

Papertown is an experienced creative production agency specializing in computer-generated imagery (CGI). Its client, Business Made Simple, one of the most successful management coaching organizations in North America, had the concept of a management training course based around the analogy that every business should run like an airplane, so a plane in a hangar was chosen for the video concept.

Renting and preparing an aircraft for shooting inside a hangar would have been expensive and impractical, so Papertown proposed shooting the speaker against a green screen and creating the plane, hangar and other items in CGI. This is where Mo-Sys Onset VP Services came into play, bringing the virtual production expertise required.

Having just opened a branch in LA offering virtual production services to assist with this new way of filming, the Mo-Sys Onset VP Services team were able to help Papertown set up a virtual production studio to merge the CGI and real world elements together. This arrangement enabled Papertown to focus on imaging and storytelling without the time-drain of organizing their own virtual production workflow or needing advanced knowledge of the latest technology.

Mo-Sys, world leader in camera tracking, image robotics and virtual production solutions, created StarTracker to provide precision 6-axis camera tracking, enabling 3D movement in a virtual production scene. Combined with highly accurate real-time lens data, the full StarTracker data set via Mo-Sys VP Pro drives the Unreal Engine’s virtual graphics so they accurately emulate the real camera shot, delivering convincing virtual scenes.
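
As a conceptual illustration of this data flow, the short C sketch below applies one 6-axis camera pose plus live lens data to a virtual camera per tracking sample. The structures, values and print-out are hypothetical, invented for illustration; the real StarTracker and VP Pro integration uses Mo-Sys’ own protocol and drives Unreal Engine camera objects directly.

```c
#include <stdio.h>

/* Hypothetical data structures for illustration only. */
typedef struct {
    double x, y, z;            /* camera position in metres           */
    double pan, tilt, roll;    /* camera orientation in degrees       */
} CameraPose;

typedef struct {
    double focal_length_mm;    /* zoom                                */
    double focus_distance_m;   /* focus                               */
    double k1, k2;             /* radial lens distortion coefficients */
} LensData;

/* Apply one tracking sample to the virtual camera for the current frame.
   A real system would update the engine's cine camera transform, focal
   length and distortion model; here it simply reports the values. */
static void update_virtual_camera(const CameraPose *p, const LensData *l)
{
    printf("pos=(%.2f, %.2f, %.2f) pan=%.2f tilt=%.2f roll=%.2f "
           "focal=%.1fmm focus=%.2fm\n",
           p->x, p->y, p->z, p->pan, p->tilt, p->roll,
           l->focal_length_mm, l->focus_distance_m);
}

int main(void)
{
    /* Two invented tracking samples standing in for a live per-frame feed. */
    CameraPose pose = { 1.20, 1.65, -3.40, 12.5, -4.0, 0.0 };
    LensData   lens = { 35.0, 2.80, -0.05, 0.01 };

    update_virtual_camera(&pose, &lens);

    pose.pan += 0.4;                 /* the operator pans slightly...       */
    lens.focus_distance_m = 2.60;    /* ...and the focus puller racks focus */
    update_virtual_camera(&pose, &lens);
    return 0;
}
```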

Using StarTracker’s ability to lock the virtual graphics to the real world, the cinematographer could frame each shot easily because the composite image was available in the viewfinder, and the production team was able to record the finished output, eliminating the need for time-consuming post-production compositing. In fact, Papertown was able to shoot the entire three-hour video series, amounting to dozens of terabytes of premium cinematic footage, in just two studio days.

“We had no experience of Mo-Sys products before this shoot, but we had all the support we needed to quickly put our creative to work,” said Papertown founder and Executive Creative Director Julian Smith. “Being able to use StarTracker, VP Pro and Unreal together extended an insane amount of value to our clients. It took our photo-real CGI to a whole new level. This is a game changer. Papertown is always pushing the boundaries with new ideas and technology and we are grateful to have partnered with a team like Mo-Sys that does the same.”

Michael Geissler, CEO of Mo-Sys, added, “Whether for television, movies or corporate work, producers are looking for the highest possible quality at maximum productivity. Shooting final pixel like this is a tremendous boost. But it can only realistically be done with precision camera and lens tracking, real-time compositing, and a virtual production operator who is familiar with the Unreal Engine.”

StarTracker and VP Pro are available now. Mo-Sys has just released the VP Pro XR server solution specifically aimed at LED volumes.

Pixel Power, the global leader in master control playout, automation and graphics, has delivered a StreamMaster BRAND software-defined graphics engine to WDSC-TV in Daytona Beach, Florida. The sophisticated graphics and branding system operates autonomously under the control of the station’s existing ADC automation system.

StreamMaster BRAND draws on Pixel Power’s four-decade legacy of excellence in graphics to deliver live lower thirds, bugs, promos and other branding. A set of defined rules determines its content, and insertion is triggered by the station automation. At WDSC, a link between StreamMaster BRAND and the broadcaster’s Myers Pro Track broadcast management system delivers the metadata required to populate the graphics templates and voiceovers.

WDSC had an existing, simpler graphics insertion device which was life-expired and no longer supported. Pixel Power was able to offer a direct, plug-compatible replacement that interfaces to the station’s ADC automation system while delivering greatly extended functionality and graphics quality. As a software-defined, virtualized device, StreamMaster BRAND is resolution independent (upscaling existing standard definition assets) and can work within IP or SDI environments.

“Pixel Power assured us that the StreamMaster software would be operationally identical to our old system, and so it proved to be,” said Larry Lowe, station manager of WDSC-TV at Daytona State College. “We unplugged the old device, plugged in the StreamMaster BRAND and we were on air. It would be great if all systems were as well conceived and implemented, and in the time of Covid-19 to be able to commission something remotely without the need for teams of engineers getting involved is really important.”

Mike O’Connell of Pixel Power Inc. added, “In a cost-effective, compact package we could first meet the current requirements for WDSC, then add the option for much more. The software licencing model for StreamMaster BRAND means that users can determine exactly what functionality they require, like realtime 3D graphics and multi-channel DVE, and precisely meet their branding image and budget.”

The StreamMaster BRAND at WDSC is now on air.

SEQUOIA, an R&D project partnership between iSIZE, BBC R&D and Queen Mary University of London (QMUL), has been awarded £700k from Innovate UK following a competitive grant submission and a rigorous review and award process (<5% acceptance rate). Innovate UK is part of UKRI, the national funding agency investing in science and industrial research in the UK.

SEQUOIA looks at the way new technology, including artificial intelligence, can radically change the way we distribute video content. It is a response to the pressing need for video streaming to become more sustainable, addressing the challenges faced by the media sector in tackling the surge in online media consumption, which is placing unprecedented stress on network infrastructures worldwide. As well as imposing content delivery bottlenecks, this massive load on the internet infrastructure affects how efficiently content can be distributed to larger numbers of viewers, and contributes to its environmental footprint.

The project recognises that innovation in video streaming is urgently required. It is looking at perceptual optimisation of video streams as a way of making significant reductions in the bandwidth required for equal quality. This is at the heart of iSIZE’s work, and the company has built extensive IP and expertise in this domain. This will be combined with innovations in encoding technologies and optimisation pursued by BBC R&D and QMUL.

“The problems facing video streaming are real and represent significant environmental issues,” said Sergio Grce, CEO of iSIZE. “The increase in video encoding complexity is outpacing Moore’s Law, and some respected researchers suggest that the carbon footprint of the internet is greater than that of aviation. So this is an issue that must be addressed."

“We are very excited to be working with the BBC and QMUL on this project,” he added. “SEQUOIA brings us together with the BBC and QMUL to advance video streaming, incorporating our expertise in deep perceptual optimisation and the latest cutting-edge AI innovation. This project will deliver significant financial and environmental improvements for video streaming.”

Disruptive innovation for video streaming is urgently needed: new pre- and post-processing, encoding and delivery tools that are device-aware and cross-codec compatible. This is vital to meet the growing demand for online video while reducing processing, energy and storage requirements.

This project will make an impact at every stage in the media distribution chain, demonstrating its results on operational and portable encoder designs, applicable both to video on demand and live streams. This will lead to benefits for the whole sector, demonstrating technology to enable sustainable distribution of Ultra High Definition content, while limiting the impact of video on internet traffic and reducing distribution costs. Extending beyond the commercial benefits, project outcomes will be devised to support environmentally conscious solutions by monitoring and proactively reducing energy consumption at all stages within the media value chain.

The partnership of iSIZE, BBC and QMUL brings unique expertise in AI, video coding standardisation, adaptive video pre/post-processing and streaming, perceptual optimisation and interoperable software architectures to work collaboratively towards these challenging objectives.

Red Bee Media, a leading global media services company, has selected the PRISMON multiviewer system from Rohde & Schwarz. It will replace life-expired SDI systems in its playout facilities in the UK with a highly flexible, distributed, IP-connected multiviewer network.

Red Bee is the trusted delivery partner to some of the world’s strongest broadcasters and media brands, providing a complete delivery outsourcing service for broadcast playout and online streaming. From its broadcast facilities in London and Salford, Red Bee provides playout and disaster recovery facilities for more than one hundred channels in a variety of management and monitoring configurations.

To ensure that each operator can monitor the right signals, Red Bee requires a readily reconfigurable multiviewer system, which led it to choose PRISMON from Rohde & Schwarz. PRISMON is a software-defined multiviewer running on COTS hardware, capable of handling signal formats up to 4K Ultra HD, both for displays and for individual streams within mosaics.

Red Bee selected the SDM (Scalable Distributed Multiviewer) and MCC (Multiviewer Control Centre) options. SDM, unique to Rohde & Schwarz, enables any input to be routed to any output and allows PRISMON to share resources across a network, for instance to create a view with multiple Ultra HD signals beyond the decoding capacity of a single system. MCC provides a single point of control for a network of multiviewers, giving instant reconfiguration of multiple displays, either in response to the intuitive user interface or on command from a system-level automation system.

“Multiviewers are mission critical to our playout centres,” said Robert Luggar, Head of Broadcast Media Solutions and Platforms at Red Bee. “We need to monitor around 450 streams, so we need scale and flexibility. Today those streams are SDI, and Rohde & Schwarz demonstrated that the PRISMON configuration could handle them but, more importantly, could easily be switched to IP in the future.”

Andreas Loges, Vice President Media Technologies at Rohde & Schwarz, commented, “We competed fiercely for this contract against a number of other vendors. We won because we could deliver a drop-in solution, a direct replacement for the life-expired multiviewer, and because our software-defined approach is inherently a hybrid platform, supporting SDI and IP signals, in both SMPTE ST 2022-6 and ST 2110.

“Broadcasters around the world are facing the same challenge,” Loges added. “Existing multiviewers are reaching the ends of their lives and were built solely around SDI. Because of this hybrid capability, and the power of SDM, PRISMON is well positioned for many more large-scale playout centres.”

The contract was awarded in December 2020 and, despite the challenges of social distancing and staff protection, the system was installed, commissioned and put into use in May 2021.

Mo-Sys™ Engineering, world leader in virtual production and image robotics, has supplied the brand-new Mo-Sys VP Pro XR extended reality server as well as StarTracker camera tracking to ARRI for its new studio in Uxbridge, west of London, making ARRI the first user of this innovative technology.

The new facility is one of the largest permanent LED volume studios in Europe. ARRI chose the Mo-Sys technology, which is designed specifically for real-time final-pixel shooting, to deliver cinematic-quality imagery and precise 3D camera tracking, allowing directors to see the combined real and virtual images during filming and eliminating lengthy post-production compositing.

The new ARRI studio offers 708 square metres of floor space, bounded by 343 square metres of LED wall, including curves and a ceiling. The LEDs extend 360° around the studio, allowing the light from the virtual elements to fall naturally onto the real objects and actors, providing realistic soft-edge lighting effects.

The facilities will also make full use of the latest VP Pro XR feature, Cinematic XR Focus, a unique and intuitive solution which allows focus pullers to pull focus between the real and virtual environments, using the same lens focussing controllers they are used to.

The LED volume was designed and installed by Creative Technology in collaboration with ARRI. Head of technical services at Creative Technology Tom Burford said, “We are thrilled to showcase this exciting new solution, bringing together best-in-class technology. This is a space designed specifically for mixed reality productions of all kinds, carefully considered to produce the most flexible shooting environment possible.”

Jannie van Wyk, Managing Director of ARRI Rental, which manages bookings for the studio, added: “Mo-Sys’ commitment to ongoing development and integration with ARRI camera systems was the reason for our decision to use the Mo-Sys StarTracker and VP Pro XR system for our mixed reality studio at Uxbridge."

“Producers love the concept of final pixel shooting because of the creative freedom it gives directors plus the time and cost saved during post-production,” said Michael Geissler, CEO of Mo-Sys. “In the past, though, there have been concerns about image quality, shooting options, and workflow. We are solving these issues with VP Pro XR and StarTracker, in a way that fits naturally into today’s production workflows without adding complications or time penalties.

“We are delighted to have provided this ground-breaking solution, and this new ARRI studio is a world-class showcase for what can be achieved,” he added.

The UK ARRI Studio is now open for bookings.

Mo-Sys VP Pro XR is available now.

Pixel Power, the global leader in graphics and software defined master control playout, has announced that it has joined the SRT Alliance, the open-source initiative dedicated to overcoming the challenges of low-latency video streaming. The move will support Pixel Power’s flexible solutions for master control playout with fully integrated graphics and branding.

The SRT Alliance, founded by Haivision in April 2017, already has more than 500 members. Its mission is to overcome the challenges of low-latency live streaming by supporting the collaborative development of SRT (Secure Reliable Transport), the fastest growing open source streaming project. SRT is a free open source video transport protocol and technology stack originally developed and pioneered by Haivision, which enables the delivery of high-quality and secure, low-latency video across the public internet.

“We have fully integrated Open SRT into our StreamMaster integrated playout software,” said James Gilbert, CEO of Pixel Power. “SRT does what its name suggests – it provides a secure way to transport content over the public internet, minimising the effects of packet loss and other impacts on the circuit. That is why it has rapidly become a mainstay of online connectivity for professional video. As a codec-agnostic standard, it gives Pixel Power complete flexibility to match our solution to the requirements.”
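
For readers unfamiliar with the protocol, the minimal sender (“caller”) sketch below uses the open-source libsrt C API to open an SRT connection and push a single live-mode datagram. The destination address, port and latency value are assumptions chosen for illustration; this is a generic libsrt example, not Pixel Power’s StreamMaster integration.

```c
/* Minimal SRT caller using the open-source libsrt C API. */
#include <string.h>
#include <arpa/inet.h>
#include <srt/srt.h>

int main(void)
{
    srt_startup();

    SRTSOCKET sock = srt_create_socket();
    int latency_ms = 120;                        /* receiver latency budget in ms */
    srt_setsockflag(sock, SRTO_LATENCY, &latency_ms, sizeof latency_ms);

    struct sockaddr_in sa;
    memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port   = htons(9000);                 /* assumed listener port */
    inet_pton(AF_INET, "203.0.113.10", &sa.sin_addr);   /* example address */

    if (srt_connect(sock, (struct sockaddr *)&sa, sizeof sa) == SRT_ERROR) {
        srt_cleanup();
        return 1;
    }

    /* In live mode each send is one datagram, typically up to 1316 bytes
       (seven 188-byte MPEG-TS packets); a playout engine would loop here. */
    char payload[1316] = {0};
    srt_send(sock, payload, sizeof payload);

    srt_close(sock);
    srt_cleanup();
    return 0;
}
```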

StreamMaster is a suite of software products that allow a user to build master control playout and automation solutions to their precise needs. Expanding on Pixel Power’s 30-year history in premium quality broadcast graphics and branding, it includes pre-packaged offerings for specific tasks like branding and promos, or allows the user to configure up to a complete channel playout system. StreamMaster is software-defined and can be deployed on-premise, in a data centre or in the public cloud, using the same software, same licence and with identical performance.

“By joining the SRT Alliance, Pixel Power is part of an industry movement to improve the way the world streams video,” said Suso Carrillo, Director of the SRT Alliance for Haivision. “This provides active support to the movement for the world’s biggest broadcasters and enterprise streaming workflows, recognising it as it becomes the de facto standard for low-latency internet streaming.”
