Mo-Sys™ Engineering is introducing on-set virtual production services for the Los Angeles market. Mo-Sys On-set VP Services provides an outsourced solution for production companies new to virtual production.

This is a unique Mo-Sys concept designed to empower Cinematographers and Directors to focus solely on imaging and storytelling, without the time drain of organising their own virtual production workflow or needing advanced knowledge of the latest technology.

A key element of the service is that all bookings are supported by experienced on-set virtual production technicians who remain with the system for the duration of the shoot. Several of these technicians joined the LA On-set VP team earlier this year and have since received intensive product training from Mo-Sys specialists. Additional LA team members and a further roll-out of the service to other cities, such as London, are planned for the near future.

The news follows the recent launch of Mo-Sys VP Pro XR, a new XR server solution for LED volumes that meets the demands of final pixel XR production for film and TV. Mo-Sys VP Pro XR is a hardware and software solution combining multi-node nDisplay architecture, a real-time VP Pro compositor/synchroniser and a new Cinematic XR toolset containing unique features such as Cinematic XR Focus.

“Mo-Sys’ new On-set VP Services enable production companies to shoot whilst learning the techniques and processes of virtual production, until they’re comfortable doing it themselves. We are boosting access to advanced production capabilities, and expanding the knowledge pool. We want to help our clients try new things with virtual production, irrespective of the screen technology, workflow or type of virtual production chosen.”

Mo-Sys On-set VP Services are available now; full details are available here.

Mo-Sys™ Engineering, world leader in virtual production and image robotics, has released a new multi-node media server solution for LED volumes to meet the demands of final pixel XR production for film and TV.

With virtual production growing rapidly, particularly in the LED volume space, Mo-Sys has drawn on its 20+ years of film and broadcast technology innovation and experience to create the next step forward for XR media servers.

Mo-Sys VP Pro XR is a hardware and software solution combining multi-node nDisplay architecture, an improved VP Pro real-time compositor/synchroniser, and a new XR toolset. This XR media server system is focused on delivering cinematic capabilities and standards for Cinematographers and Focus Pullers.

The launch of VP Pro XR follows the recently announced Mo-Sys Cinematic XR Focus capability. Cinematic XR Focus allows focus pullers to pull focus between real and virtual elements in an LED volume – a world first – and is now available on VP Pro XR.

The XR space to date has been predominantly driven by live event equipment companies. Whilst these XR volumes have delivered cost savings by removing post-production compositing and lowering location costs, they have also introduced shooting limitations, and an output image that isn’t yet comparable to non-real-time compositing. This was the reason behind the creation of VP Pro XR.

VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative. The overriding aim of Cinematic XR is to move final pixel XR production forward in image quality and shooting creativity, from its initial roots in live event LED technology to fit-for-purpose Cinematic XR technology.

Mo-Sys has outlined four key components to Cinematic XR:

  • Improve image fidelity
  • Introduce established cinematic shooting techniques to XR
  • Enable seamless interaction between virtual and real set elements
  • Pioneer new hybrid workflows combining final pixel and non-real-time compositing

Michael Geissler, Mo-Sys CEO explains, “With our twenty-year background in film robotics, we regularly get to hear what cinematographers and producers think and need. Whilst producers love the final pixel XR concept, cinematographers worry about image quality, colour pipeline, mixed lighting, and shooting freedom. We started Cinematic XR in response to this, and VP Pro XR is specifically designed to solve the urgent problems we have been made aware of by our cinematographer and focus puller colleagues.”

Mo-Sys will announce the initial VP Pro XR customers in June. VP Pro XR is available immediately.

Rohde & Schwarz, a global leader in broadcast media technologies, has extended the functionality of its R&S®PRISMON multiviewers with Multiviewer Control Centre (MCC), further increasing the power and productivity of the platform. MCC enables a single point of control for a network of multiviewers, which can be automated, giving instant reconfiguration of multiple displays.

PRISMON is a software-defined multiviewer architecture, designed specifically with IP studio production and playout systems in mind. It fully supports Ultra HD signals, both for displays and for streams within mosaics. The unique, scalable, distributed multiviewer functionality enables “any input to any output” connectivity through IP proxy networks. This also enables PRISMON to share resources across a network, allowing users to set up a view with multiple Ultra HD inputs, beyond the decoding capacity of a single system.

Rohde & Schwarz has now added the ability to control the complete network of multiviewers, along with associated decoders and encoders, from a single point. In environments where studios and control rooms change roles regularly, such as news or live sports, MCC means that all the monitoring can be changed either with one touch on a central controller or triggered by studio automation.

Underlining the flexibility this gives to busy live production and delivery, MCC has already been implemented by one of Europe’s largest playout specialists. It will initially be used as part of the augmented facilities helping a major European broadcaster deliver a multi-national football competition in 2021.

“PRISMON is already a popular choice for multiviewers in major installations, because of the flexibility and cost efficiency it brings,” commented Andreas Loges, Vice President Media Technologies at Rohde & Schwarz. “Following conversations with our users, it became clear that they sought the ability to use that flexibility to allow facilities to be reconfigured quickly, accurately and easily. MCC is a direct result of those discussions.”

MCC is an option for the PRISMON system, subject to a separate licence, which allows users to configure their monitoring networks to meet their exact requirements. Version 1.0 of MCC was released in April 2021; Version 1.1 adds further functionality and will be made available in June 2021.

Mo-Sys Engineering, world leader in precision camera tracking solutions for virtual studios and augmented reality, provided support for Bluman Associates and milkit Studio for a ground-breaking series of idents, part of the ITV Creates programme, to mark Mental Health Awareness Week in May 2021.

Mo-Sys StarTracker was chosen to bring the vision of artist Mamimu (June Mineyama-Smithson) and neuroscientist Dr Tara Swart to life, as a series of channel idents which “combine art and science to create ultimate optimism, inducing happy hormones in your brain,” according to the artist.

The creative idea they chose was to make a physical model of the familiar ITV logo in a mirrored material and sit it in an augmented reality studio where bold, colourful abstract graphics would interact with it. Studio Owner Pod Bluman recognised that central to the success of the creative vision would be absolutely perfect registration between the real and the virtual objects, while giving the camera complete freedom to move.

The shoot was at milkit Studio in north London, a dedicated mixed reality facility with an LED shooting volume – two walls and a floor of high-density, 4K-resolution panels.

“The idea of abstract imagery reflecting in the real, very shiny logo would only work if they stayed in perfect registration,” said Ben Tilbrook of Mo-Sys, who provided support on the project. “The camera was mounted on a jib, so it was moving freely around the logo. The Mo-Sys StarTracker is designed for just this sort of requirement – it gives the director and cinematographer complete freedom while ensuring the graphics computer is updated with positional information in real time.”

Mo-Sys StarTracker is widely regarded as the leader in camera tracking technologies, using a random pattern of reflective dots – “stars” – on the studio ceiling. Once mapped, which takes just a few moments, any StarTracker-equipped camera can be precisely located in three-dimensional space, along with its pan, tilt and roll. With the addition of digital lens data, real elements can be placed into the virtual environment. It allows augmented reality sequences like this to be shot live, obviating the need to composite the layers in post-production.
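To make that data flow concrete, here is a minimal sketch of what a real-time tracking consumer might look like: a per-frame camera pose (position, pan/tilt/roll and lens data) arrives over the network and is pushed to the virtual camera in the graphics engine. The packet layout, port and callback are illustrative assumptions, not the actual StarTracker protocol.

```python
# Hypothetical sketch only: the packet layout, port and callback below are
# illustrative inventions, not the actual Mo-Sys StarTracker protocol.
import socket
import struct
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float          # position in metres
    y: float
    z: float
    pan: float        # orientation in degrees
    tilt: float
    roll: float
    focus: float      # normalised lens data
    zoom: float

PACKET = struct.Struct("<8f")  # 8 little-endian floats per frame (assumed)

def update_virtual_camera(pose: CameraPose) -> None:
    # Stand-in for an engine-specific call (e.g. setting a virtual camera
    # transform in the graphics computer); here we just log the pose.
    print(pose)

def listen(port: int = 8001) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(PACKET.size)   # one pose per frame
        update_virtual_camera(CameraPose(*PACKET.unpack(data)))
```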

“This was the first time we had used camera tracking on a project like this,” said Pod Bluman of Bluman Associates. “It performed admirably, giving the precision we needed in the shoot, reliably and without fuss or problems.”

Commenting on the finished sequences, Dr Tara Swart said, “Mamimu and I had conversations about how the brain works, and about neurotransmitters that are related to happiness and optimism. I was just blown away by what she created – it literally made me happy to see it.”

Broadcast specialist nxtedition has formed a partnership with Singular.live, the cloud-native graphics platform. Together they enable users to add dynamic, responsive graphics and overlays to playout timelines in an HD-SDI/IP environment, greatly improving operational convenience as well as connecting information and engagement into the live output.

Singular enables users to create and control broadcast graphics through its Intelligent Overlay platform. Cloud-native, it allows users to create custom designs and templates from a standard web browser. Completed overlays can be dropped into a nxtedition timeline by simply copying and pasting a URL.

nxtedition converted the open-source CasparCG software system to use HTML5 graphics templates early in its roadmap, so out of the box Singular interfaces directly with CasparCG, allowing real-time key and fill graphics output to the vision mixer. The HTML templates Singular generates for a playlist are also cached within the nxtedition servers for speed and resilience. The Singular API also makes template metadata available to nxtedition’s highly capable NRCS and automation functionality, where templates can be previewed, populated with content and played out live on-air.
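As a rough sketch of the caching idea described above, the snippet below fetches an overlay’s template metadata once and keeps a local copy, so playout does not depend on a live cloud round-trip. The endpoint and JSON shape are hypothetical stand-ins, not the real Singular.live API.

```python
# Illustrative only: the endpoint and JSON shape are hypothetical
# stand-ins, not the real Singular.live API.
import json
import urllib.request
from pathlib import Path

CACHE_DIR = Path("overlay_cache")

def cached_template(overlay_url: str, name: str) -> dict:
    """Fetch template metadata once, then serve it from a local copy."""
    CACHE_DIR.mkdir(exist_ok=True)
    cache_file = CACHE_DIR / f"{name}.json"
    if cache_file.exists():
        return json.loads(cache_file.read_text())      # fast, offline-safe
    with urllib.request.urlopen(overlay_url) as resp:  # first and only fetch
        meta = json.load(resp)
    cache_file.write_text(json.dumps(meta))            # persist for resilience
    return meta
```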

“We developed Singular because we know broadcast graphics, and we know what users really need – flexibility, scalability, ease of use and of course, cost,” said Mike Ward, head of marketing at Singular.live. “We are entirely focused on delivering graphics using Intelligent Overlays, so a partnership with nxtedition allows us to serve the demanding newsroom market. This collaboration is a great example of two companies coming together, pooling their strengths and delivering a really compelling solution.”

Roger Persson, head of sales and marketing at nxtedition, added, “The alignment between the two companies is remarkable. We have the same culture in software development, and we have the same goal for our customers of making it easier for them to tell their stories. It is this common language that makes this partnership so exciting.”

Singular embedded in an nxtedition production environment is already in use at a Swiss broadcaster, and the integration will be included in the 19.4 update for all existing customers. Because Singular Intelligent Overlays are cloud-based and delivered at the time of use, it is easy to set up different graphics formats for varied outputs, or even have the graphics rendered on the user’s device, allowing for personalised engagement.

Hitomi Broadcast, manufacturer of MatchBox, the industry’s premier audio video alignment toolbox, has announced that Dock10, the largest studio complex in the UK, has invested in the Hitomi MatchBox system to help its Virtual Studios crew align audio and video outputs efficiently and deliver high-quality programmes without complex line-ups.

Dock10 has recently added extensive virtual production capabilities to its Salford centre. To ensure perfect synchronisation between audio and video circuits subject to different amounts of processing, Dock10 has now turned to the lip sync experts Hitomi and invested in the Hitomi MatchBox Generator, Analyser and licences to use the Glass iPhone app.

Virtual production is rapidly gaining ground and fits into many of Dock10’s key areas, like sports, magazine programmes and children’s television. For this reason, Dock10 has installed virtual production capabilities in every one of its studios. The challenge is that marrying live and computer-generated images generally results in a delay of a few frames, putting the audio ahead of the video. Sound leading the picture is particularly disturbing for audiences as it cannot occur in “real life”.

The Hitomi MatchBox system is a toolkit designed to streamline live broadcast synchronisation. MatchBox Generator creates a unique signal, including video and up to 16 audio channels. The complementary MatchBox Analyser compares the video and audio and determines precisely the delays in each path. For quick on-set tests, MatchBox Glass uses the ubiquitous iPhone or iPad to generate the test signal, with the phone simply held in front of the camera to be tested.
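As a generic illustration of the underlying measurement problem – not Hitomi’s algorithm – the offset between two paths carrying the same test pattern can be estimated by cross-correlation:

```python
# Generic cross-correlation delay estimate, NOT Hitomi's algorithm.
import numpy as np

def estimate_delay(reference: np.ndarray, delayed: np.ndarray, rate: float) -> float:
    """Delay of `delayed` relative to `reference`, in seconds."""
    corr = np.correlate(delayed, reference, mode="full")
    lag = corr.argmax() - (len(reference) - 1)   # offset in samples
    return lag / rate

# Example: the video path lags the audio path by 120 samples (2.5 ms at 48 kHz).
rng = np.random.default_rng(0)
pattern = rng.standard_normal(48000)                 # stand-in test signal
audio_path = pattern
video_path = np.concatenate([np.zeros(120), pattern])[:48000]
print(estimate_delay(audio_path, video_path, rate=48000.0))  # ~0.0025
```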

“We looked at MatchBox at IBC, then we borrowed the kit from our friends at Timeline Television so we could give it a full workout,” said Michael Lodmore, Duty Technology Manager, Dock10. “Compensating for video delays through the various processing engines can be time-consuming and not very satisfactory. We found MatchBox did just what we needed, saving our crews a lot of time to get the audio and video outputs in sync. Not only does it measure by how much it is wrong, it also gives us reassurance when right.

“The future of studio production is going to rely increasingly on virtual reality,” Lodmore added. “It allows us to create bigger, brighter environments through virtual set extensions. Producers love it because they can create looks which make their programmes stand out from the crowd and enhance the format. MatchBox means we can get on with delivering that quality without waiting for complex line-ups.”

Russell Johnson, director of Hitomi Broadcast, added “It is not an either/or situation – you do not make a choice between a completely virtual set or a completely physical set. In a typical production, some sets – and some cameras – will not have synchronised graphics, some will. Producers and directors will be making decisions on how the output of each camera will be processed as it happens, so the ability to pull back lip sync as you need it is a huge benefit. To be able to simply hold an iPhone or iPad running the MatchBox Glass app in front of a camera for synchronisation is a huge time-saver.”

Rohde & Schwarz, a global leader in broadcast media technologies, has extended the functionality of its R&S®VENICE platform for live studio production. In particular, it now supports 4K Ultra HD signals, using either 12G or 2SI connectivity.

VENICE is a high-performance, high-resilience media platform designed to manage complex signal processing and storage requirements, particularly in live studio production applications. It provides uninterrupted operations including scheduled recording, clip transforms and playouts, across a network of processors and both Rohde & Schwarz storage and third-party qualified storage sub-systems. It also bridges the SDI and IP environments. VENICE servers can now be configured for multiple HD and Ultra HD signals, and units can work together as a single resource from a single user interface.

The high bandwidth of Ultra HD signals demands special handling, beyond the traditional 3G environment. Ultra HD infrastructures today are typically built around either 12G – a single, very high-bandwidth connection which has some technical limitations – or 2SI, which spreads the signal pixel-by-pixel across four conventional SDI cables. 2SI also provides a degree of resilience, as the loss of one channel simply means a drop in resolution, not a complete loss of all or part of the picture.
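A simplified sketch of the interleave idea shows why: each link carries a quarter-resolution version of the whole picture, rather than one quadrant of it. (Real 2SI interleaves two-sample pairs; single samples are used here for clarity.)

```python
# Single-sample interleave for clarity; real 2SI interleaves sample pairs.
import numpy as np

def split_2si(frame: np.ndarray) -> list:
    # Sub-image (i, j) takes every second row and column, offset by (i, j),
    # so each one is a quarter-resolution view of the whole picture.
    return [frame[i::2, j::2] for i in range(2) for j in range(2)]

def merge_2si(subs: list) -> np.ndarray:
    h, w = subs[0].shape
    frame = np.empty((h * 2, w * 2), dtype=subs[0].dtype)
    for sub, (i, j) in zip(subs, [(0, 0), (0, 1), (1, 0), (1, 1)]):
        frame[i::2, j::2] = sub
    return frame

uhd = np.arange(2160 * 3840).reshape(2160, 3840)
links = split_2si(uhd)                    # four 1080x1920 streams
assert (merge_2si(links) == uhd).all()    # lossless when all links arrive
# Losing one link leaves three interleaved quarter-resolution images:
# the whole picture survives at reduced detail, not with a hole in it.
```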

The Rohde & Schwarz VENICE media platform now supports both 12G and 2SI, enabling it to fit comfortably into any environment. Because each VENICE processor is software defined, a channel can easily switch between an Ultra HD signal and HD signals. With VENICE sharing resources across a network from a single management user interface, it means resources can be allocated for maximum efficiency and utilisation.

The ability to achieve high performance by making efficient use of the available hardware is one of the chief benefits of the VENICE architecture. It allows resources to be shared, for instance through a method called Transform Channel Assignment, which allows a source to be recorded, viewed and processed – trimmed, merged or edited, for example – simultaneously, while making optimum use of available resources. Using Record Scheduler, VENICE looks ahead to ensure there are sufficient channels for planned recordings while serving current requests for access.
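The look-ahead idea can be sketched in a few lines. This is a minimal illustration under stated assumptions, not Rohde & Schwarz’s implementation: an ad-hoc request is granted only if peak concurrent demand, counting every planned recording, stays within the available channel count.

```python
# Minimal look-ahead reservation check, not Rohde & Schwarz's implementation.
from dataclasses import dataclass

@dataclass
class Booking:
    start: float   # seconds from now
    end: float

def peak_demand(bookings: list) -> int:
    # Sweep start/end events in time order to find the maximum overlap.
    events = [(b.start, 1) for b in bookings] + [(b.end, -1) for b in bookings]
    demand = peak = 0
    for _, delta in sorted(events):    # ends sort before starts on ties
        demand += delta
        peak = max(peak, demand)
    return peak

def can_grant(request: Booking, planned: list, channels: int) -> bool:
    return peak_demand(planned + [request]) <= channels

planned = [Booking(0, 3600), Booking(1800, 5400)]        # two scheduled recordings
print(can_grant(Booking(0, 7200), planned, channels=2))  # False: 3 would overlap
```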

“With VENICE, we set out to offer a platform for excellence in mission-critical broadcast applications like playout,” commented Andreas Loges, Vice President Media Technologies at Rohde & Schwarz. “Central to that is the ability to be flexible, to meet the workflows our users demand, and to provide a seamless opportunity to handle Ultra HD signals as they are required.”

Transform Channel Assignment and Record Scheduler are already available for all VENICE installations. Support for 12G and 2SI is available to order now.

Robert Nagy, lead developer and co-founder of innovative broadcast company nxtedition, has been appointed to the Technical Steering Committee of Node.js.

Node.js is seen as a key building block for scalable network applications, with major companies across the spectrum – including Microsoft, Netflix and PayPal – relying on the powerful open-source runtime environment.

The Technical Steering Committee is the primary controlling body of Node.js, overseeing the technical direction of the project. With so many developers in many industries worldwide relying on the technology, ensuring continuity, security and continuing improvement is a major challenge.

“The members of the TSC come from major IT players around the world,” said Roger Persson, head of sales and marketing at nxtedition. “At nxtedition we are in awe of the technical skills and understanding that Robert brings to our products, and we are proud that he has been recognised as a leader on the wider stage.

“At nxtedition we are committed to open-source development, which gives us speed and agility,” Persson added. “Robert’s appointment as one of the guiding lights of Node.js also means that the specific requirements of media will be at the forefront of future developments of the project.”

Robert was just 17 when he was head-hunted by Swedish national broadcaster SVT to become lead developer on CasparCG, the software-defined broadcast graphics platform. Once it was operational, SVT took the decision to make CasparCG open source and available to all.

While developing solutions based on CasparCG, he met the team that was in the process of forming nxtedition. He took charge of the development programme, making the decision – radical at the time – that the whole development should be in JavaScript, thereby positioning nxtedition to ride the crest of every development in web technology.

The decision to develop exclusively in JavaScript allowed nxtedition to lead the media industry with its microservices-based virtualisation. It also established Robert as a pioneer and innovator in JavaScript, now recognised by his appointment to the Node.js Technical Steering Committee.

• Investment will enable iSIZE to accelerate its traction and to continue strengthening its technical team and patent portfolio

• iSIZE has already secured licensing agreements with leading technology and streaming companies

iSIZE, a deep-tech company that applies deep learning to optimize video streaming and delivery, today announces that it has raised a further $6.3 million in funding as it seeks to make streaming more environmentally friendly without reducing quality.

The round was led by Octopus Ventures, with participation from existing investors including TD Veen and Patrick Pichette, Chairman of Twitter and ex-CFO of Google. This brings the total funding raised by the company to $8.2 million.

The amount of video streamed over the internet is at an all-time high, a trend which has been accelerated by the pandemic and the shift to working from home. At the same time, streaming and content companies are facing pressure from users and advertisers to deliver ever-increasing video quality. With forecasts projecting video to reach 82% of total global internet traffic by 2022, there is also growing awareness of its carbon footprint, with research indicating that it already contributes more than 1% of global emissions.

As a result, streaming and content providers are increasingly turning to technology to address the challenge of delivering a reliable and high-quality experience while managing the financial and environmental costs of doing so.

To help solve this problem, iSIZE has pioneered deep-learning solutions that optimize video streaming quality while reducing bitrate requirements, allowing for a significant reduction in data and energy consumption.

The potential impact of its technology is huge, and iSIZE has already attracted attention from some of the world’s largest technology companies, to whom it has licensed its BitSave technology.

Headquartered in London, iSIZE was founded by Sergio Grce and Dr. Yiannis Andreopoulos who saw an opportunity to tackle the challenges caused by the explosion of video streaming. The founding team combines many years of research in machine learning, neural networks and video signal processing, evidenced by dozens of research publications. The company is also a graduate of the Creative Destruction Lab Oxford 2019-2020 programme where it received advice and investment from expert mentors.

iSIZE intends to use the funding raised to accelerate its traction in the U.S. and to further strengthen its technical team and patent portfolio, continuing to improve the results and innovations it delivers to its customers.

Sergio Grce, Founder and CEO of iSIZE, commented: “Today there are more people streaming more video than ever before. Our customers recognize both the commercial opportunity and their social responsibility to optimize their video delivery pipelines with our pioneering technology. We are excited to partner with Octopus Ventures to tap into their network and expertise in building world-changing companies.”

Simon King, Partner and deep tech investor at Octopus Ventures, said: “The technology iSIZE has created is pioneering and is already being used by some of the world’s largest companies to reduce the costs and energy used in streaming. Consumer demand for high quality video is only going to increase as our devices are upgraded, so it’s vital that we find new ways to reduce the environmental impact. We are very familiar with this space having been an investor in Magic Pony and Sergio is one of those visionary founders who we believe can build something truly special.”

iSIZE’s leading product is a proprietary deep perceptual optimizer, trained to ‘see with the human eye’ in order to optimize video quality and deliver significant bitrate savings. Its technology has applications across VoD, live streaming, gaming and IoT, and bolts on to the existing conventional video delivery pipeline while integrating with all video encoding standards (including AVC, HEVC and AV1) – all without requiring changes to the streaming process or to end-users’ devices. This allows its customers to improve the end-user experience and reduce costs without breaking standards and with minimal deployment risk.

Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual production, has now integrated its virtual production software – Mo-Sys VP Pro – with the Sony Venice camera. By capturing dynamic camera settings data direct from the Sony Venice, the integration simplifies and speeds up virtual production workflows.

The Sony Venice full-frame digital cinematography camera is rapidly gaining popularity, not least for the large number of software-controlled resolution and aspect ratio settings available. Mo-Sys VP Pro links directly to the camera software, monitoring every take and shot to highlight any mismatched settings, whilst capturing the Venice camera data to assist downstream post-production compositing.

The integration means that Mo-Sys VP Pro goes into record mode whenever the Sony Venice camera starts recording. VP Pro captures the naming format the camera uses for its media files and gives the metadata file containing the camera settings data the same name, ensuring that matching the two together in post-production is simple.
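The payoff of the shared naming is that conforming in post becomes a simple join on file stems, as in this sketch (the extensions are illustrative, not the actual Venice media or VP Pro metadata formats):

```python
# Extensions are illustrative, not the actual Venice or VP Pro formats.
from pathlib import Path

def pair_takes(media_dir: Path, meta_dir: Path) -> dict:
    """Match each media file to its metadata sidecar by shared base name."""
    sidecars = {p.stem: p for p in meta_dir.glob("*.xml")}
    pairs = {}
    for clip in media_dir.glob("*.mxf"):
        if clip.stem in sidecars:            # same name, different folder
            pairs[clip] = sidecars[clip.stem]
        else:
            print(f"no camera metadata found for {clip.name}")
    return pairs
```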

Mo-Sys VP Pro is now the most versatile and up-to-date virtual production solution for film, TV, game cinematics and live broadcast. Mo-Sys VP Pro is integrated with Unreal Engine and supports either live or recorded virtual production workflows, using either green/blue screen studios or LED volumes. It provides compositors with additional lens data – like F-stop, T-stop and shutter angle – making VFX compositing much easier.

“Virtual production is a powerful creative tool, for film as well as for television,” said Michael Geissler, CEO of Mo-Sys. “It is a complex business, though, and smart systems that speed synchronization and set-up in post-production give a huge boost to productivity. Once more, Mo-Sys is leading the industry with intelligent, practical integrations, whilst also supplying camera tracking solutions with the highest levels of precision and data resilience available on the market today.”

Mo-Sys recently announced the Cinematic XR Focus feature for Mo-Sys VP Pro, allowing camera operators to pull focus between real and virtual elements within an LED volume. This type of focus pull was previously impossible; with a Preston lens controller, Mo-Sys StarTracker and Mo-Sys VP Pro, it is now easily achievable.

The Sony Venice integration is available immediately, and all Mo-Sys VP Pro subscription customers have already received this new capability. To book a free trial of Mo-Sys VP Pro, contact .
