Mo-Sys VP Pro 4.27 to release simultaneously with Epic Games’ release of Unreal Engine 4.27
Mo-Sys™ Engineering (www.mo-sys.com), a world leader in virtual production and image robotics, today announces the launch of Mo-Sys VP Pro 4.27 with a raft of new features that will support Epic Games’ Unreal Engine 4.27. During the preview period of Unreal Engine 4.27, Mo-Sys has used the new update to complete a full multi-camera shoot with Amazon. The update was also put through a comprehensive, multi-camera technical rehearsal for a major Netflix production.
Mo-Sys has been working on expanding the feature set and capabilities of VP Pro, which runs inside Unreal Engine Editor, for some time now and will launch the new version on the same day as Epic Games’ Unreal Engine 4.27. The VP Pro 4.27 upgrade brings four key new features: an improved compositor, NearTime® rendering, an online lens distortion library, and remote control capability.
Michael Geissler, CEO of Mo-Sys Engineering Ltd., said, “Virtual production is seeing a real surge, but it does bring challenges. Typically, directors and cinematographers must make continuous calculations to ensure the real and virtual elements match in post-production. This is often a costly and repetitive process. Mo-Sys is a pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher quality virtual productions. With VP Pro 4.27 we have made it even easier to produce seamless, high-end productions, and we are also unique in making our 4.27 update available on the same day as Epic Games launches Unreal Engine 4.27.”
Mo-Sys has completely overhauled its internal compositing system, changing the way compositing is handled across all modes. The latest update of VP Pro now provides improved support for high-end graphics features, such as refraction, and delivers a 15% performance improvement. Other features include:
· Support for reflection and refraction of video in CG objects
· Improved support for advanced ray-tracing features
· Support for fur and groom
· Advanced controls for CG shadows falling onto video
The updated VP Pro 4.27 also widens the Beta program for the new NearTime® rendering workflow to give access to more users. NearTime is an automated, cloud-based re-render that dramatically increases and homogenises visual quality, and allows for higher resolutions (UHD, 4K, 8K) without the performance restrictions of real-time rendering. The moment a shot is complete, a high-quality, high-resolution, cloud-optimized NearTime render starts, with no compromise on performance or visual quality. With NearTime, you can “turn up all the dials” for quality unrestricted by real-time constraints, at a fraction of the cost of completing the process in post-production.
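The NearTime idea (record the tracked camera move live, then automatically replay the same take offline with quality settings turned all the way up) can be illustrated with a minimal sketch. Every class, function and parameter name below is hypothetical, invented for illustration, and not part of the VP Pro API:

```python
from dataclasses import dataclass

@dataclass
class FrameSample:
    """One frame of recorded camera tracking data (hypothetical layout)."""
    timecode: str
    position: tuple   # x, y, z in metres
    rotation: tuple   # pan, tilt, roll in degrees
    focal_length: float

def render(sample: FrameSample, quality: str) -> str:
    """Stand-in for a render call: 'realtime' caps quality to keep up
    with the camera; 'neartime' re-renders the identical move offline."""
    return f"{sample.timecode}:{quality}"

def shoot_and_rerender(samples: list[FrameSample]) -> list[str]:
    # Live pass: must keep pace with the camera, so quality is capped.
    live = [render(s, "realtime") for s in samples]
    # NearTime pass: the same tracking data replayed without the
    # real-time constraint (in practice, on cloud render nodes).
    final = [render(s, "neartime") for s in samples]
    assert len(live) == len(final)  # frame-accurate re-render
    return final

take = [FrameSample("01:00:00:%02d" % f, (0, 1.5, f * 0.1), (0, 0, 0), 35.0)
        for f in range(3)]
print(shoot_and_rerender(take))
```

The key point the sketch makes is that the offline pass consumes exactly the recorded per-frame tracking data, so the re-render is frame-accurate to what was shot.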
As part of the 4.27 update, Mo-Sys is also launching its online lens distortion library giving users access to a wide selection of calibration tools and allowing them to tweak their lenses on-set in a highly cost-effective way.
New for VP Pro 4.27, the VP Remote iPad interface is fully customizable and supports control of multiple engines, and camera chains, from a single control panel. The Beta version was trialled at the technical rehearsal with Netflix.
Mo-Sys™ Engineering recently launched its new Onset VP Services office in Los Angeles to provide an outsourced solution for production companies new to virtual production. This has enabled US creative and production specialists Papertown to quickly and accurately create an ambitious virtual production project featuring a passenger plane in a hangar.
Papertown is an experienced creative production agency with specialization in computer graphics (CGI). Its client, Business Made Simple, one of the most successful management coaching organizations in North America, had the concept of a management training course based around the analogy that every business should run like an airplane, so a plane in a hangar was chosen for the video concept.
Renting and preparing an aircraft for shooting inside a hangar would have been expensive and impractical, so Papertown proposed shooting the speaker against a green screen and creating the plane, hangar and other items in CGI. This is where Mo-Sys Onset VP Services came into play, bringing the virtual production expertise required.
Having just opened a branch in LA offering virtual production services to assist with this new way of filming, the Mo-Sys Onset VP Services team were able to help Papertown set up a virtual production studio to merge the CGI and real-world elements. This arrangement enabled Papertown to focus on imaging and storytelling without the time-drain of organizing their own virtual production workflow or needing advanced knowledge of the latest technology.
Mo-Sys, world leader in camera tracking, image robotics and virtual production solutions, created StarTracker to provide precision 6-axis camera tracking, enabling 3D movement in a virtual production scene. Combined with highly accurate real-time lens data, the full StarTracker data set via Mo-Sys VP Pro drives the Unreal Engine’s virtual graphics so they accurately emulate the real camera shot, delivering convincing virtual scenes.
Using StarTracker’s ability to lock the virtual graphics to the real world, not only could the cinematographer frame each shot easily, because the composite image was available in the viewfinder, but the production team was also able to record the finished output, eliminating the need for time-consuming post-production compositing. In fact, Papertown shot the entire three-hour video series, comprising dozens of terabytes of premium cinematic footage, in just two studio days.
“We had no experience of Mo-Sys products before this shoot, but we had all the support we needed to quickly put our creative to work,” said Papertown founder and Executive Creative Director Julian Smith. “Being able to use StarTracker, VP Pro and Unreal together extended an insane amount of value to our clients. It took our photo-real CGI to a whole new level. This is a game changer. Papertown is always pushing the boundaries with new ideas and technology and we are grateful to have partnered with a team like Mo-Sys that does the same.”
Michael Geissler, CEO of Mo-Sys, added “Whether for television, movies or corporate, producers are looking for the highest possible quality at the maximum productivity. Shooting final pixel like this is a tremendous boost. But it can only realistically be done with precision camera and lens tracking, real-time compositing, and a virtual production operator who is familiar with the Unreal Engine.”
StarTracker and VP Pro are available now. Mo-Sys has just released the VP Pro XR server solution specifically aimed at LED volumes.
Mo-Sys™ Engineering, world leader in virtual production and image robotics, has supplied the brand-new Mo-Sys VP Pro XR extended reality server as well as StarTracker camera tracking to ARRI for its new studio in Uxbridge, west of London, making them the first user of this innovative technology.
The new facility is one of the largest permanent LED volume studios in Europe. ARRI chose the Mo-Sys technology, which is designed specifically for real-time final pixel shooting, to deliver cinematic-quality imagery as well as precise 3D camera tracking, allowing directors to see the combined real and virtual images during filming and eliminating lengthy post-production compositing.
The new ARRI studio offers 708 square metres of floor space, bounded by 343 square metres of LED wall, including curves and a ceiling. The LEDs extend 360° around the studio, allowing the light from the virtual elements to fall naturally onto the real objects and actors, providing realistic soft edge lighting effects.
The facilities will also make full use of the latest VP Pro XR feature, Cinematic XR Focus, a unique and intuitive solution which allows focus pullers to pull focus between the real and virtual environments, using the same lens focussing controllers they are used to.
The LED volume was designed and installed by Creative Technology in collaboration with ARRI. Head of technical services at Creative Technology Tom Burford said, “We are thrilled to showcase this exciting new solution, bringing together best-in-class technology. This is a space designed specifically for mixed reality productions of all kinds, carefully considered to produce the most flexible shooting environment possible.”
Jannie van Wyk, Managing Director of ARRI Rental, which manages bookings for the studio, added: “Mo-Sys’ commitment to ongoing development and integration with ARRI camera systems was the reason for our decision to use the Mo-Sys StarTracker and VP Pro XR system for our mixed reality studio at Uxbridge.”
“Producers love the concept of final pixel shooting because of the creative freedom it gives directors plus the time and cost saved during post-production,” said Michael Geissler, CEO of Mo-Sys. “In the past, though, there have been concerns about image quality, shooting options, and workflow. We are solving these issues with VP Pro XR and StarTracker, in a way that fits naturally into today’s production workflows without adding complications or time penalties.
“We are delighted to have provided this ground-breaking solution, and this new ARRI studio is a world-class showcase for what can be achieved,” he added.
The UK ARRI Studio is now open for bookings.
Mo-Sys VP Pro XR is available now.
Mo-Sys™ Engineering is introducing on-set virtual production services for the Los Angeles market. Mo-Sys On-set VP Services provides an outsourced solution for production companies new to virtual production.
This is a unique Mo-Sys concept designed to empower Cinematographers and Directors who can now focus solely on imaging and storytelling without the time-drain of organising their own virtual production workflow or needing advanced knowledge of the latest technology.
A key element of the service is that all bookings are supported by experienced on-set virtual production technicians who remain with the system for the duration of the shoot. Several of these technicians joined the LA On-set VP team earlier this year, and have been on intensive product training with Mo-Sys specialists since then. Additional LA team members and further roll-out of the solution to other cities, such as London, are planned for the near future.
The news follows the recent announcement of the launch of Mo-Sys VP Pro XR, a new XR server solution for LED volumes, meeting the demands of final pixel XR production for film and TV. Mo-Sys VP Pro XR comprises a hardware and software solution combining multi-node nDisplay architecture, real-time VP Pro compositor/synchroniser and a new Cinematic XR toolset containing unique features such as Cinematic XR Focus.
“Mo-Sys’ new On-set VP Services enable production companies to shoot whilst learning the techniques and processes of virtual production, until they’re comfortable doing it themselves. We are boosting access to advanced production capabilities, and expanding the knowledge pool. We want to help our clients try new things with virtual production, irrespective of the screen technology, workflow or type of virtual production chosen.”
Mo-Sys On-set VP Services are available now.
Mo-Sys™ Engineering, world leader in virtual production and image robotics, has released a new multi-node media server solution for LED volumes to meet the demands of final pixel XR production for film and TV.
With virtual production increasing exponentially, in particular in the LED volume space, Mo-Sys utilised its 20+ years of film and broadcast technology innovation and experience to create the next step forward for XR media servers.
Mo-Sys VP Pro XR is a hardware and software solution combining multi-node nDisplay architecture, an improved VP Pro real-time compositor/synchroniser, and a new XR toolset. This XR media server system is focused on delivering cinematic capabilities and standards for Cinematographers and Focus Pullers.
The launch of VP Pro XR follows the recently announced Mo-Sys Cinematic XR Focus capability. Cinematic XR Focus allows focus pullers to pull focus between real and virtual elements in an LED volume – a world first – and is now available on VP Pro XR.
The XR space to date has been predominantly driven by live event equipment companies. Whilst these XR volumes have delivered cost savings by removing post-production compositing and lowering location costs, they have also introduced shooting limitations, and an output image that isn’t yet comparable to non-real-time compositing. This was the reason behind the creation of VP Pro XR.
VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative. The overriding aim of Cinematic XR is to move final pixel XR production forward, in terms of image quality and shooting creativity, from its initial roots using live event LED technology to fit-for-purpose Cinematic XR technology.
Mo-Sys has outlined four key components to Cinematic XR.
Michael Geissler, Mo-Sys CEO, explains, “With our twenty-year background in film robotics, we regularly get to hear what cinematographers and producers think and need. Whilst producers love the final pixel XR concept, cinematographers worry about image quality, colour pipeline, mixed lighting, and shooting freedom. We started Cinematic XR in response to this, and VP Pro XR is specifically designed to solve the urgent problems we have been made aware of by our cinematographer and focus puller colleagues.”
Mo-Sys will announce the initial VP Pro XR customers in June. VP Pro XR is available immediately.
Mo-Sys Engineering, world leader in precision camera tracking solutions for virtual studios and augmented reality, provided support for Bluman Associates and milkit Studio for a ground-breaking series of idents, part of the ITV Creates programme, to mark Mental Health Awareness Week in May 2021.
Mo-Sys StarTracker was chosen to bring the vision of artist Mamimu (June Mineyama-Smithson) and neuroscientist Dr Tara Swart to life, as a series of channel idents which “combine art and science to create ultimate optimism, inducing happy hormones in your brain,” according to the artist.
The creative idea they chose was to make a physical model of the familiar ITV logo in a mirrored material and sit it in an augmented reality studio where bold, colourful abstract graphics would interact with it. Studio Owner Pod Bluman recognised that central to the success of the creative vision would be absolutely perfect registration between the real and the virtual objects, while giving the camera complete freedom to move.
The shoot was at milkit Studio in north London, a dedicated mixed reality facility with an LED shooting volume – two walls and a floor in high-density, 4K resolution.
“The idea of abstract imagery reflecting in the real, very shiny logo would only work if they stayed in perfect registration,” said Ben Tilbrook of Mo-Sys, who provided support on the project. “The camera was mounted on a jib so was moving freely around the logo. The Mo-Sys StarTracker is designed for just this sort of requirement – it gives the director and cinematographer complete freedom while ensuring the graphics computer is updated with positional information in real time.”
Mo-Sys StarTracker is widely regarded as the leader in camera tracking technologies, using a random pattern of reflective dots – “stars” – on the studio ceiling. Once mapped, which takes just a few moments, any StarTracker-equipped camera can be precisely located in three-dimensional space, and for pan, tilt and roll. With the addition of digital lens data, real elements can be placed into the virtual environment. It allows augmented reality sequences like this to be shot live, obviating the need to composite the layers in post-production.
“This was the first time we had used camera tracking on a project like this,” said Pod Bluman of Bluman Associates. “It performed admirably, giving the precision we needed in the shoot, reliably and without fuss or problems.”
Commenting on the finished sequences, Dr Tara Swart said “Mamimu and I had conversations about how the brain works, and about neurotransmitters that are related to happiness and optimism. I was just blown away by what she created – it literally made me happy to see it.”
Hitomi Broadcast, manufacturer of MatchBox, the industry’s premier audio video alignment toolbox, has announced that Dock10, the largest studio complex in the UK, has invested in the Hitomi MatchBox system to help the Virtual Studios crew align their audio and video outputs efficiently and deliver high-quality programmes without complex line-ups.
Dock10 has recently added extensive virtual production capabilities to its Salford centre. To ensure perfect synchronisation between audio and video circuits subject to different amounts of processing, Dock10 has now turned to the lip sync experts Hitomi and invested in the Hitomi MatchBox Generator, Analyser and licences to use the Glass iPhone app.
Virtual production is rapidly gaining ground and fits into many of Dock10’s key areas, like sports, magazine programmes and children’s television. For this reason, Dock10 has installed virtual production capabilities in every one of its studios. The challenge is that marrying live and computer-generated images generally results in a delay of a few frames, putting the audio ahead of the video. Sound leading the picture is particularly disturbing for audiences as it cannot occur in “real life”.
The Hitomi MatchBox system is a toolkit designed to streamline live broadcast synchronisation. MatchBox Generator creates a unique signal, including video and up to 16 audio channels. The complementary MatchBox Analyser compares the video and audio and determines precisely the delays in each path. For quick, on-set tests, MatchBox Glass uses the ubiquitous iPhone or iPad to generate the test signal, with the phone simply held in front of the camera to be tested.
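The principle behind this kind of measurement can be sketched as a cross-correlation between the two paths: the lag at which the delayed signal best matches the reference is the delay to compensate. The following is a minimal pure-Python illustration of that principle, not Hitomi's actual algorithm:

```python
def estimate_delay(reference, delayed, max_lag):
    """Return the lag (in frames or samples) at which `delayed` best
    matches `reference`, by brute-force cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        # Correlate reference[i] against delayed[i + lag]
        score = sum(a * b for a, b in zip(reference, delayed[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A distinctive test pattern (standing in for a generated test signal),
# then the same pattern arriving 3 frames late through the video path.
pattern = [0, 1, 0, -1, 2, 0, 1, -2, 0, 1]
video_path = [0, 0, 0] + pattern  # 3 frames of processing delay
print(estimate_delay(pattern, video_path, max_lag=5))  # → 3
```

At 25 fps, a measured 3-frame offset corresponds to 120 ms, which is why even a "few frames" of graphics processing delay produces clearly audible lip-sync error.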
“We looked at MatchBox at IBC, then we borrowed the kit from our friends at Timeline Television so we could give it a full workout,” said Michael Lodmore, Duty Technology Manager, Dock10. “Compensating for video delays through the various processing engines can be time-consuming and not very satisfactory. We found MatchBox did just what we needed, saving our crews a lot of time to get the audio and video outputs in sync. Not only does it measure by how much it is wrong, it also gives us reassurance when right.
“The future of studio production is going to rely increasingly on virtual reality,” Lodmore added. “It allows us to create bigger, brighter environments through virtual set extensions. Producers love it because they can create looks which make their programmes stand out from the crowd and enhance the format. MatchBox means we can get on with delivering that quality without waiting for complex line-ups.”
Russell Johnson, director of Hitomi Broadcast, added “It is not an either/or situation – you do not make a choice between a completely virtual set or a completely physical set. In a typical production, some sets – and some cameras – will not have synchronised graphics, some will. Producers and directors will be making decisions on how the output of each camera will be processed as it happens, so the ability to pull back lip sync as you need it is a huge benefit. To be able to simply hold an iPhone or iPad running the MatchBox Glass app in front of a camera for synchronisation is a huge time-saver.”
Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual production, has now integrated its virtual production software, Mo-Sys VP Pro, with the Sony Venice camera. Capturing dynamic camera settings data direct from the Sony Venice, users can simplify and speed up virtual production workflows.
The Sony Venice full-frame digital cinematography camera is rapidly gaining popularity, not least for the large number of software-controlled resolution and aspect ratio settings available. Mo-Sys VP Pro links directly to the camera software, monitoring every take and shot to highlight any mismatched settings, whilst capturing the Venice camera data to assist in downstream post-production compositing.
The integration means that Mo-Sys VP Pro goes into record mode whenever the Sony Venice camera starts recording. Mo-Sys VP Pro captures the file naming format the camera uses to name the media files, and uses the same name for the metadata file containing the camera settings data, ensuring that matching the two together in post-production is simple.
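The shared-name convention described above can be sketched as follows. The clip names and the `.json` metadata extension are invented for illustration; they are not Sony's actual clip naming scheme or VP Pro's file format:

```python
from pathlib import Path

def metadata_name_for(media_file: str, ext: str = ".json") -> str:
    """Derive the metadata file name from the camera's media file name,
    so the two share a stem and pair up trivially in post-production."""
    return str(Path(media_file).with_suffix(ext))

def pair_clips(media_files, metadata_files):
    """Match each media clip to its metadata file by shared stem."""
    by_stem = {Path(m).stem: m for m in metadata_files}
    return {m: by_stem.get(Path(m).stem) for m in media_files}

# Hypothetical clip names; the metadata files reuse the same stem.
clips = ["A001C001_210601.mxf", "A001C002_210601.mxf"]
meta = [metadata_name_for(c) for c in clips]
print(pair_clips(clips, meta))
```

Because the pairing key is the file stem itself, no separate lookup table has to travel with the media into post-production.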
Mo-Sys VP Pro is now the most versatile and up-to-date virtual production solution for film, TV, game cinematics and live broadcast. Mo-Sys VP Pro is integrated with Unreal Engine and supports either live or recorded virtual production workflows, using either green/blue screen studios or LED volumes. It provides compositors with additional lens data – like F-stop, T-stop and shutter angle – making VFX compositing much easier.
“Virtual production is a powerful creative tool, for film as well as for television,” said Michael Geissler, CEO of Mo-Sys. “It is a complex business, though, and smart systems that speed synchronization and set-up in post-production give a huge boost to productivity. Once more, Mo-Sys is leading the industry with intelligent, practical integrations, whilst also supplying camera tracking solutions with the highest levels of precision and data resilience available on the market today.”
Mo-Sys recently announced the Cinematic XR Focus feature for Mo-Sys VP Pro, allowing camera operators to pull focus between real and virtual elements within an LED volume. This type of focus pull was previously impossible to achieve, but using a Preston lens controller, Mo-Sys StarTracker and Mo-Sys VP Pro, it is now easily achievable.
Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual studios and augmented reality, is installing a StarTracker system in the Los Angeles studio of StandardVision. This extensive facility builds on StandardVision’s unrivalled experience in design and implementation of large-scale displays, with an LED studio for virtual production.
StandardVision has an enviable reputation for both the creative and technical aspects of architectural lighting and digital media. Projects have been completed in California and around the world, from the new airport in Doha, Qatar to Hong Kong. The unique StandardVision value is both to integrate and install class-leading technology, and to create compelling visual content for the displays.
As part of their creative services, StandardVision Studios, in the heart of Los Angeles, operates a 10,000 square foot studio with a green screen and an LED volume for creating display content. These screens are now enhanced with the Mo-Sys StarTracker system enabling all types of virtual production.
“Alongside work for our own clients, we found we were being approached by movie and television people who wanted to try out ideas,” said Alberto Garcia, CTO of StandardVision. “Our plans for the studio matched well with theirs, particularly around the Unreal Engine for virtual environments.
“We needed to precisely place real cameras into virtual worlds,” he explained. “We looked around but it was clear that Mo-Sys had a strong reputation in the industry due to StarTracker’s precision and operational resilience. The producers we have coming in recognized Mo-Sys as the way to go, so it was a simple decision to choose StarTracker.”
StarTracker is widely regarded as the leader in camera tracking technologies, using a random pattern of reflective dots – “stars” – on the studio ceiling. Once mapped, which takes just a few moments, any StarTracker camera can be precisely located in three-dimensional space, and for 6-axes of movement. In addition, precision lens data enables the CGI elements to be distorted to match the camera lens for even greater realism.
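Matching CGI to the physical camera typically means applying the lens's measured distortion to the rendered image. A minimal sketch of the commonly used radial (Brown-Conrady) terms follows; the coefficients are illustrative values, not calibration data from any Mo-Sys lens profile:

```python
def distort(x, y, k1, k2):
    """Apply radial (Brown-Conrady) distortion to a normalised image
    point (x, y), as a renderer would to match a physical lens."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Illustrative barrel-distortion coefficients (k1 < 0 pulls points
# toward the centre, as a wide-angle lens tends to do).
k1, k2 = -0.1, 0.01
xd, yd = distort(0.5, 0.5, k1, k2)
print(xd, yd)
```

With per-lens coefficients captured alongside the tracking data, the CGI layer can be warped to sit exactly on the real image rather than the real image being undistorted, which preserves the photographed look.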
“StandardVision is a remarkable company, developing creative solutions on a massive scale,” said Michael Geissler, CEO of Mo-Sys. “We are very excited to be working with them on their studio, and we look forward to further challenges in the future.”
Mo-Sys installed StarTracker in the StandardVision studio in mid-April 2021.
Large number of Mo-Sys camera tracking systems available for creative producers across all the Salford studios.
Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual studios and augmented reality, is extending the capabilities for virtual and augmented reality at Dock10’s studio complex in Salford. Dock10 already had five channels of camera tracking, and it is now adding a further nine, ensuring the technology is available for multiple clients simultaneously.
In any creative production marrying live action with computer-generated scenic elements, the CGI needs to know exactly what the camera is seeing to ensure the combined image remains precisely locked. StarTracker is widely regarded as the leader in camera tracking technologies, using a random pattern of reflective dots – “stars” – on the studio ceiling. Once mapped, which takes just a few moments, any StarTracker camera can be precisely located in three-dimensional space, and for 6-axes of movement. In addition, precision lens data enables the CGI elements to be distorted to match the camera lens for even greater realism.
Dock10 initially invested in StarTracker for the premium sports show Match of the Day for BBC Television, along with the educational service BiteSize Daily. By adding nine more StarTrackers, Dock10 can provide camera tracking and virtual production across multiple studios and productions simultaneously.
“Directors want to be able to work in different sized studios, to accommodate a range of performers, physical sets and mixed reality,” said Andy Waters, head of studios at Dock10. “To meet this growing demand from our clients, we took our experience with key enabling technologies like StarTracker, and super-sized it across all our studios.”
Michael Geissler, CEO of Mo-Sys, added “Dock10 offers a unique capability in the UK – producers have the freedom to choose the size of studio right for their job, and put in it the technology they need. Virtual and augmented reality is an important part of so many productions today, from commercials to game shows, talent contests to sports analysis. By its very simplicity, StarTracker makes set-up for virtual production fast and easy.”