Little Mix put into virtual stadium setting thanks to precision camera tracking
London, UK, 25 November 2020: Mo-Sys Engineering, a world leader in precision camera tracking solutions for virtual studios and augmented reality, took a central role in the Covid-safe production of a live online entertainment awards show. Normally held in a stadium venue, the show was this year staged in virtual reality by production company and media technologists Bild Studios, with the aim of retaining the scope and excitement of past physical shows.
Under the direction of Creative Director Paul Caslin and Production Designer Julio Himede from Yellow Studio, Bild created an enormous 360° virtual stadium clad with LED screens all around – a design that wouldn't have been achievable in the real world but was more than possible in a virtual environment. Bild's unique content workflows allowed the show content applied to the virtual LED screens to be operated live, on a cue-by-cue basis – just like in a physical environment. Hosts Little Mix were shot in a large green screen studio, which, thanks to Mo-Sys VP Pro virtual studio software combined with Unreal Engine graphics, became filled with cheering music fans. The links were shot with three Arri cameras: one on a crane, one on a rail and one on a tripod.
Each camera was fitted with a Mo-Sys StarTracker camera tracking system. These track and capture a camera's precise position and orientation in six axes thanks to infrared cameras tracking reflective dots – stars – on the ceiling. By also tracking the lens focus, zoom, and distortion, the live talent can be perfectly and seamlessly combined with the virtual world, irrespective of the camera movement or lens adjustment. All composited camera outputs were created in real time, ready for the director to live-cut to, making the camera work for the virtual production environment not too dissimilar to any other live broadcast show.
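The per-frame data described above – six axes of camera pose plus lens state – can be pictured as a simple record. This is a minimal sketch for illustration only; the field names, units, and normalisation are assumptions, not the StarTracker data format:

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One per-frame sample of camera pose and lens state.
    Field names and units are hypothetical, for illustration only."""
    # Six axes of camera pose: position plus orientation.
    x: float; y: float; z: float          # position in metres
    pan: float; tilt: float; roll: float  # orientation in degrees
    # Lens state, so the virtual graphics can match the real optics.
    focus: float   # assumed normalised: 0.0 (near) .. 1.0 (infinity)
    zoom: float    # assumed normalised: 0.0 (wide) .. 1.0 (telephoto)
    k1: float      # first radial distortion coefficient
    timecode: str  # frame timestamp, e.g. "10:00:00:01"

sample = TrackingSample(1.2, 0.8, 1.6, 45.0, -5.0, 0.0,
                        focus=0.3, zoom=0.1, k1=-0.02,
                        timecode="10:00:00:01")
```

Because every sample carries a timecode, the pose and lens state can later be matched frame-for-frame with the recorded video.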
“Producing the real-time VFX content and supplying the technical engineering for this event was a great pleasure,” said David Bajt, Co-Founder & Director at Bild Studios. “Special thanks go to Mo-Sys Engineering for the amazing camera tracking and virtual production technology.”
Mo-Sys CTO James Uren, who acted as technical consultant on the production, explained, “This was a very fast turnaround production – just five days for any post-production work required – so we needed to get all the virtual studio materials in camera. Because we capture all the component video elements (green screen, key, and graphics) plus all camera movement and lens data for each camera, we could offer Bild the choice of three workflows for maximum flexibility.
“First, we had the combined real and virtual elements from the three cameras, with the option of cleaning up the green screen in post,” he explained. “Second, if there had been a problem with the graphics, we could keep the live shots and replace the virtual background. And third, the post team could just go crazy and change everything, whilst still keeping the same camera/lens movements around Little Mix.”
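The three fallback options Uren describes amount to a mapping from each post-production workflow to the captured elements it depends on. The sketch below is an illustrative summary; the `Workflow` names and stream labels are hypothetical, not Mo-Sys terminology:

```python
from enum import Enum

class Workflow(Enum):
    CLEAN_KEY = 1           # keep the live composites, tidy the key in post
    REPLACE_BACKGROUND = 2  # keep the live shots, swap the virtual background
    FULL_REBUILD = 3        # change everything, reusing the camera/lens moves

def streams_needed(workflow):
    """Captured per-camera elements each workflow relies on (illustrative)."""
    if workflow is Workflow.CLEAN_KEY:
        # Only the live composite and its key matte are touched.
        return {"composite", "key"}
    if workflow is Workflow.REPLACE_BACKGROUND:
        # Live foreground is kept; new graphics need the tracking to match.
        return {"green_screen", "key", "camera_tracking", "lens_data"}
    # Full rebuild: re-key the talent and re-render everything around them.
    return {"green_screen", "camera_tracking", "lens_data"}
```

The point of capturing every component stream live is precisely that the choice between these workflows can be deferred until post.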
Mo-Sys VP Pro is powerful software that plugs directly into the Unreal Engine Editor. It receives the camera and lens tracking data from StarTracker, synchronises this with the camera video and the Unreal Engine virtual graphics, and frame-accurately merges the two image streams together to provide a real-time, high-quality composite output. VP Pro’s uncompromised graphics power was designed with the specific needs of live virtual production in mind. For the event, each Arri camera had its own VP Pro and Unreal Engine instance. This gave the producers the output quality they demanded, with very low latency. “This is not pre-visualisation,” Uren explained. “This is premium production quality.”
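The frame-accurate merge can be sketched in miniature: pair camera and graphics frames by timecode, then key the live foreground over the virtual background per pixel. This toy stand-in for the real-time GPU compositor assumes a hypothetical frame structure and is for illustration only:

```python
def composite_pixels(camera, graphics, matte):
    """Linear keyer over aligned pixel lists: matte 1.0 keeps the live
    (foreground) pixel, 0.0 shows the virtual graphics behind it."""
    return [
        tuple(round(fg * m + bg * (1.0 - m)) for fg, bg in zip(cam, gfx))
        for cam, gfx, m in zip(camera, graphics, matte)
    ]

def frame_accurate_merge(camera_frames, graphics_frames):
    """Pair camera and graphics frames by timecode before compositing,
    so the two image streams never drift apart (hypothetical structure:
    camera_frames = [(timecode, pixels)],
    graphics_frames = [(timecode, pixels, matte)])."""
    gfx_by_tc = {tc: (px, m) for tc, px, m in graphics_frames}
    return [
        (tc, composite_pixels(px, *gfx_by_tc[tc]))
        for tc, px in camera_frames
        if tc in gfx_by_tc
    ]

cam = [("10:00:00:01", [(255, 0, 0)])]
gfx = [("10:00:00:01", [(0, 0, 255)], [0.0])]  # matte 0.0 → show graphics
result = frame_accurate_merge(cam, gfx)        # [("10:00:00:01", [(0, 0, 255)])]
```

Matching on timecode rather than arrival order is what makes the merge frame-accurate: a graphics frame is only ever blended with the camera frame it was rendered for.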