
Mo-Sys™ Engineering, world leader in virtual production and image robotics, has released a new multi-node media server solution for LED volumes to meet the demands of final pixel XR production for film and TV.

With virtual production increasing exponentially, in particular in the LED volume space, Mo-Sys utilised its 20+ years of film and broadcast technology innovation and experience to create the next step forward for XR media servers.

Mo-Sys VP Pro XR is a hardware and software solution combining multi-node nDisplay architecture, an improved VP Pro real-time compositor/synchroniser, and a new XR toolset. This XR media server system is focused on delivering cinematic capabilities and standards for cinematographers and focus pullers.

The launch of VP Pro XR follows the recently announced Mo-Sys Cinematic XR Focus capability. Cinematic XR Focus allows focus pullers to pull focus between real and virtual elements in an LED volume – a world first – and is now available on VP Pro XR.

The XR space to date has been predominantly driven by live event equipment companies. Whilst these XR volumes have delivered cost savings by removing post-production compositing and lowering location costs, they have also introduced shooting limitations, and an output image that isn’t yet comparable to non-real-time compositing. This was the reason behind the creation of VP Pro XR.

VP Pro XR is the first product release under Mo-Sys’ Cinematic XR initiative. The overriding aim of Cinematic XR is to move final pixel XR production forward, in image quality and shooting creativity, from its initial roots in live event LED technology to fit-for-purpose Cinematic XR technology.

Mo-Sys has outlined four key components to Cinematic XR:

  • Improve image fidelity
  • Introduce established cinematic shooting techniques to XR
  • Enable seamless interaction between virtual and real set elements
  • Innovate new hybrid workflows combining final pixel and non-real-time compositing

Michael Geissler, Mo-Sys CEO, explains: “With our twenty-year background in film robotics, we regularly get to hear what cinematographers and producers think and need. Whilst producers love the final pixel XR concept, cinematographers worry about image quality, colour pipeline, mixed lighting, and shooting freedom. We started Cinematic XR in response to this, and VP Pro XR is specifically designed to solve the urgent problems we have been made aware of by our cinematographer and focus puller colleagues.”

Mo-Sys will announce the initial VP Pro XR customers in June. VP Pro XR is available immediately.

Published in Client News

Mo-Sys Engineering, world leader in precision camera tracking solutions for virtual studios and augmented reality, provided support for Bluman Associates and milkit Studio for a ground-breaking series of idents, part of the ITV Creates programme, to mark Mental Health Awareness Week in May 2021.

Mo-Sys StarTracker was chosen to bring the vision of artist Mamimu (June Mineyama-Smithson) and neuroscientist Dr Tara Swart to life, as a series of channel idents which “combine art and science to create ultimate optimism, inducing happy hormones in your brain,” according to the artist.

The creative idea they chose was to make a physical model of the familiar ITV logo in a mirrored material and sit it in an augmented reality studio where bold, colourful abstract graphics would interact with it. Studio Owner Pod Bluman recognised that central to the success of the creative vision would be absolutely perfect registration between the real and the virtual objects, while giving the camera complete freedom to move.

The shoot was at milkit Studio in north London, a dedicated mixed reality facility with an LED shooting volume – two walls and a floor in high-density, 4K resolution.

“The idea of abstract imagery reflecting in the real, very shiny logo would only work if they stayed in perfect registration,” said Ben Tilbrook of Mo-Sys, who provided support on the project. “The camera was mounted on a jib so was moving freely around the logo. The Mo-Sys StarTracker is designed for just this sort of requirement – it gives the director and cinematographer complete freedom while ensuring the graphics computer is updated with positional information in real time.”

Mo-Sys StarTracker is widely regarded as the leader in camera tracking technologies, using a random pattern of reflective dots – “stars” – on the studio ceiling. Once mapped, which takes just a few moments, any StarTracker-equipped camera can be precisely located in three-dimensional space, and for pan, tilt and roll. With the addition of digital lens data, real elements can be placed into the virtual environment. It allows augmented reality sequences like this to be shot live, obviating the need to composite the layers in post-production.
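
As a rough illustration of what such a tracking system must hand to the graphics engine each frame, the payload can be sketched as a simple record. The field names and layout below are hypothetical, for illustration only, and are not Mo-Sys’s actual data format.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One per-frame camera tracking sample (hypothetical layout).

    A ceiling-dot ("star") tracker resolves the camera's position and
    orientation; lens data lets the renderer match field of view and
    focus so real and virtual elements stay in register.
    """
    frame: int                            # genlocked frame counter
    x: float; y: float; z: float          # position in metres (studio space)
    pan: float; tilt: float; roll: float  # orientation in degrees
    focal_length_mm: float                # from lens encoder or metadata
    focus_distance_m: float               # from lens encoder or metadata

# A sample from a jib move: the engine redraws the virtual set from this pose.
sample = TrackingSample(frame=1201, x=2.4, y=1.6, z=1.9,
                        pan=31.0, tilt=-4.5, roll=0.2,
                        focal_length_mm=32.0, focus_distance_m=3.5)
print(sample)
```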

“This was the first time we had used camera tracking on a project like this,” said Pod Bluman of Bluman Associates. “It performed admirably, giving the precision we needed in the shoot, reliably and without fuss or problems.”

Commenting on the finished sequences, Dr Tara Swart said “Mamimu and I had conversations about how the brain works, and about neurotransmitters that are related to happiness and optimism. I was just blown away by what she created – it literally made me happy to see it.”

Published in Client News

Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual production, has now integrated its virtual production software – Mo-Sys VP Pro – with the Sony Venice camera. By capturing dynamic camera settings data directly from the Sony Venice, users can simplify and speed up virtual production workflows.

The Sony Venice full-frame digital cinematography camera is rapidly gaining popularity, not least for the large number of software-controlled resolution and aspect ratio settings available. Mo-Sys VP Pro links directly to the camera software, monitoring every take and shot to highlight any mismatched settings, whilst capturing the Venice camera data to assist in downstream post-production compositing.

The integration means that Mo-Sys VP Pro goes into record mode whenever the Sony Venice camera starts recording. Mo-Sys VP Pro captures the file naming format the camera uses to name the media files, and uses the same name for the metadata file containing the camera settings data, ensuring that matching the two together in post-production is simple.
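
The practical benefit is that matching in post reduces to a file-stem comparison. A minimal sketch, assuming a `.json` sidecar extension and a flat directory layout (both assumptions for illustration, not the actual VP Pro format):

```python
from pathlib import Path
from typing import Optional

def find_metadata_sidecar(media_file: str, metadata_dir: str) -> Optional[Path]:
    """Locate the camera-settings file recorded alongside a media clip.

    Because the media server reuses the camera's own file naming for its
    metadata file, the conform step is just a shared-base-name lookup.
    """
    stem = Path(media_file).stem                   # e.g. 'A001C003_210518'
    candidate = Path(metadata_dir) / f"{stem}.json"
    return candidate if candidate.exists() else None

# Usage: pair each camera original with its settings file.
match = find_metadata_sidecar("/media/A001C003_210518.mxf", "/metadata")
print(match or "no sidecar found")
```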

Mo-Sys VP Pro is now the most versatile and up-to-date virtual production solution for film, TV, game cinematics and live broadcast. Mo-Sys VP Pro is integrated with Unreal Engine and supports either live or recorded virtual production workflows, using either green/blue screen studios or LED volumes. It provides compositors with additional lens data – like F-stop, T-stop and shutter angle – making VFX compositing much easier.

“Virtual production is a powerful creative tool, for film as well as for television,” said Michael Geissler, CEO of Mo-Sys. “It is a complex business, though, and smart systems that speed synchronization and set-up in post-production give a huge boost to productivity. Once more, Mo-Sys is leading the industry with intelligent, practical integrations, whilst also supplying camera tracking solutions with the highest levels of precision and data resilience available on the market today.”

Mo-Sys recently announced the Cinematic XR Focus feature for Mo-Sys VP Pro, allowing camera operators to pull focus between real and virtual elements within an LED volume. This type of focus pull was previously impossible to achieve; using a Preston lens controller, Mo-Sys StarTracker and Mo-Sys VP Pro together, it is now easily achievable.
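
Mo-Sys has not published the algorithm, but the core idea can be sketched as a split decision: a focus target in front of the LED wall is handled optically by the lens, while a target “behind” the wall keeps the lens focused on the wall surface and hands the remaining defocus to the render engine’s depth of field. The function below is a hypothetical illustration of that logic, not the shipping implementation.

```python
def split_focus(focus_target_m: float, wall_distance_m: float) -> dict:
    """Decide how a focus pull is realised in an LED volume (sketch only).

    Targets nearer than the wall: pure lens focus.
    Targets beyond the wall: the lens stays on the wall plane; the
    virtual scene is defocused by the render engine to the target depth.
    """
    if focus_target_m <= wall_distance_m:
        return {"lens_focus_m": focus_target_m, "virtual_focus_m": None}
    return {"lens_focus_m": wall_distance_m, "virtual_focus_m": focus_target_m}

# Pulling focus from an actor at 2 m to a virtual mountain at 60 m,
# with the LED wall 5 m from the camera:
print(split_focus(2.0, 5.0))    # the lens does all the work
print(split_focus(60.0, 5.0))   # the engine takes over beyond the wall
```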

The Sony Venice integration is available immediately, and all Mo-Sys VP Pro subscription customers have already received this new capability. To book a free trial of Mo-Sys VP Pro, contact Mo-Sys.

Published in Client News

Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual studios and augmented reality, is installing a StarTracker system in the Los Angeles studio of StandardVision. This extensive facility builds on StandardVision’s unrivalled experience in design and implementation of large-scale displays, with an LED studio for virtual production.

StandardVision has an enviable reputation for both the creative and technical aspects of architectural lighting and digital media. Projects have been completed in California and around the world, from the new airport in Doha, Qatar to Hong Kong. The unique StandardVision value is both to integrate and install class-leading technology, and to create compelling visual content for the displays.

As part of their creative services, StandardVision Studios, in the heart of Los Angeles, operates a 10,000 square foot studio with a green screen and an LED volume for creating display content. These screens are now enhanced with the Mo-Sys StarTracker system enabling all types of virtual production.

“Alongside work for our own clients, we found we were being approached by movie and television people who wanted to try out ideas,” said Alberto Garcia, CTO of StandardVision. “Our plans for the studio matched well with theirs, particularly around the Unreal Engine for virtual environments.

“We needed to precisely place real cameras into virtual worlds,” he explained. “We looked around but it was clear that Mo-Sys had a strong reputation in the industry due to StarTracker’s precision and operational resilience. The producers we have coming in recognized Mo-Sys as the way to go, so it was a simple decision to choose StarTracker.”

StarTracker is widely regarded as the leader in camera tracking technologies, using a random pattern of reflective dots – “stars” – on the studio ceiling. Once mapped, which takes just a few moments, any StarTracker camera can be precisely located in three-dimensional space, across all six axes of movement. In addition, precision lens data enables the CGI elements to be distorted to match the camera lens for even greater realism.
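
To see why that lens data matters, consider radial distortion: rendered CGI must bend the same way the glass bends the real image, or real and virtual edges slide against each other as the lens zooms. A minimal Brown-Conrady-style sketch follows, using just two radial coefficients; production lens calibrations are far richer than this.

```python
def distort_point(xn: float, yn: float, k1: float, k2: float):
    """Apply simple radial distortion to a normalised image point.

    Warping CGI with the measured lens coefficients keeps it registered
    with the optically distorted live image. Two-term model, for
    illustration only.
    """
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xn * scale, yn * scale

# A point near the frame edge shifts noticeably under barrel distortion:
print(distort_point(0.8, 0.6, k1=-0.12, k2=0.02))
```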

“StandardVision is a remarkable company, developing creative solutions on a massive scale,” said Michael Geissler, CEO of Mo-Sys. “We are very excited to be working with them on their studio, and we look forward to further challenges in the future.”

Mo-Sys installed StarTracker in the StandardVision studio in mid-April 2021.

Published in Client News

London, UK, 22 September 2020: Mo-Sys, a world leader in precision camera tracking solutions for virtual studios and augmented reality, will be showcasing its award-winning StarTracker Studio at BroadcastAsia 2020’s all-new virtual reality experience running from September 29 to October 1 2020.

Having recently won a Virtual Best of Show Award during IBC 2020, Mo-Sys StarTracker Studio, the world’s first pre-configured virtual production system, brings moving camera virtual production within reach of all market sectors.

StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking technology, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.

StarTracker Studio combines StarTracker camera tracking with a powerful all-in-one virtual production system capable of working with green screen studios or LED volumes. The system uses Mo-Sys VP Pro software to connect camera tracking data from up to 16 cameras to the Unreal Engine graphics. StarTracker Studio uses a smart system design to reduce the typical hardware required for multi-camera virtual production, and the whole system comes pre-configured in a flight-cased rack.

“We’re looking forward to demonstrating the performance, flexibility and simplicity StarTracker Studio offers to companies who need to create virtual studio and augmented reality content,” said Michael Geissler, CEO of Mo-Sys. 

Mo-Sys will also be detailing how its VP Pro and StarTracker technologies operate with LED volumes for virtual productions that want to use on-set finishing techniques.

“LED-wall technology now offers a viable alternative to the traditional green screen / post-production workflow for visual effects (VFX) shooting. Specifically, LED walls enable a composited shot to be captured on-set, rather than just previewed on-set, thereby removing the need for downstream post-production,” Geissler continued. “LED walls won’t replace green screen; both will co-exist going forwards, as each is suited to a different type of VFX shot. The benefit of StarTracker Studio is that it handles both workflows.”

To register for the BCA Virtual Event and visit Mo-Sys’s virtual booth, please visit: https://l.feathr.co/bca-exhibitor-landing-page--mo-sys-engineering

Published in Client News

London, UK, 27 August 2020: Mo-Sys, world leader in precision camera tracking solutions for virtual studios and augmented reality, provided key technology to EVOKE Studios, which allowed it to develop exciting extended reality environments for the performances at this year’s AIM Awards, from the Association for Independent Music, on 12th August 2020. Mo-Sys’ StarTracker provided the precision camera tracking data, enabling EVOKE to seamlessly blend live action with photo-realistic virtual elements through real-time compositing.

“The challenges with extended reality lie in the smoothness of tracked content, frame delays, and having a close to faultless set extension,” said Vincent Steenhoek, founder of EVOKE Studios. “Our experience with StarTracker is that it gives us ultra-reliable, highly accurate positional data and ultra-low latency. Building on technologies like StarTracker enables awards shows like the AIM Awards to be presented in broadcast quality virtual environments.”

Critical for virtual studio and augmented reality production is to track the position of each camera in three-dimensional space and with all 6 degrees of movement (pan, tilt, roll, x, y, z) plus lens focal length and focus. StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking package, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.
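
As a worked illustration of what those six degrees of movement mean to a graphics engine, the tracked values can be composed into a single camera-to-world transform. The axis mapping below (pan about Y, tilt about X, roll about Z) is an assumption for the sketch; every engine defines its own convention.

```python
import math

def rot(axis: str, deg: float):
    """3x3 rotation matrix about one axis (right-handed, degrees)."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    if axis == "x": return [[1, 0, 0], [0, c, -s], [0, s, c]]
    if axis == "y": return [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def camera_pose(pan, tilt, roll, x, y, z):
    """Compose pan, tilt and roll with position into a 4x4 transform."""
    r = matmul(matmul(rot("y", pan), rot("x", tilt)), rot("z", roll))
    return [r[0] + [x], r[1] + [y], r[2] + [z], [0.0, 0.0, 0.0, 1.0]]

for row in camera_pose(pan=30, tilt=-5, roll=0, x=2.0, y=1.5, z=4.0):
    print(row)
```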

EVOKE and its creative partners shot a number of guest performances for the awards show, recording as live with no post production. The shoot took place at the new Deep Space studios at Creative Technology in Crawley, a studio which is already set up for StarTracker, including a camera jib for free movement.

Performances captured in extended reality included AJ Tracey and MoStack surrounded by larger-than-life burgers and fries, and Pioneer Award winner Little Simz, who was made to appear underwater.

“This is a great example of what StarTracker delivers,” said Michael Geissler, CEO of Mo-Sys. “It is designed for live work, providing completely reliable positioning data into the graphics engines. It allowed the EVOKE team to build really complex extended reality and virtual environments in combination with LED walls and floor, then let the performers go with the music confident that they would capture all the action flawlessly.”

For more information on StarTracker, please visit the Mo-Sys website: www.mo-sys.com

Published in Client News

By Mo-Sys Staff | Mo-Sys Engineering | Published 13th March 2020

On Friday 6th March, Mo-Sys were thrilled to be named Business of the Year at the Best of Royal Greenwich Business Awards 2020. Hosted by TV presenter and journalist Steph McGovern at the InterContinental Hotel, the awards ceremony recognises the achievements of businesses across the borough. Mo-Sys also came out victorious in the Made in Greenwich category, an award which celebrates the talent and creativity of local innovators who are bringing exciting products, goods and services to market.

For those that don’t know who we are or what we do, we are based in Morden Wharf on the Greenwich Peninsula and we manufacture sophisticated camera technology for television broadcasts and feature films.

If you have watched the BBC’s Match of the Day programme this season, or if you’ve seen any of the recent General Election nights on BBC or Sky, then you will have experienced our technology in action. Using our specialist equipment, which provides precise camera tracking data, broadcasters can add informational graphics and immersive virtual sets into their productions, giving audiences a more engaging and satisfying viewing experience.

Since relocating to Greenwich in 2015, we’ve been on a mission to deliver the best technology that will transform the way films, TV series and broadcasts are made. Now, our innovations are making greenscreen shooting cheaper and simpler than ever before, in turn giving smaller companies the same opportunities as the big players that have long dominated both the film and broadcast industries.

Michael Geissler, CEO and Founder of Mo-Sys:

“After being shortlisted in the 2019 Awards but unfortunately missing out in the Micro to Small Business Award category to the well-deserving BeGenio, we were delighted to be named winners in the Made in Greenwich category and to receive the Business of the Year award.”

“It’s very easy to get swept up in day-to-day business activity, but entering the business awards was a great opportunity to take a step back and reflect on our achievements over the last year. It’s even more fantastic to be recognised for the work we do and to come out victorious. Thank you to everyone – all of the judges and the council – for recognising our innovation and all of the hard work we do.”

Mo-Sys’ credits include: Aquaman, Stranger Things, The Walking Dead, Life of Pi, The Shape of Water, House of Cards and Westworld. Broadcast customers include BBC, Sky, Fox, ESPN, CNN, Discovery Channel, The Weather Channel, Netflix and Sony.

 


Published in Articles

Contributions by Michael Geissler | TVBEurope | Published in the April 2020 edition

TVBEurope asks eight experts for their views on the benefits, challenges and future of remote production

WHAT DO YOU VIEW AS THE BENEFITS OF REMOTE PRODUCTION?

WOLFGANG HUBER, PR MANAGER, LAWO
The primary aim of live remote production is to move as much equipment and as many staff as possible from the remote location back to the studio facility. Thus, remote production offers the possibility of both dramatically reducing production costs at the higher-end and improving quality at the lower-end. This is achieved by redesigning the production workflow such that the majority of tasks take place in the studio rather than at the remote site. Ideally, the only task taking place at the remote site would be signal acquisition - the capture and conversion of camera and microphone signals into a form transportable over wide-area networks. In this model, all signals are transported back to a studio facility, where the program production takes place. Contrast this to conventional remote broadcasts, where the production process happens on-site, and only the finished pictures and audio are backhauled to a broadcast facility for distribution.

MICHAEL GEISSLER, CEO, MO-SYS ENGINEERING
When you take travel and rest time into consideration, the top operators might only be doing the job at which they excel for maybe 40 days a year. With remote production, those people could be working for maybe 200 days a year. That is not just a massive boost in productivity, it represents a huge reduction in carbon footprint, by eliminating long-distance travel and specialist clothing. Production companies are already talking about saving half a million dollars a year.

PETER MAAG, CCO AND EVP STRATEGIC PARTNERSHIPS, HAIVISION
The beauty of employing a remote production model is that it allows broadcasters to do more, with less. Remote production offers broadcasters the opportunity to produce more high-quality content to meet the rapidly growing demand while simultaneously creating efficiencies. And the efficiencies gained by using a remote production model are dramatic. By eliminating the costly and complex logistics associated with deploying OB trucks full of expensive equipment and production teams to the field, broadcasters can instead focus on optimising the use of their resources to produce more high-quality content. For example, a replay operator on-site at a sporting event might be only utilised for three hours during a four-day period. If the replay operator is at home, however, they could be running replays around the world, all the time.

DAVE LETSON, VP OF SALES, CALREC
Like the rest of the industry, we see multiple benefits to remote production, though it’s important to highlight that while the principles are the same, not all remote productions are created equal in terms of scale. Firstly, using OB trucks is no longer necessary at every single live event. This is beneficial in multiple ways: fewer staff need to travel, which means better employee welfare; far less equipment is required on-site; it’s environmentally more sustainable; and there is far less equipment downtime at the central production location. There’s also the fact that multiple sports events – football matches being a prime example – can be covered in a day or over a weekend because the centralised production technologies aren’t committed to a single event. In the long run all this adds up to significant cost savings.

ON WHICH AREAS OF CONTENT DO YOU SEE IT HAVING THE BIGGEST IMPACT?

DR REMO ZIEGLER, HEAD OF PRODUCT MANAGEMENT, VIZRT SPORTS
Remote production will have its biggest impact on any live production that is typically faced with big travel and logistic costs. Sports and news production are prime examples. With regards to sports, various aspects will benefit from remote production. The bigger leagues and federations with dedicated connections to stadiums enable the transmission of high-quality, low-latency signals to a centralised production location. Only small production crews, comprised mostly of camera and audio technicians, are required on-site, while the rest of the production equipment and operations staff remain back at the production centre. The impact of remote production is amplified when one considers the opportunity of distribution through OTT. Many more signals, or specialised cuts, can be produced which are tailored to different customer segments. Producing all these outputs is heavily facilitated through remote production.

WH: The impact applies to all productions covering events that happen outside the studio, like in arenas or stadiums, for which many signals are required to cover the event - and particularly sports with long distances between the camera and microphone positions, like football, rugby, biathlon, motorsports, but also open-air concerts of a large scale. It also opens up the possibility of extending the depth and range of live event coverage into areas previously inaccessible through cost. For example, more specialised sports, lower league and regional coverage, even to college and university level. The industry sees an unquenchable public thirst for sports and other live events coverage and remote production provides the means to broadcast more of it.

MG: Anything that is somewhere for a short time will feel the impact – anything that today is covered by an OB truck or flyaway kit. Obviously, sport heads the list, but it also includes music and entertainment. It will also have a huge impact on corporate events. Product launches and business presentations will be raised in quality, not least through the ability to afford more cameras. Production-as-a-Service will have an impact on linked events: fashion weeks, for example, could see the same skilled production team covering every major event.

PM: Remote production has the most significant impact on events – and it’s not just limited to live events. Whether it’s for a sporting event, esports, a press conference, or a political panel, a remote production model reduces the number of people and resources required on-site, allowing production costs to be lowered. Even for events that aren’t broadcast live but where speed is critical, remote production can dramatically accelerate the production process. At-home/REMI workflows are particularly attractive options when it comes to tier two and tier three events such as college football, for example, where deploying resources on-site is simply not cost-effective. In this instance, remote production enables broadcasters to expand their coverage to meet demand while keeping production costs in check.

HOW SOON WILL IT BECOME THE INDUSTRY NORM?

RICHARD MCCLURG, VICE PRESIDENT MARKETING, DEJERO
We’re already there. Dejero has enabled remote production workflows for over a decade. Dejero enabled the first live coverage of the Vancouver 2010 Olympic Games torch relay, delivering unprecedented live coverage following the torch as it travelled 45,000km across Canada. In 2013, another first enabled Sky Sports to broadcast live from all 92 English Football Clubs in a single day. Using revolutionary wireless technology at the time, Dejero blended multiple cellular connections and provided enough bandwidth to deliver high-quality live broadcast content, at significantly less expense and complexity than traditional video transport technology.

NORBERT PAQUET, HEAD OF LIVE PRODUCTION SOLUTIONS, SONY PROFESSIONAL SOLUTIONS EUROPE
Consumers are demanding more content, available whenever and wherever they choose, without any drop in quality. With this escalating pressure on broadcasters, remote production will naturally become the norm and act as a silver bullet to help keep up with growing industry demands. We’ve already seen overall connectivity (mobile or fibre networks) become a major game-changer for our industry, particularly when it comes to live production. And, with remote production set-ups, resource sharing and a more collaborative, faster turnaround time, it’s becoming even more popular. At Sony, we’ve been at the forefront of this revolution and have, to date, worked with many customers around the world to develop remote production set-ups in news, magazine and live production.

LARISSA GOERNER, DIRECTOR OF ADVANCED LIVE SOLUTIONS, GRASS VALLEY
The biggest hurdle to widespread adoption of at-home models is the challenge of latency. As we look to the future, though, broadcasters and production companies will continue to drive toward more captivating experiences that draw in viewers, using higher resolutions and more camera angles that will put greater stress on the network and available bandwidth. More efficient encoding solutions – JPEG2000, JPEG-XS and MPEG – offer an attractive alternative. These options deliver ultra-low delay, comparable to transporting the signal over fibre, and come at a significantly reduced cost while ensuring there is no difference in the viewers’ experience.

RZ: When we look at some of our leading customers, remote production is already the norm, since they have adopted Vizrt solutions that facilitate a remote production workflow. However, our larger customers are not the only ones benefiting from remote production. Many smaller productions also take advantage of the same concepts to reduce their cost and the size of their footprint. The rise of IP and the availability of software-defined productions tools, which in turn can be virtualised in the Cloud, will make remote production the norm for the majority of media productions.

HOW BIG A PART DO YOU THINK 5G WILL PLAY IN REMOTE PRODUCTION?

NP: 5G will play a key role in powering remote production for many organisations. Firstly, the low-latency transmission offered by 5G is crucial for any productions such as sporting events or news broadcasts where delays are unforgivable. Secondly, the higher bandwidth 5G offers helps deliver less compressed content to the mobile viewer but also unlocks additional applications for remote production too. Finally, given 5G enables Cloud-based production models, it helps reduce the deployment of physical OB resources, which makes productions much more sustainable.

LG: As an emerging technology, 5G will play a significant role in the broadcast industry as a reliable way to deliver content to consumers. In terms of remote production, 5G can be utilised for its greater capacity benefits. However, bandwidth is not unlimited in 5G and as we are still seeing an increasing uptick for live UHD content, baseband cannot be transported over 5G and has to be encoded in order to handle demand for this format. This adds another step in the creation process and will slow down adoption for high-end live sports production. Currently, for tier two and three productions, 5G is a means to an end in providing easier contribution to the remote location and is a good candidate to enable more creativity in content production.

PM: There’s nothing mystical about 5G: it’s a faster, wireless, mobile network. As 5G gathers momentum and begins to easily handle multiple video streams from a venue it will definitely act as a catalyst to accelerate the adoption of remote production. However, very fast, reliable, and affordable network pipes are already available from any venue today and it’s important to remember that a network is just a network and all networks are getting faster; both wired and wireless. What’s more impactful than the network are technologies like the open source Secure Reliable Transport protocol (SRT), pioneered by Haivision, which enables video transport over any network.

DL: From a Calrec perspective, little will change with 5G. RP1 was deliberately designed to be transport agnostic. From our perspective, it does not matter whether we are piggy-backing audio on a camera feed via a JPEG2000 path or via a closed AES67 wide-area network. For our clients though, 5G offers vast potential. It’s not implausible to consider a camera at a field of play sending pictures directly back to base over 5G. Companies have already achieved this with multiple 4G links. 5G technology could offer a true paradigm shift in areas ranging from traditional SNG to Premier League football. However, where there are local commentators or reporters, some local IFB mixing will still be needed, and RP1 becomes even more relevant.

HOW DO YOU INCORPORATE SUSTAINABLE PRACTICES IN REMOTE PRODUCTION?

RM: Remote production is no doubt reducing the industry’s carbon footprint. The amount of kit and crew required to travel to live events is greatly reduced compared to traditional production workflows. Fewer OB and large SNG trucks are on the road. Centralising production staff at the broadcast facility means that fewer people are having to travel to the field, cutting airmiles and transport. Initiatives such as ‘Find a Provider’, which is featured in Dejero’s Cloud-based management system, enable broadcasters to find freelancers across the globe, making it easier to find local resources to acquire content. Dejero’s MultiPoint Cloud service enables broadcasters to share field resources and contribute the pool feed simultaneously to many geographically dispersed broadcasters.

LG: In general, the decrease in travel brought about by remote production already has a significantly lower impact on the environment. However, we believe more can be done. Enabling workflow consistency for a variety of content productions is a goal for us. Grass Valley cameras, switchers and replay products all enable the highest flexibility for any workflow, therefore allowing the creative talent to be where they are most needed to add better value. Recurring tasks can easily be centralised and produced with fewer operators, ultimately allowing more content to be created at a consistently high quality. Our DirectIP solution, for example, enables almost all production and technical staff to work from a centralised location. We also give customers the flexibility to locate creative talent either at the venue or the production hub. We continuously strive to innovate across the entire portfolio, providing the latest software and hardware technology to enable sustainable production in the market.

RZ: Remote production reduces the number of required people and equipment on-site. That means fewer people travelling, and fewer pieces of production equipment shipping, by plane, train, and automobile. Furthermore, a remote dedicated production centre, designed around software-defined production practices, reduces hardware usage, power consumption, and the need for active cooling versus inefficient mobile units.

MG: The headline benefit is that fewer people need to travel to the event, meaning a significant reduction in carbon footprint. As remote production becomes ever more sophisticated – with remote camera operation, for example – so the reductions become greater. This does depend upon complex technologies becoming mainstream and commoditised, to simplify the installation and the power consumption of rig and connectivity. The recent coronavirus outbreak is seeing a very large reduction in business travel. The ability to control cameras from a central hub anywhere in the world will be extremely attractive to productions, not least because of the reduced environmental impact.

WHAT WILL BE THE CHALLENGES FOR REMOTE PRODUCTION AS IT GROWS?

WH: Growing demand for content and tighter schedules of events to be covered are challenging on the administrative side, as equipment needs to be reliably available at any time for a new production as soon as it is no longer being used for the previous one. Access to and reliability of the fibre infrastructure must be guaranteed. This is where system monitoring and real-time telemetry for broadcast networks, such as Lawo’s SMART, come into play, allowing constant control and monitoring of the complete IP network installation from capture to playout. And the more concurrent productions that are happening, the more essential it is to have such a monitoring system in place to ensure signal, sync and packet integrity, and thus flawless operation.

MG: The real issue will be the management of change, particularly for people. It is a different pitch for operators: taking them away from immersion in the action and giving them comfortable, familiar working environments in exchange for greater productivity. The people issues, and the shifts in budgeting, are cultural changes, which always see a natural resistance.

DL: Connectivity is a key issue. Also getting staff to understand and adapt to it, though our customers tell us that once it’s been explained and tried, this stops being an issue! The other thing, of course, is reliability. For Calrec, this hasn’t proved an issue either. Lastly, for quick turnaround projects, or where there are multiple events in a row or across a season, technical and workflow practices have to be set in stone. But we don’t see any reason that remote production use won’t grow significantly from here.


Published in Articles

London, UK, 16 July 2020: Mo-Sys, a world leader in precision camera tracking solutions for virtual studios and augmented reality, has brought virtual studio production within reach of everyone with StarTracker Studio, the world’s first pre-assembled production package. The system is scalable to any size production, and can support 4K Ultra HD.

Critical for virtual studio and augmented reality production is to track the position of each camera in three-dimensional space across all six axes of movement. StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking package, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.

To make virtual studio production accessible to all, StarTracker Studio bundles the tracking with cameras and mounts and a high-performance virtual graphics system based on the latest version of the Unreal Engine and the Ultimatte keyer. Mo-Sys’s unique plug-in interfaces directly between the camera tracking and the Unreal Engine, for extreme precision with virtually no latency.

All the hardware – including Lenovo PCs with Titan RTX GPUs – is mounted in a rolling rack cabinet and pre-wired, and all the software is loaded and configured. All the user has to do is design their unique virtual environment, power up the StarTracker Studio rack and start shooting.

“StarTracker Studio enables any organisation to create premium virtual studio content easily and simply, at an attractive price point,” said Michael Geissler, CEO of Mo-Sys. “We have made it simple by bringing the best equipment together, from respected vendors like Blackmagic Design/Ultimatte, Lenovo, Canon and Cartoni.

“That hardware works with the incredible virtual graphics power from the Unreal Engine, tied with perfect precision to real objects thanks to Mo-Sys tracking,” he continued. “It gives the user access to top end effects like hyper-realistic reflections, soft shadows to emulate real-world lighting, continual depth of field processing to emulate lens performance, and occlusion handling, so talent can walk around virtual objects. In other words, top-end, uncompromised virtual production, in a simple one-stop package.”

The standard package is supplied with three Blackmagic Ursa Mini 4.6K cameras with Canon 18-80mm zoom lenses, paired with Mo-Sys StarTracker tracking units. The kit also includes a camera jib, a rolling tripod and a camera rail set. An eight-channel ATEM production switcher provides the live output for broadcast or streaming, and three video recorders are included for separate programme, key and graphics recording. Smart switching means that only one Ultimatte keyer is required for the system, rather than the more conventional one keyer per camera.

The package also includes the Mo-Sys Beam In kit to bring remote guests into the virtual studio. Every element can be used in HD or 4K Ultra HD. Three radio microphones and an eight-channel audio mixer are also part of the solution. The system is scalable up to the largest size of virtual production. Rack kits are available to support either eight or 16 cameras. The complete system is pre-configured by Mo-Sys before shipping, and Mo-Sys will also provide all support, including training in technical and creative aspects where required.

“We have all had to find new and inventive ways to keep up production over the last few months,” said Geissler. “StarTracker Studio is ideal for the new normal, especially including our Beam In kit for remote contributions. This is a powerful production platform aimed at anyone who wants to create virtual studio and augmented reality content, without the time and investment setting it up themselves. Hang a green screen and, with StarTracker Studio, you are good to go.”

Additional information will be available through a webinar on 23 July 2020 at 10am BST and 6pm BST.

Published in Client News

London, UK, 19 June 2020: Mo-Sys Engineering, a global leader in real time camera tracking and remote systems, has announced a revolutionary approach to bringing the atmosphere back to live sport amid COVID-19 restrictions. Providing precision, zero-latency tracking for any camera (including ultra-long box lenses for sport), the Mo-Sys camera tracking kit interfaces directly to the Unreal Engine or any broadcast render engine, allowing production companies to add virtual crowds to stands.

“After so many weeks, sports fans are desperate for any action,” said Michael Geissler, CEO of Mo-Sys. “But the frustration will turn to disappointment if the atmosphere of the game falls flat because of empty stands. We have developed a camera tracking kit which any outside broadcast can implement quickly and simply, capable of filling the stands with a virtual, but enthusiastic, crowd.”

The Mo-Sys camera tracking encoders are quickly mounted onto broadcast standard Vinten Vector heads, with no impact on the camera’s perfect balance and no backlash when panning and tilting. Zoom data is collected either by gear encoders or by a serial data link to digital lenses. The combined tracking data is sent over Ethernet to the workstation hosting the augmented reality software.
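
As a rough sketch of what such an Ethernet link might carry, one fixed-size datagram per video frame is enough for the head and lens data. The packet layout, port and address below are invented for illustration and are not Mo-Sys’s actual wire protocol.

```python
import socket
import struct

# Hypothetical wire format: uint32 frame counter plus seven float32 fields
# (pan, tilt, roll, x, y, z, zoom), big-endian.
PACKET = struct.Struct("!I7f")

def send_tracking(sock, addr, frame, pan, tilt, roll, x, y, z, zoom_mm):
    """Push one frame's head-encoder and zoom data to the AR workstation."""
    sock.sendto(PACKET.pack(frame, pan, tilt, roll, x, y, z, zoom_mm), addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Example workstation address; in practice this is the render machine.
send_tracking(sock, ("192.168.1.50", 9000),
              frame=482, pan=12.5, tilt=-3.0, roll=0.0,
              x=0.0, y=1.8, z=0.0, zoom_mm=450.0)
```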

“We are known for the absolute precision and stability of our camera tracking – that’s why Hollywood relies on our technology,” Geissler added. “In this application, we deliver precise tracking, including compensation for lens distortion, even when a 100:1 lens is zoomed fully.”

Mo-Sys has worked with Epic Games to develop a tight interface to the Unreal Engine, including support for the latest version 4.25 software. The result is that highly photo-realistic augmented reality – such as crowds filling the stands – can be integrated into live production with no limitations and negligible latency. The kit includes the bolt-on encoding kit for Vinten heads and the lens calibration tools.

Users can see the technology in action in a Mo-Sys LiveLab webinar, which will also include contributions from Epic Games and Canon. The webinars are on 30 June, at 10.00 (register at https://bit.ly/2N1Ve44) and repeated at 18.00 (register at https://bit.ly/30GqFZP). Michael Geissler of Mo-Sys will also join a distinguished panel for the RTS Thames Valley Creative Centre’s look at production techniques for audience shows in a time of pandemic, on Thursday 25 June at 17.00 (register free at https://rts.org.uk/event/future-studio-audience).

Published in Client News