Displaying items by tag: MoSys


London, UK, 22 September 2020: Mo-Sys, a world leader in precision camera tracking solutions for virtual studios and augmented reality, will be showcasing its award-winning StarTracker Studio at BroadcastAsia 2020’s all-new virtual reality experience, running from 29 September to 1 October 2020.

Having recently won a Virtual Best of Show Award during IBC 2020, Mo-Sys StarTracker Studio, the world’s first pre-configured virtual production system, brings moving camera virtual production within reach of all market sectors.

StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking technology, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.
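For readers curious how randomly placed ceiling dots become a camera position, the underlying maths is the classic perspective-n-point (PnP) problem: once the 3D positions of the markers have been surveyed, each 2D sighting of them constrains the pose of the sensor observing them. The sketch below is purely illustrative, assuming a calibrated upward-facing sensor and made-up marker coordinates; it is not Mo-Sys code.

```python
# Illustrative only: recovering a camera pose from surveyed ceiling
# markers with OpenCV's perspective-n-point solver.
import numpy as np
import cv2

# Hypothetical inputs: 3D positions of ceiling "stars" (metres, surveyed
# once at install time) and their 2D detections in the sensor image.
stars_3d = np.array([[0.0, 0.0, 3.2], [0.8, 0.1, 3.2],
                     [0.2, 1.1, 3.2], [1.3, 0.9, 3.2]], dtype=np.float64)
stars_2d = np.array([[412.5, 300.1], [660.2, 318.7],
                     [455.9, 598.4], [701.3, 560.0]], dtype=np.float64)

# Intrinsics of the upward-facing tracking sensor (focal lengths, centre).
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 480.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume the sensor lens is already undistorted

ok, rvec, tvec = cv2.solvePnP(stars_3d, stars_2d, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)        # world-to-sensor rotation
    position = (-R.T @ tvec).ravel()  # sensor position in world space
    print("camera position (x, y, z):", position)
```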

StarTracker Studio combines StarTracker camera tracking with a powerful all-in-one virtual production system capable of working with green screen studios or LED volumes. The system uses Mo-Sys VP Pro software to connect camera tracking data from up to 16 cameras to the Unreal Engine graphics. StarTracker Studio uses a smart system design to reduce the typical hardware required for multi-camera virtual production, and the whole system comes pre-configured in a flight-cased rack.

“We’re looking forward to demonstrating the performance, flexibility and simplicity StarTracker Studio offers to companies who need to create virtual studio and augmented reality content,” said Michael Geissler, CEO of Mo-Sys. 

Mo-Sys will also be detailing how its VP Pro and StarTracker technologies operate with LED volumes for virtual productions that want to use on-set finishing techniques.

“LED-wall technology now offers a viable alternative to the traditional green screen / post-production workflow for visual effects (VFX) shooting. Specifically, LED walls enable a composited shot to be captured on-set, rather than just previewed on-set, thereby removing the need for downstream post-production,” Geissler continued. “LED walls won’t replace green screen, both will co-exist going forwards as each is suited to a different type of VFX shot. The benefit of StarTracker Studio is that it handles both workflows”.

To register for the BCA Virtual Event and visit Mo-Sys’s virtual booth, please visit: https://l.feathr.co/bca-exhibitor-landing-page--mo-sys-engineering

Published in Client News

London, UK, 27 August 2020: Mo-Sys, a world leader in precision camera tracking solutions for virtual studios and augmented reality, provided key technology to EVOKE Studios, enabling it to develop exciting extended reality environments for the performances at this year’s AIM Awards, from the Association for Independent Music, on 12th August 2020. Mo-Sys’ StarTracker provided the precision camera tracking data, enabling EVOKE to seamlessly blend live action with photo-realistic virtual elements through real-time compositing.

“The challenges with extended reality lie in the smoothness of tracked content, frame delays, and having a close to faultless set extension,” said Vincent Steenhoek, founder of EVOKE Studios. “Our experience with StarTracker is that it gives us ultra-reliable, highly accurate positional data and ultra low latency. Building on technologies like StarTracker enables awards shows like the AIM Awards to be presented in broadcast quality virtual environments.”

Critical for virtual studio and augmented reality production is tracking the position of each camera in three-dimensional space across all six degrees of movement (pan, tilt, roll, x, y, z), plus lens focal length and focus. StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking package, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.

EVOKE and its creative partners shot a number of guest performances for the awards show, recording as live with no post-production. The shoot took place at the new Deep Space studios at Creative Technology in Crawley, a studio which is already set up for StarTracker, including a camera jib for free movement.
Performances captured in extended reality included AJ Tracey and MoStack surrounded by larger-than-life burgers and fries, and Pioneer Award winner Little Simz who was made to appear underwater.

“This is a great example of what StarTracker delivers,” said Michael Geissler, CEO of Mo-Sys. “It is designed for live work, providing completely reliable positioning data into the graphics engines. It allowed the EVOKE team to build really complex extended reality and virtual environments in combination with LED walls and floor, then let the performers go with the music confident that they would capture all the action flawlessly.”

For more information on StarTracker, please visit the Mo-Sys website: www.mo-sys.com

Published in Client News

By Mo-Sys Staff | Mo-Sys Engineering | Published 13th March 2020

On Friday 6th March Mo-Sys were thrilled to be named Business of the Year at the Best of Royal Greenwich Business Awards 2020. Hosted by TV presenter and journalist Steph McGovern at the InterContinental Hotel, the awards ceremony recognises the achievements of businesses across the borough. Mo-Sys also came out victorious in the Made in Greenwich category, an award which celebrates the talent and creativity of local innovators who are bringing exciting products, goods and services to market.

For those that don’t know who we are or what we do, we are based in Morden Wharf on the Greenwich Peninsula and we manufacture sophisticated camera technology for television broadcasts and feature films.

If you have watched the BBC’s Match of the Day programme this season, or seen any of the recent General Election nights on BBC or Sky, then you will have experienced our technology in action. Using our specialist equipment, which provides precise camera tracking data, broadcasters can add informational graphics and immersive virtual sets to their productions, giving audiences a more engaging and satisfying viewing experience.

Since relocating to Greenwich in 2015, we’ve been on a mission to deliver the best technology that will transform the way films, TV series and broadcasts are made. Now, our innovations are making greenscreen shooting cheaper and simpler than ever before, in turn giving smaller companies the same opportunities as the big players that have long since dominated both the film and broadcast industries.

Michael Geissler, CEO and Founder of Mo-Sys:

“After being shortlisted in the 2019 Awards but unfortunately missing out in the Micro to Small Business Award category to the well-deserved BeGenio, we were delighted to be named winners in the Made in Greenwich category and to receive the Business of the Year award.”

“It’s very easy to get swept up in day-to-day business activity, but entering the business awards was a great opportunity to take a step back and reflect on our achievements over the last year. It’s even more fantastic to be recognised for the work we do and to come out victorious. Thank you to all of the judges and the council for recognising our innovation and all of the hard work we do.”

Mo-Sys’ credits include: Aquaman, Stranger Things, The Walking Dead, Life of Pi, The Shape of Water, House of Cards and Westworld. Broadcast customers include BBC, Sky, Fox, ESPN, CNN, Discovery Channel, The Weather Channel, Netflix and Sony.

 

FULL ARTICLE AVAILABLE HERE

Published in Articles

Contributions by Michael Geissler | TVBEurope | Published in the April 2020 edition

TVBEurope asks eight experts for their views on the benefits, challenges and future of remote production

WHAT DO YOU VIEW AS THE BENEFITS OF REMOTE PRODUCTION?

WOLFGANG HUBER, PR MANAGER, LAWO
The primary aim of live remote production is to move as much equipment and as many staff as possible from the remote location back to the studio facility. Thus, remote production offers the possibility of both dramatically reducing production costs at the higher end and improving quality at the lower end. This is achieved by redesigning the production workflow such that the majority of tasks take place in the studio rather than at the remote site. Ideally, the only task taking place at the remote site would be signal acquisition - the capture and conversion of camera and microphone signals into a form transportable over wide-area networks. In this model, all signals are transported back to a studio facility, where the programme production takes place. Contrast this with conventional remote broadcasts, where the production process happens on-site, and only the finished pictures and audio are backhauled to a broadcast facility for distribution.

MICHAEL GEISSLER, CEO, MO-SYS ENGINEERING
When you take travel and rest time into consideration, the top operators might only be doing the job at which they excel for maybe 40 days a year. With remote production, those people could be working for maybe 200 days a year. That is not just a massive boost in productivity, it represents a huge reduction in carbon footprint, by eliminating long-distance travel and specialist clothing. Production companies are already talking about saving half a million dollars a year.

PETER MAAG, CCO AND EVP STRATEGIC PARTNERSHIPS, HAIVISION
The beauty of employing a remote production model is that it allows broadcasters to do more, with less. Remote production offers broadcasters the opportunity to produce more high-quality content to meet the rapidly growing demand while simultaneously creating efficiencies. And the efficiencies gained by using a remote production model are dramatic. By eliminating the costly and complex logistics associated with deploying OB trucks full of expensive equipment and production teams to the field, broadcasters can instead focus on optimising the use of their resources to produce more high-quality content. For example, a replay operator on-site at a sporting event might be only utilised for three hours during a four-day period. If the replay operator is at home, however, they could be running replays around the world, all the time.

DAVE LETSON, VP OF SALES, CALREC
Like the rest of the industry, we see multiple benefits to remote production, though it’s important to highlight that while the principles are the same, not all remote productions are created equal in terms of scale. Firstly, using OB trucks is no longer necessary at every single live event. This is beneficial in multiple ways: fewer staff need to travel, which improves employee welfare; far less equipment is required on-site; it’s environmentally more sustainable; and there is far less equipment downtime at the central production location. There’s also the fact that multiple sports events – football matches being a prime example – can be covered in a day or over a weekend because the centralised production technologies aren’t committed to a single event. In the long run all this adds up to significant cost savings.

ON WHICH AREAS OF CONTENT DO YOU SEE IT HAVING THE BIGGEST IMPACT?

DR REMO ZIEGLER, HEAD OF PRODUCT MANAGEMENT, VIZRT SPORTS
Remote production will have its biggest impact on any live production that is typically faced with big travel and logistic costs. Sports and news production are prime examples. With regards to sports, various aspects will benefit from remote production. The bigger leagues and federations with dedicated connections to stadiums enable the transmission of high-quality, low-latency signals to a centralised production location. Only small production crews, comprised mostly of camera and audio technicians, are required on-site, while the rest of the production equipment and operations staff remain back at the production centre. The impact of remote production is amplified when one considers the opportunity of distribution through OTT. Many more signals, or specialised cuts, can be produced which are tailored to different customer segments. Producing all these outputs is heavily facilitated through remote production.

WH: The impact applies to all productions covering events that happen outside the studio, such as arenas or stadiums, where many signals are required to cover the event - particularly sports with long distances between camera and microphone positions, like football, rugby, biathlon and motorsport, but also large-scale open-air concerts. It also opens up the possibility of extending the depth and range of live event coverage into areas previously inaccessible through cost: for example, more specialised sports, lower league and regional coverage, even college and university level. The industry sees an unquenchable public thirst for sports and other live events coverage, and remote production provides the means to broadcast more of it.

MG: Anything that is somewhere for a short time will feel the impact – anything that today is covered by an OB truck or flyaway kit. Obviously, sport heads the list, but it also includes music and entertainment. It will also have a huge impact on corporate events. Product launches and business presentations will be raised in quality, not least through the ability to afford more cameras. Production-as-a-Service will have an impact on linked events: fashion weeks, for example, could see the same skilled production team covering every major event.

PM: Remote production has the most significant impact on events – and it’s not just limited to live events. Whether it’s for a sporting event, esports, a press conference, or a political panel, a remote production model reduces the number of people and resources required on-site, allowing production costs to be lowered. Even for events that aren’t broadcast live but where speed is critical, remote production can dramatically accelerate the production process. At-home/REMI workflows are particularly attractive options when it comes to tier two and tier three events such as college football, for example, where deploying resources on-site is simply not cost-effective. In this instance, remote production enables broadcasters to expand their coverage to meet demand while keeping production costs in check.

HOW SOON WILL IT BECOME THE INDUSTRY NORM?

RICHARD MCCLURG, VICE PRESIDENT MARKETING, DEJERO
We’re already there. Dejero has enabled remote production workflows for over a decade. Dejero enabled the first live coverage of the Vancouver 2010 Olympic Games torch relay, following the torch as it travelled 45,000km across Canada. In 2013, another first enabled Sky Sports to broadcast live from all 92 English football clubs in a single day. Using revolutionary wireless technology at the time, Dejero blended multiple cellular connections to provide enough bandwidth to deliver high-quality live broadcast content, at significantly less expense and complexity than traditional video transport technology.

NORBERT PAQUET, HEAD OF LIVE PRODUCTION SOLUTIONS, SONY PROFESSIONAL SOLUTIONS EUROPE
Consumers are demanding more content, available whenever and wherever they choose, without any drop in quality. With this escalating pressure on broadcasters, remote production will naturally become the norm and act as a silver bullet to help keep up with growing industry demands. We’ve already seen overall connectivity (mobile or fibre networks) become a major game-changer for our industry, particularly when it comes to live production. And, with remote production set-ups, resource sharing and a more collaborative, faster turnaround time, it’s becoming even more popular. At Sony, we’ve been at the forefront of this revolution and have, to date, worked with many customers around the world to develop remote production set-ups in news, magazine and live production.

LARISSA GOERNER, DIRECTOR OF ADVANCED LIVE SOLUTIONS, GRASS VALLEY
The biggest hurdle to widespread adoption of at-home models is the challenge of latency. As we look to the future, though, broadcasters and production companies will continue to drive toward more captivating experiences that draw in viewers, using higher resolutions and more camera angles that will put greater stress on the network and available bandwidth. More efficient encoding solutions – JPEG2000, JPEG-XS and MPEG – offer an attractive alternative. These options deliver ultra-low delay, comparable to transporting the signal over fibre, and come at a significantly reduced cost while ensuring there is no difference in the viewers’ experience.

RZ: When we look at some of our leading customers, remote production is already the norm, since they have adopted Vizrt solutions that facilitate a remote production workflow. However, our larger customers are not the only ones benefiting from remote production. Many smaller productions also take advantage of the same concepts to reduce their cost and the size of their footprint. The rise of IP and the availability of software-defined production tools, which in turn can be virtualised in the Cloud, will make remote production the norm for the majority of media productions.

HOW BIG A PART DO YOU THINK 5G WILL PLAY IN REMOTE PRODUCTION?

NP: 5G will play a key role in powering remote production for many organisations. Firstly, the low-latency transmission offered by 5G is crucial for productions such as sporting events or news broadcasts where delays are unforgivable. Secondly, the higher bandwidth 5G offers not only helps deliver less compressed content to the mobile viewer but also unlocks additional applications for remote production. Finally, given that 5G enables Cloud-based production models, it helps reduce the deployment of physical OB resources, which makes productions much more sustainable.

LG: As an emerging technology, 5G will play a significant role in the broadcast industry as a reliable way to deliver content to consumers. In terms of remote production, 5G can be utilised for its greater capacity benefits. However, bandwidth is not unlimited in 5G, and with demand for live UHD content still rising, baseband cannot be transported over 5G and has to be encoded in order to handle this format. This adds another step in the creation process and will slow down adoption for high-end live sports production. Currently, for tier two and three productions, 5G is a means to an end, providing easier contribution from the remote location, and is a good candidate to enable more creativity in content production.

PM: There’s nothing mystical about 5G: it’s a faster, wireless, mobile network. As 5G gathers momentum and begins to easily handle multiple video streams from a venue it will definitely act as a catalyst to accelerate the adoption of remote production. However, very fast, reliable, and affordable network pipes are already available from any venue today and it’s important to remember that a network is just a network and all networks are getting faster; both wired and wireless. What’s more impactful than the network are technologies like the open source Secure Reliable Transport protocol (SRT), pioneered by Haivision, which enables video transport over any network.

DL: From a Calrec perspective, little will change with 5G. RP1 was deliberately designed to be transport agnostic. From our perspective, it does not matter whether we are piggy-backing audio on a camera feed via a JPEG2000 path or via a closed AES67 wide-area network. For our clients though, 5G offers vast potential. It’s not implausible to consider a camera at a field of play sending pictures directly back to base over 5G. Companies have already achieved this with multiple 4G links. 5G technology could offer a true paradigm shift in areas ranging from traditional SNG to Premier League football. However, where there are local commentators or reporters, some local IFB mixing will still be needed and RP1 becomes even more relevant.

HOW DO YOU INCORPORATE SUSTAINABLE PRACTICES IN REMOTE PRODUCTION?

RM: Remote production is no doubt reducing the industry’s carbon footprint. The amount of kit and the number of crews required to travel to live events are greatly reduced compared to traditional production workflows. Fewer OB and large SNG trucks are on the road. Centralising production staff at the broadcast facility means that fewer people have to travel to the field, cutting air miles and transport. Initiatives such as ‘Find a Provider’, featured in Dejero’s Cloud-based management system, enable broadcasters to find freelancers across the globe, making it easier to find local resources to acquire content. Dejero’s MultiPoint Cloud service enables broadcasters to share field resources and contribute the pool feed simultaneously to many geographically dispersed broadcasters.

LG: In general, the decrease in travel brought about by remote production already means a significantly lower impact on the environment. However, we believe more can be done. Enabling workflow consistency for a variety of content productions is a goal for us. Grass Valley cameras, switchers and replay products all enable the highest flexibility for any workflow, therefore allowing the creative talent to be where they are most needed to add better value. Recurring tasks can easily be centralised and produced with fewer operators, ultimately allowing more content to be created at a consistently high quality. Our DirectIP solution, for example, enables almost all production and technical staff to work from a centralised location. We also give customers the flexibility to locate creative talent either at the venue or the production hub. We continuously strive to innovate across the entire portfolio, providing the latest software and hardware technology to enable sustainable production in the market.

RZ: Remote production reduces the number of required people and equipment on-site. That means fewer people travelling, and fewer pieces of production equipment shipping, by plane, train, and automobile. Furthermore, a remote dedicated production centre, designed around software-defined production practices, reduces hardware usage, power consumption, and the need for active cooling versus inefficient mobile units.

MG: The headline benefit is that fewer people need to travel to the event, meaning a significant reduction in carbon footprint. As remote production becomes ever more sophisticated – with remote camera operation, for example – so the reductions become greater. This does depend upon complex technologies becoming mainstream and commoditised, to simplify the installation and the power consumption of rig and connectivity. The recent coronavirus outbreak is seeing a very large reduction in business travel. The ability to control cameras from a central hub anywhere in the world will be extremely attractive to productions, not least because of the reduced environmental impact.

WHAT WILL BE THE CHALLENGES FOR REMOTE PRODUCTION AS IT GROWS?

WH: Growing demand for content and tighter schedules of events to be covered are challenging on the administrative side, as equipment needs to be reliably available at any time for a new production as soon as it is no longer in use for the previous one. Access to and reliability of the fibre infrastructure must be guaranteed. This is where system monitoring and real-time telemetry for broadcast networks, such as Lawo’s SMART, come into play, allowing constant control and monitoring of the complete IP network installation from capture to playout. And the more concurrent productions are happening, the more essential it is to have such a monitoring system in place to ensure signal, sync and packet integrity, and thus flawless operation.

MG: The real issue will be the management of change, particularly for people. It is a different pitch for operators: taking them away from immersion in the action and giving them comfortable, familiar working environments in exchange for greater productivity. The people issues, and the shifts in budgeting, are cultural changes, which always see a natural resistance.

DL: Connectivity is a key issue. Also getting staff to understand and adapt to it, though our customers tell us that once it’s been explained and tried, this stops being an issue! The other thing, of course, is reliability. For Calrec, this hasn’t proved an issue either. Lastly, for quick turnaround projects, or where there are multiple events in a row or across a season, technical and workflow practices have to be set in stone. But we don’t see any reason that remote production use won’t grow significantly from here.

FULL ARTICLE AVAILABLE HERE

Published in Articles

London, UK, 16 July 2020: Mo-Sys, a world leader in precision camera tracking solutions for virtual studios and augmented reality, has brought virtual studio production within reach of everyone with StarTracker Studio, the world’s first pre-assembled production package. The system is scalable to any size production, and can support 4K Ultra HD.

Critical for virtual studio and augmented reality production is to track the position of each camera in three-dimensional space across all six axes of movement. StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking package, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.

To make virtual studio production accessible by all, StarTracker Studio bundles the tracking with cameras and mounts and a high-performance virtual graphics system based on the latest version of the Unreal Engine and the Ultimatte keyer. Mo-Sys’s unique plug-in interfaces directly between the camera tracking and the Unreal Engine, for extreme precision with virtually no latency.
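One practical point behind any tracking-to-renderer integration (a general observation, not a description of the Mo-Sys plug-in’s internals) is timing: tracking samples arrive faster than video frames, and the video picks up a few frames of processing delay, so poses must be buffered and re-timed so that each rendered background matches the frame it composites with. A minimal sketch of that idea, with hypothetical names:

```python
# Illustrative only: delay time-stamped camera poses so each one is
# released alongside the video frame captured at the same moment.
from collections import deque

class TrackingDelayLine:
    """Hold (timestamp, pose) samples; release the pose matching a frame."""

    def __init__(self, video_delay_frames, fps):
        self.delay_s = video_delay_frames / fps
        self.samples = deque()  # oldest first

    def push(self, timestamp, pose):
        self.samples.append((timestamp, pose))

    def pose_for_frame(self, frame_timestamp):
        """Return the newest pose captured at or before the frame's
        original capture time (frame timestamp minus pipeline delay)."""
        target = frame_timestamp - self.delay_s
        best = None
        while self.samples and self.samples[0][0] <= target:
            best = self.samples.popleft()[1]
        return best
```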

All the hardware – including Lenovo PCs with Titan RTX GPUs – is mounted in a rolling rack cabinet and pre-wired, and all the software is loaded and configured. All the user has to do is design their unique virtual environment, power up the StarTracker Studio rack and start shooting.

“StarTracker Studio enables any organisation to create premium virtual studio content easily and simply, at an attractive price point,” said Michael Geissler, CEO of Mo-Sys. “We have made it simple by bringing the best equipment together, from respected vendors like Blackmagic Design/Ultimatte, Lenovo, Canon and Cartoni.

“That hardware works with the incredible virtual graphics power from the Unreal Engine, tied with perfect precision to real objects thanks to Mo-Sys tracking,” he continued. “It gives the user access to top end effects like hyper-realistic reflections, soft shadows to emulate real-world lighting, continual depth of field processing to emulate lens performance, and occlusion handling, so talent can walk around virtual objects. In other words, top-end, uncompromised virtual production, in a simple one-stop package.”

The standard package is supplied with three Blackmagic URSA Mini 4.6K cameras with Canon 18-80mm zoom lenses, paired with Mo-Sys StarTracker tracking units. The kit also includes a camera jib, a rolling tripod and a camera rail set. An eight-channel ATEM production switcher provides the live output for broadcast or streaming, and three video recorders are included for separate programme, key and graphics recording. Smart switching means that only one Ultimatte keyer is required for the system, rather than the more conventional one keyer per camera.
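That smart switching can be pictured as routing the programme camera’s video and its tracking stream to one shared keying/render chain on every cut, so a single keyer serves whichever camera is on air. The toy sketch below illustrates the idea; the class and feed names are hypothetical, not the StarTracker Studio control interface.

```python
# Illustrative only: one keyer shared by all cameras, switched together
# with the matching tracking stream on every cut.
class SingleKeyerSwitcher:
    def __init__(self, cameras):
        self.cameras = cameras  # cam_id -> {"video": ..., "tracking": ...}
        self.live_id = None

    def cut_to(self, cam_id):
        feeds = self.cameras[cam_id]
        # In a real rack these would be router and keyer control commands;
        # here we simply record the selection.
        print(f"keyer video input  <- camera {cam_id} ({feeds['video']})")
        print(f"render engine pose <- camera {cam_id} ({feeds['tracking']})")
        self.live_id = cam_id

rig = SingleKeyerSwitcher({
    1: {"video": "SDI-1", "tracking": "udp:9001"},
    2: {"video": "SDI-2", "tracking": "udp:9002"},
    3: {"video": "SDI-3", "tracking": "udp:9003"},
})
rig.cut_to(2)  # one keyer follows whichever camera is cut to air
```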

The package also includes the Mo-Sys Beam In kit to bring remote guests into the virtual studio. Every element can be used in HD or 4K Ultra HD. Three radio microphones and an eight-channel audio mixer are also part of the solution. The system is scalable up to the largest virtual productions: rack kits are available to support either eight or 16 cameras. The complete system is pre-configured by Mo-Sys before shipping, and Mo-Sys will also provide all support, including training in technical and creative aspects where required.

“We have all had to find new and inventive ways to keep up production over the last few months,” said Geissler. “StarTracker Studio is ideal for the new normal, especially including our Beam In kit for remote contributions. This is a powerful production platform aimed at anyone who wants to create virtual studio and augmented reality content, without the time and investment setting it up themselves. Hang a green screen and, with StarTracker Studio, you are good to go.”

Additional information will be available through a webinar on 23 July 2020 at 10am BST and 6pm BST.

Published in Client News

London, UK, 19 June 2020: Mo-Sys Engineering, a global leader in real time camera tracking and remote systems, has announced a revolutionary approach to bringing the atmosphere back to live sport amid Covid-19 restrictions. Providing precision, zero-latency tracking for any camera (including ultra-long box lenses for sport), the Mo-Sys camera tracking kit interfaces directly to the Unreal Engine or any broadcast render engine, allowing production companies to add virtual crowds to stands.

“After so many weeks, sports fans are desperate for any action,” said Michael Geissler, CEO of Mo-Sys. “But the frustration will turn to disappointment if the atmosphere of the game falls flat because of empty stands. We have developed a camera tracking kit which any outside broadcast can implement quickly and simply, capable of filling the stands with a virtual, but enthusiastic, crowd.”

The Mo-Sys camera tracking encoders are quickly mounted onto broadcast standard Vinten Vector heads, with no impact on the camera’s perfect balance and no backlash when panning and tilting. Zoom data is collected either by gear encoders or by a serial data link to digital lenses. The combined tracking data is sent over ethernet to the workstation hosting the augmented reality software.
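As a rough picture of what sending tracking data over Ethernet involves, the snippet below packs head and lens values into a small UDP datagram. The byte layout is invented for this example; real installations use established tracking protocols (FreeD is a common industry example), and this is not the Mo-Sys wire format.

```python
# Illustrative only: a made-up datagram carrying pan/tilt/roll plus
# normalised zoom and focus to the AR workstation.
import socket
import struct
import time

# camera id (uint8), timestamp (double), pan, tilt, roll, zoom, focus (floats)
PACKET_FMT = "<Bdfffff"

def send_sample(sock, addr, cam_id, pan, tilt, roll, zoom, focus):
    packet = struct.pack(PACKET_FMT, cam_id, time.time(),
                         pan, tilt, roll, zoom, focus)
    sock.sendto(packet, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Hypothetical address of the workstation hosting the render engine.
send_sample(sock, ("192.168.1.50", 9000),
            cam_id=1, pan=12.5, tilt=-3.2, roll=0.0, zoom=0.74, focus=0.31)
```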

“We are known for the absolute precision and stability of our camera tracking – that’s why Hollywood relies on our technology,” Geissler added. “In this application, we deliver precise tracking, including compensation for lens distortion, even when a 100:1 lens is zoomed fully.”
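Lens distortion compensation of this kind generally means characterising the lens at several zoom positions and interpolating the distortion coefficients at render time, so virtual graphics bend exactly as the glass does. The sketch below uses the common radial (Brown-Conrady style) model with a made-up coefficient table; production lens files are measured per lens.

```python
# Illustrative only: zoom-dependent radial distortion on a normalised
# image point. The coefficient table is hypothetical.
ZOOM_LUT = {0.0: (-0.28, 0.09), 0.5: (-0.12, 0.03), 1.0: (-0.02, 0.005)}

def distortion_coeffs(zoom):
    """Linearly interpolate (k1, k2) between calibrated zoom positions."""
    zs = sorted(ZOOM_LUT)
    z = min(max(zoom, zs[0]), zs[-1])
    for z0, z1 in zip(zs, zs[1:]):
        if z0 <= z <= z1:
            t = (z - z0) / (z1 - z0)
            a, b = ZOOM_LUT[z0], ZOOM_LUT[z1]
            return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def distort(x, y, zoom):
    """Map an ideal pinhole point to its distorted position."""
    k1, k2 = distortion_coeffs(zoom)
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

print(distort(0.3, 0.2, zoom=0.25))  # wide-ish zoom: barrel pull-in
```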

Mo-Sys has worked with Epic Games to develop a tight interface to the Unreal Engine, including support for the latest version 4.25 software. The result is that highly photo-realistic augmented reality – such as crowds filling the stands – can be integrated into live production with no limitations and negligible latency. The kit includes the bolt-on encoding kit for Vinten heads and the lens calibration tools.

Users can see the technology in action in a Mo-Sys LiveLab webinar, which will also include contributions from Epic Games and Canon. The webinars are on 30 June, at 10.00 (register at https://bit.ly/2N1Ve44) and repeated at 18.00 (register at https://bit.ly/30GqFZP). Michael Geissler of Mo-Sys will also join a distinguished panel for the RTS Thames Valley Creative Centre’s look at production techniques for audience shows in a time of pandemic, on Thursday 25 June at 17.00 (register free at https://rts.org.uk/event/future-studio-audience).

Published in Client News

London, UK, 2 June 2020: Mo-Sys Engineering, a global leader in real time camera tracking and remote systems, today announced the launch of U50. Designed and manufactured during the lockdown period, the U50 is a heavy-duty remote head that enables camera operators to safely return to work and control the biggest box lenses and heaviest broadcast camera set-ups remotely.

Offering the ultimate in camera remote control and remote production, U50 combats the current challenges and unique requirements of the Covid-19 affected world. Making it possible to operate large box lenses and cameras remotely, U50 achieves the same precision and agility as when camera operators are controlling manually on-site. With no other remote head capable of smoothly and quickly operating box lenses, and with on-site production crews dramatically reduced, this innovation offers a new solution to current challenges.

Danny Zemanek, freelance camera operator, explained: “Previously, remote production technology was an additional cost for broadcasters, but now it is essential to ensure camera operators can work safely. The U50 remote head from Mo-Sys represents a new way for operators to control big box lenses with the same precision and agility that we have come to expect.”

Built on the same high-precision, backlash-free drives as the successful Mo-Sys L40 remote head, the U50 enables fast acceleration and deceleration of heavy broadcast camera packages of up to 50kg with super-telephoto lenses. The unique combination of a U-shaped space-frame design and zero-backlash drives gives U50 enough strength to achieve fast panning shots with no bouncing, even when zoomed in. Whether you are following the puck in an ice hockey game or trailing F1 cars on a racetrack, U50 allows you to precisely follow the action.

The U50 is specifically designed for remote operation, eliminating the need for operators to manually pan and tilt the camera on-site. Instead, they can move the camera remotely from a control room with an intuitive pan bar that translates operator movements 1:1 to the remote head. And with the Mo-Sys TimeCam option soon to be made available, U50 has the potential to be operated from anywhere in the world with virtually no delay, using Mo-Sys’ delay-compensated global remote-control technique.

Michael Geissler, CEO of Mo-Sys Engineering Ltd, said: “Our U50 is ideal for sporting events and long-range filming environments. At the core of the U50 are our highly robust pan and tilt motors with zero backlash, providing lag-free operation. Using the power and sturdiness of the Mo-Sys L40 pan and tilt motors, but now with a double space-frame design, the U50 is Mo-Sys’ strongest head to date.”

Extremely strong, backlash-free drive units give operators instant, delay-free control. The precise gear drives enable flawless operation even with the biggest zoom lenses, unlike worm gear drives, which can cause judder. The hole through the centre of the gearboxes for cable pass-through removes the need for slip-rings and makes the head future-proof for high-bandwidth digital video. This also allows for tangle-free direct cabling.

Geissler added, “Just like the rest of our broadcast robotics and film remote heads, the U50 can be operated with a variety of input devices such as hand-wheels, pan-bar or joystick. It can be connected through a bus cable, ethernet or fibre for long distances. The button-console interface provides controls for pan/tilt velocity adjustment, input smoothing and direction, user defined position limits with feathering and axes zeroing.

“Whether you’re a freelance cameraman, work in production, broadcast, OBs, rental, virtual events, corporate or sports,” Geissler concluded, “Our U50 simplifies the operation of the head in remote-mode making it ideal for all those operating and involved – be they technical or not.”

Published in Client News

London, UK, 4 May 2020: Mo-Sys Engineering, a global leader in real time camera tracking and remote systems, is proud to announce a collaboration with Panasonic. Together, the AW-UE150 and the Mo-Sys StarTracker module will empower AR and virtual studio graphics, creating engaging content with natural depth and allowing the perspective of the virtual background to change with camera movement.

Originally planned as a launch at NAB, the collaboration will instead be showcased through webinars co-hosted by Panasonic and Mo-Sys, featuring the new Panasonic AW-UE150 - a 4K, wide-angle-lens PTZ head with absolute camera tracking for AR and virtual studio applications. With Mo-Sys StarTracker, the venue or event is no longer restricted to a fixed camera position and can be more creative, using the PTZ head on a jib, crane or dolly for unlimited camera motion.

There are two sessions on offer:

10:00 (BST) - Thursday 7th May 2020 - Sign-up to webinar

18:00 (BST) - Thursday 7th May 2020 - Sign-up to webinar

Mo-Sys CEO Michael Geissler said, “Although disappointed that we could not launch at NAB, these webinars are a fantastic opportunity to demonstrate this innovative collaboration, helping a wide range of customers generate immersive and engaging content for AR and virtual studio applications. We are proud of our partnership with Panasonic and excited to showcase what is possible.”

 

Published in Client News

NAB 2020, 19-22 April, Las Vegas, Booths C5047 and N5333: Mo-Sys Engineering, a global leader in real time camera tracking and camera remote systems, will demonstrate how to remotely operate cameras on the other side of the world, without perceived delay, with the launch of TimeCam. The solution will be shown onsite with a link between Mo-Sys’s two points of presence at NAB, on booths C5047 and N5333 (Future Park). The two booths also allow Mo-Sys to reveal the latest in virtual studio and augmented reality: Unreal Unleashed.

TimeCam represents a triple benefit to production companies. First, there is the saving in cost and environmental impact in sending camera operators to site. Second, it means that the most in-demand operators can be much more productive, providing excellent coverage at a live event each day rather than losing time through travel. Third, it means that you can add cameras to your coverage without adding headcount: for instance, a downhill ski race might have eight cameras along the course, with one operator controlling cameras 1, 3, 5 and 7 and a second controlling 2, 4, 6 and 8.

“’Traditional’ remote production puts the control back at base, but still needs camera operators to travel to the location,” explained Mo-Sys CEO Michael Geissler. “By compensating for latency in transmission and compression/decoding, TimeCam means that operators too can stay at base and be much more productive by operating on several events, when normally they could only be on one.”
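Mo-Sys has not published TimeCam’s internals, but the general shape of latency compensation can be sketched: stamp each operator command with the capture time of the frame the operator was actually watching, measure how stale the command is when it reaches the head, and lead or extrapolate the motion accordingly. A toy illustration under those assumptions, with all names hypothetical:

```python
# Illustrative only: estimate how much to lead a remote pan/tilt command
# given the age of the frame the operator reacted to.
import time
from dataclasses import dataclass

@dataclass
class HeadCommand:
    viewed_capture_time: float  # when the frame the operator saw was shot
    pan_velocity: float         # deg/s requested by the pan bar
    tilt_velocity: float        # deg/s

def lead_angles(cmd, now):
    """Extrapolate the requested move over the measured latency window."""
    latency = now - cmd.viewed_capture_time  # transport + codec delay
    return cmd.pan_velocity * latency, cmd.tilt_velocity * latency

cmd = HeadCommand(viewed_capture_time=time.time() - 0.120,  # 120 ms old
                  pan_velocity=10.0, tilt_velocity=0.0)
print(lead_angles(cmd, time.time()))  # ~ (1.2, 0.0) degrees of lead
```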

As well as unveiling TimeCam, Mo-Sys will also demonstrate its flagship virtual studio technology StarTracker, which uses a constellation of dots on the studio ceiling as a camera tracking system. This is now extensively used by many prestige broadcasters to provide the tracking for augmented reality studios and is increasingly being built-in to studio cameras.

Whereas most virtual studios use a proprietary graphics system which in turn wraps the Unreal gaming engine from Epic Games, StarTracker Studio (in a 19” rack) now features the Mo-Sys VP plug-in, which we like to call Unreal Unleashed: a direct interface between the camera tracking and the UE4 render engine. Through the plug-in, control is direct, and no other software layer is wrapped around the Unreal Engine, allowing full access to the latest UE4 features.

At NAB, the complete system will be demonstrated as a turnkey package including the computer and video hardware. In a small, wheeled-rack cabinet there is sufficient power to render for 16 concurrent 4K cameras in real time, using just two render engines and one Ultimatte keyer.

“Both TimeCam and StarTracker with Unreal Unleashed are transformative technologies, capable of bringing new creativity and productivity to the broadcast and movie worlds, allowing sophisticated productions around the globe,” Geissler said.

Published in Client News