Intinor, Sweden’s leading developer of products and solutions for high-quality video over internet, has added a user-configurable delay to its Direkt encoders for streaming video distribution. The new functionality is in response to requests from esports companies, but the ability to add different delays to individual destinations from the same software encoder is valuable in many applications.

Using the simple web interface, users can set delays for each destination, up to a maximum determined by the amount of memory in the encoder, but typically some hundreds of seconds. The original request, from esports producers, was to be able to serve premium customers first, with others receiving a delayed signal.
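
As a rough illustration of the general concept only (not Intinor's actual implementation), a per-destination delay can be modelled as a timestamped packet queue for each output, with the usable delay bounded by the buffer memory available; the Python sketch below uses invented names throughout.

    import collections
    import time

    class DelayedOutput:
        """Hypothetical sketch: buffer encoded packets and release them only
        once they are older than the delay configured for this destination."""

        def __init__(self, delay_seconds, max_buffer_bytes):
            self.delay = delay_seconds          # per-destination delay
            self.max_bytes = max_buffer_bytes   # ceiling set by encoder memory
            self.buffered = 0
            self.queue = collections.deque()    # (arrival_time, packet) pairs

        def push(self, packet):
            if self.buffered + len(packet) > self.max_bytes:
                raise MemoryError("requested delay exceeds available buffer memory")
            self.queue.append((time.monotonic(), packet))
            self.buffered += len(packet)

        def pop_ready(self):
            """Return any packets whose configured delay has now elapsed."""
            now = time.monotonic()
            ready = []
            while self.queue and now - self.queue[0][0] >= self.delay:
                _, packet = self.queue.popleft()
                self.buffered -= len(packet)
                ready.append(packet)
            return ready

    # Example: a premium destination receives the feed immediately,
    # while a second destination is held back by 30 seconds.
    outputs = {"premium": DelayedOutput(0, 512 * 2**20),
               "delayed": DelayedOutput(30, 512 * 2**20)}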

The ability to delay each output individually also means that signals can be synchronised with remote commentary applications. This may also be important for betting applications, ensuring every subscriber is treated fairly.

“Intinor encoding and distribution has become a market leader in streaming, not least in the esports market,” said Roland Axelsson, Intinor CEO. “In talking to our customers, it became clear that the ability to dial in delay was something they needed, so we are pleased to add the functionality. We look forward to seeing what other applications our users find for it.”

The new software, which adapts the web user interface as well as adding the functionality, can be downloaded now.

Find out more by watching our tutorial on YouTube: https://youtu.be/S6kaxgI0Eb4

Hitomi Broadcast, manufacturer of MatchBox, the industry’s premier audio video alignment toolbox, has announced that NEP Connect has purchased a 4K MatchBox system - Generator and Analyser complete with a Glass license - for use in NEP Connect’s MediaCity Network Operations Centre in Salford, UK.

NEP Connect (formerly known as SIS LIVE) is a leading provider of global critical connectivity services, delivering thousands of hours of news, sports and entertainment content to millions of viewers worldwide via comprehensive satellite coverage and Anylive® fibre infrastructure.

Phil Goulden, Head of Network Operations for NEP Connect, explained, “MatchBox has been adopted by many of our partners and customers, which clearly demonstrates the strength of the product. In order to align with their test equipment and ensure that we continue to add value during technical line-ups, this purchase was essential to expand our suite of test and analysis equipment.”

MatchBox is a state-of-the-art toolkit that streamlines live broadcast synchronisation and is relied on by broadcasters worldwide to ensure the timing of live sports, news and events. It offers an intuitive identing feature, rapid measurement of lip sync, audio coherence, audio levels, phase inversions and video timing. The MatchBox Analyser tool sits in the OB truck or MCR and looks and listens for the specific Hitomi video and audio signatures produced by the hardware MatchBox Generator (which produces 4K test patterns) and/or the Glass app.
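
Hitomi's signature detection is proprietary, but the underlying idea of measuring an offset between two signals can be illustrated with a simple cross-correlation; the NumPy sketch below uses synthetic data and is a generic illustration of the principle, not the MatchBox algorithm.

    import numpy as np

    def estimate_offset_ms(reference, received, sample_rate):
        """Illustrative only: estimate how far 'received' lags 'reference'
        by locating the peak of their cross-correlation (mono float arrays)."""
        correlation = np.correlate(received, reference, mode="full")
        lag_samples = int(np.argmax(correlation)) - (len(reference) - 1)
        return 1000.0 * lag_samples / sample_rate

    # Synthetic example: a click delayed by 120 ms is measured as ~120 ms.
    rate = 8000
    click = np.zeros(rate)
    click[100] = 1.0
    delayed = np.roll(click, int(0.120 * rate))
    print(round(estimate_offset_ms(click, delayed, rate)))   # -> 120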

A new essential for high-profile remote productions, the Glass app allows precision measurement of lip-sync and video to video timing alignment at the point of capture. The user holds an iPhone or iPad running the free iOS app in front of up to four cameras and feeds the signal back into the Analyser.

Hitomi Broadcast Director Russell Johnson said, “Eyes and ears can be highly subjective so Hitomi’s answer is to take the guesswork out of the equation, replacing human estimates with a fast and accurate, electronic testing system. MatchBox enables a reliable and consistent measurement of any signal delays and provides reassurance when everything is perfectly aligned.”

Goulden concluded, “NEP Connect continue to invest in the latest technology, not only for event contribution and distribution encoding, but also test and analysis. The Hitomi MatchBox fits into this strategy and enables us to deliver the highest possible levels of service to our biggest broadcast customers.”

Mo-Sys Engineering (www.mo-sys.com), an industry leader in virtual production and image robotics, is proud to be a gold sponsor of the inaugural 2021 virtual HPA Tech Retreat (15 – 24 March) and is looking forward to being part of this year's HPA sessions, hosting a thought leadership presentation on near-time virtual production workflow.

“The HPA Tech Retreat is widely recognized as a prestigious occasion and a great opportunity for the most innovative professionals in the industry to come together and have open and collaborative conversations,” said Michael Geissler, CEO of Mo-Sys. “We're very excited to be taking such a prominent role in this event, especially at a time of rapid changes and turmoil when we’re seeing an increase in creative solutions.” 

Today virtual production (VP) workflows are clearly split into two camps: real-time (on-set finishing, final pixel, in-camera VFX) and non-real-time (on-set previz followed by post-production compositing). Real-time workflows force a graphics quality threshold in order to maintain real-time playback frame rates, which works well for most scenarios, but not so well when fine detail is involved. Non-real-time workflows provide an increase in graphics quality, but at the expense of significant increases in time and cost. Mo-Sys will be presenting a new dual-pipeline virtual production workflow, one that combines a real-time on-set VP workflow with a near-time VP workflow running in parallel, and will be explaining how it delivers higher quality graphics without the delay or cost of a traditional post-production workflow.

Additionally, Mo-Sys will be leading breakfast and lunch roundtable sessions on ways to automate virtual production to increase graphics quality and on how to get enhanced resolution on LED walls without heavy hardware investment, and is joining with AWS to discuss virtual production in the cloud.

Full information on the event can be found here: https://www.mo-sys.com/news/events/hpa-tech-retreat-2021/.

Mo-Sys Engineering (www.mo-sys.com), leaders in virtual production and image robotics, and HYPER Studios (www.hyperstudios.co.uk), leaders in cloud broadcast graphics, have created the first virtual production system with data-driven sports graphics.
 
StarTracker Sports Studio combines a complete virtual production system, based on Epic Games’ Unreal Engine and capable of generating moving-camera virtual studios, augmented reality (AR) and extended reality (xR), with a state-of-the-art HTML5 sports graphics system.
 
An ever-increasing number of sports broadcasters use virtual studios and AR alongside data-driven graphics, with the two types of graphics generated by separate systems. However, rather than simply keying standard overlay graphics on top of a virtual set, sports broadcasters are now asking to integrate data-driven graphics inside the virtual studio, for example making them appear inside a virtual monitor or synchronising them to an AR animation where the graphics need to match the style and branding of the AR virtual graphics.
 
StarTracker Sports Studio enables in-context design of photo-realistic virtual graphics with data-driven sports graphics, simplifying the creation and operation of this new approach to virtual sports graphic presentation.
 
“We have engaged with many sports broadcasters in order to capture the full spectrum of graphics functionality and workflows that they need now and going forwards, in order to cover major events,” said Michael Geissler, CEO of Mo-Sys. “Working with HYPER Studios, we have designed a system that allows a sports broadcaster to create integrated and in-context virtual graphics content, and as a result produce more engaging and differentiated content for their viewers.”
 
Mo-Sys will launch StarTracker Sports Studio at the SVG Europe Sports Graphics Summit on 5 March (https://www.svgeurope.org/graphics-spotlight/). It will be available to order from 2 April 2021.
 

GB Labs, innovators of powerful and intelligent storage solutions for the media and entertainment industries, is now a member of the Wasabi partner network. Wasabi is known for its Hot Cloud Storage, a disruptively simple storage technology that delivers speed, productivity and price benefits.

“Everyone is talking about cloud storage, but the typical cloud model does not match what media users need,” said Howard Twine, chief product officer at GB Labs. “Our industry needs to store a lot of large files, potentially for a long time, and it needs to be able to download those files when it needs to broadcast or repurpose content. The Wasabi cloud model plays straight into those requirements, making it the perfect fit for media.”

Wasabi has disrupted the cloud storage model by significantly reducing costs and eliminating the egress charges that distort the offering for media users and their large file downloads. Add fast, ready access, and Wasabi offers a simple route to effectively infinite storage capacity. This makes it ideal for media applications, and particularly for archive storage.
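
Wasabi's service exposes an S3-compatible API, so a standard S3 client can talk to it directly once pointed at a Wasabi endpoint; the boto3 sketch below uses placeholder credentials, bucket and region, and is only a generic illustration rather than anything specific to GB Labs' tooling.

    import boto3

    # Generic illustration: point a standard S3 client at a Wasabi endpoint.
    # Credentials, region endpoint and bucket name are placeholders.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.eu-central-1.wasabisys.com",
        aws_access_key_id="YOUR_WASABI_KEY",
        aws_secret_access_key="YOUR_WASABI_SECRET",
    )

    # Push a finished programme master into an archive bucket...
    s3.upload_file("final_master.mxf", "example-archive-bucket",
                   "2021/final_master.mxf")

    # ...and pull it back whenever it is needed for broadcast or repurposing.
    s3.download_file("example-archive-bucket", "2021/final_master.mxf",
                     "restored_master.mxf")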

GB Labs identified Wasabi as an ideal partner for its Unify Hub appliance. This media management platform applies intelligence to combine on-premise storage – from GB Labs and other vendors – with the cloud, providing a working environment which is simple and fast, with tools for maximum productivity in production, post-production and continuing storage. Unify Hub’s ability to intelligently manage data presents all content as a single, secure and coherent source, while minimising delays in moving material between the user and the cloud.

“Unify Hub and Wasabi make a perfect partnership, because our software brings intelligent management of storage, moving content between local stores and the cloud, taking capacity and speed of access into account, so you get to be as productive as possible,” said Twine.

The new partnership, already benefitting one leading US production company, means that users can create their own Wasabi account online, which takes moments, and immediately integrate it into a Unify Hub network. Users without Unify Hub can still get all the benefits of Wasabi cloud storage by linking it to GB Labs on-site storage hardware.

GB Labs, innovators of powerful and intelligent storage solutions for the media and entertainment industries, is proud to be involved in the inaugural 2021 virtual HPA Tech Retreat (15 – 24 March) as a gold sponsor. Debating the change, turmoil and incredible technological advances of the past year, as well as those still being planned, GB Labs is looking forward to being part of this year's HPA sessions – giving a thought leadership presentation, as well as discussing how remote and home working can co-exist with top-end creative media workflows.

“We are taking a very active role in the HPA Tech Retreat this year because it is widely recognized as a place where professionals in the industry come together and really share knowledge and experience,” said Dominic Harland, CEO/CTO of GB Labs. “The open and collaborative atmosphere reflects what we are looking to achieve in creating shared production workflows. We are excited to be part of a broad sweep of industry leaders debating the creative future.”

As well as presenting on how vision and ambition can work in harmony with powerful and intuitive technology, GB Labs is also hosting a number of roundtables debating the storage challenges of the future and how remote working can be designed for ultimate and secure workflows. 

Full information on the event is available on the HPA Tech Retreat website.

Cinegy GmbH, global leader in broadcast playout software in the cloud, is pleased to announce Akratek as the latest of its business partners. Cinegy AŞ opened in Istanbul at the beginning of 2020 and, since then, the Turkish office has been busy expanding its business relationships with local service integrators and representatives.

Akratek is currently celebrating its 10th anniversary and has been serving the media industry for many years both as a system integrator and as a provider of a wide range of locally produced equipment. CEO Selahattin Acaroglu from Akratek said, “Throughout our 10 years of business, we have worked with dozens of local, satellite and national TV channels, have helped set up most of the university radio and television studios located in Turkey and have had the opportunity to work closely with foreign TV channels of Middle Eastern origin.”

Acaroglu went on to describe how they have been following Cinegy's product developments for many years. “We are delighted that we can now offer the whole suite of their integrated and modular solutions to our customers. Cinegy AŞ has been quickly established with strong staff who know both the products and the broadcast industry very well.” 

Soon after the partnership agreement was signed, detailed training was given by Cinegy to the Akratek project team, enabling them to quickly get to know the products and solution opportunities - Video Server, News Automation, Archive, Editing, Graphic CG, Prompter, etc.

"Technology brands and products are very close in terms of functionality and quality” Acaroglu continued, “And in the field of automation, it is not easy to find companies who can provide a wide range of integrated solutions. Cinegy, as a brand, can satisfy customers easily with an efficient workflow, ease of installation and local support.

“Software and information technology infrastructures have become indispensable parts of the media world. Conventional device manufacturers cannot easily adapt to fast-paced IT developments such as the move to UHD and 8K. Software-based products like those offered by Cinegy, however, are developing faster and with high added value.”

Murat Küçüksaraç, COO of Cinegy AŞ, confirmed, “We are delighted to strengthen our existing business by welcoming Akratek to our partner infrastructure. Their knowledge and experience perfectly complement those of our other partners, as well as our focus, and we are looking forward to working with them more closely in the future.”

Cinegy's customer portfolio in Turkey consists of many of the largest local media groups, training institutions and corporate customers. With its business partners, Cinegy AŞ can respond to its customers' ever-changing and developing workflow and transformation needs faster and with a better quality of service.

Production automation specialists nxtedition worked with Talbot Productions and Broadcast Solutions to create a virtual, streamed gala evening for the UK’s celebration of Australia Day 2021. The event, streamed live to invited guests, included music, special messages from famous Australians living in the UK and even a wine tasting.

The Australia Day Foundation, which raises funds to provide financial help for Australians to further their education in the UK, holds an annual gala on Australia Day (26 January) in Australia House in London. The ban on gatherings because of Covid-19 ruled this out in 2021, but the organisers were keen to create the same atmosphere virtually.

While the host, comedian Bec Hill, and musical theatre star Daniel Koek and band were in Australia House, to maintain social distancing there was a minimal crew on location, with nxtedition technology providing remote production. The 1080p outputs of three cameras, plus the mixed live audio, were packaged by Dream Chip Barracuda encoders into an SRT stream and sent over the public internet to nxtedition’s data centre, 1,000km away in Malmö, Sweden.
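
The production used dedicated Barracuda hardware encoders; purely as a generic illustration of the same kind of contribution feed, a source can be pushed as an MPEG transport stream over SRT with an ffmpeg build that includes libsrt, as in the hypothetical Python wrapper below (host, port and input file are placeholders).

    import subprocess

    # Hypothetical illustration only, not the production's actual setup:
    # push a local source to a remote ingest point as MPEG-TS over SRT.
    # Requires an ffmpeg binary compiled with libsrt support.
    INGEST_URL = "srt://ingest.example.com:9000?mode=caller"

    subprocess.run([
        "ffmpeg",
        "-re", "-i", "camera_feed.mp4",   # placeholder; a live device in practice
        "-c:v", "libx264", "-b:v", "8M",  # contribution-quality H.264
        "-c:a", "aac",
        "-f", "mpegts",                   # SRT carries an MPEG transport stream
        INGEST_URL,
    ], check=True)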

nxtedition’s Adam Leah, working from his home studio in Marieholm, a further 30km away, switched the programme live, using nxtedition’s automation and shotbox to control the router, video servers, graphics servers, vision mixer and audio desk back at the Malmö data centre. He also played in pre-recorded elements of the evening to create the complete live event. As well as delivering the programme to ticket holders, the nxtedition system sent a reverse video feed of the live output back to London, again as SRT over the internet, along with the prompter script and Trilogy communications.

“The client wanted the same high production values as we are used to delivering in real life,” said Charlie Talbot of Talbot Productions. “I had worked with nxtedition before, and I knew just how incredibly powerful their remote production capabilities are. Everyone was really impressed with the way we tackled the challenges of creating an engaging and immersive virtual gala dinner. Even when we can ease the restrictions on people, remote production is going to be the way to go on events like this, reducing the crew we need on site without compromising quality or capabilities.”

For nxtedition, Adam Leah said, “We created an entire evening’s event with six people on site: three camera operators, two lighting operators and an audio mixer. Because of the way that nxtedition is designed, I could monitor and switch the show based on the script, with pre-recorded inserts dropping in as we got to the right point in the show. Even though it was a long event and I was the only person in control of the production, it was not stressful. This is another exciting demonstration of the capabilities of remote production to deliver very high standards to a whole range of events.”

StarTracker is Mo-Sys’s precision 6-axis camera and lens tracking system, designed for use with xR stages, LED volumes, and green/blue screens, for broadcast or for film VFX production. Together with real-time photo-realistic graphics and real-time compositing, these three technologies have enabled the current explosion in virtual production for VFX-heavy feature films, television series and sports production.

“We are delighted that not only has disguise selected StarTracker for demo usage in its local offices,” said Mike Grieve, Commercial Director of Mo-Sys, “but also that their customers are adopting StarTracker technology for xR shows, events, and virtual production.”

Virtual production, according to multiple industry reports, is set to grow substantially throughout 2021. Key cost savings over traditional production workflows include savings on locations, as these can be scanned using photogrammetry techniques and re-created as highly detailed real-time 3D environments. In addition, by capturing VFX shots in camera, post-production compositing is either reduced or removed entirely.

Tom Rockhill, Chief Commercial Officer of disguise, commented, “It’s important to us that disguise offers customers flexibility as they embark on a journey with our xR technology. Mo-Sys StarTracker is a great choice for customers looking to get into xR or virtual production. It’s a robust and well liked solution and many of our clients have made an investment in StarTracker to support their stage and LED volume builds.”

Discover the disguise xR solution here: web.disguise.one/learnmore

Mo-Sys Engineering (www.mo-sys.com), world leader in precision camera tracking solutions for virtual studios and augmented reality, has joined in partnership with VividQ, pioneers in computer-generated holography for next-generation augmented reality (AR) displays. This allows 3D holographic projections to be placed precisely in real space, enabling users of future AR devices, like smart glasses, to explore virtual content in context with the natural environment. 

Mo-Sys StarTracker is a proven and powerful camera tracking technology, widely used in television production and other creative environments for applications from virtual studios to real-time set extensions. It provides the precise position of the camera in XYZ space, together with its rotation.
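
A tracking feed of this kind ultimately reduces to a timed six-degree-of-freedom pose per frame, optionally with lens data; the dataclass below is a hypothetical, simplified representation (field names invented, not Mo-Sys's wire format) of what a real-time renderer would consume.

    from dataclasses import dataclass

    @dataclass
    class CameraPose:
        """Hypothetical, simplified 6-DoF tracking sample; not Mo-Sys's format."""
        timestamp: float   # capture time in seconds
        x: float           # position in metres, studio coordinate system
        y: float
        z: float
        pan: float         # rotation in degrees
        tilt: float
        roll: float
        zoom: float = 0.0  # normalised lens zoom, when lens data is present
        focus: float = 0.0 # normalised lens focus

    # A virtual studio renderer applies each sample to its virtual camera so
    # the rendered background moves in lockstep with the real camera.
    sample = CameraPose(timestamp=0.04, x=1.2, y=0.0, z=1.6,
                        pan=15.0, tilt=-2.0, roll=0.0)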

VividQ software for computer-generated holography is used in innovative display applications from AR wearables, to head-up displays. Holography - the holy grail of display technologies - relies on high-performance computation of complex light patterns to project realistic objects and scenes, for example in AR devices. VividQ generates holographic projections which, thanks to the precision location of Mo-Sys, can be displayed to the user at the correct place in the real environment. This is a major advance on today’s AR devices where flat (stereoscopic) objects are mismatched with the real world. By presenting holographic projections with depth, the user’s eyes can focus naturally as they scan the scene. 

“The possibilities and applications of augmented reality in realtime devices are only just being explored,” said Michael Geissler, CEO of Mo-Sys Engineering. “We are at the cutting edge of camera tracking; VividQ is at the cutting edge of computer-generated holography, and we are excited to work together to bring some of these concepts to reality.” 

Darran Milne, CEO of VividQ, added, “Our partnership with Mo-Sys is key to understanding the potential of computer-generated holography in future AR applications, developing experiences where virtual objects can blend seamlessly into the real world.”
