By Hitomi Staff | Hitomi | Published 6th April 2020
These are challenging times for the broadcast industry. As more media organisations enforce home working, the spread of the coronavirus may well be speeding up the transition to remote production. Here at Hitomi we're working remotely ourselves but are still available to help and support our customers new and old.
We're delighted to be able to make our own contribution to easier, professional-looking remote broadcasting with the launch of our free iOS app for fast and precise remote lip-sync measurement, which reporters and presenters can use wherever they are.
Simply hold an iPhone or iPad running the Glass app in shot, and the production unit, even if it is in a studio on the other side of the world, can align the measurements for lip-sync accuracy in seconds using the MatchBox Analyser.
“Synchronisation is often one of the last items on a field production checklist and can happen right up to going live or on-air. Often, it can be discovered too late that there are sync issues, which can be costly and, at the very least, embarrassing to resolve at the last second.”
- Hitomi Broadcast Director Russell Johnson
Hitomi's MatchBox solutions have been delivering peace of mind to the broadcast industry around the world for many years, providing an easy and accurate method for near instant timing alignment of sound and vision.
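The underlying idea of this kind of sound-and-vision alignment can be illustrated with a generic sketch. The following is not Hitomi's actual algorithm, just a minimal illustration of estimating an audio delay by cross-correlating a received signal against a known reference pattern, assuming NumPy:

```python
import numpy as np

def estimate_av_offset_ms(reference: np.ndarray, received: np.ndarray,
                          sample_rate: int = 48_000) -> float:
    """Estimate the delay of `received` relative to `reference` (in ms)
    by locating the peak of their cross-correlation."""
    corr = np.correlate(received, reference, mode="full")
    lag = int(np.argmax(corr)) - (len(reference) - 1)  # delay in samples
    return 1000.0 * lag / sample_rate

# Illustration: delay a noise-like test pattern by 2,400 samples
# (50 ms at 48 kHz) and recover the offset.
rng = np.random.default_rng(0)
tone = rng.standard_normal(4_800)
delayed = np.concatenate([np.zeros(2_400), tone])
print(estimate_av_offset_ms(tone, delayed))  # 50.0
```

A production tool like MatchBox works against a known test pattern carried in both sound and vision, which is what makes near-instant, sub-frame measurement possible; the cross-correlation above is only the textbook version of that comparison.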
Now our highly anticipated free MatchBox Glass app is available to download from the Apple App Store, and to make things easier still we're offering a free 30-day trial of the licence to use the app on your MatchBox Analyser* to help you get started.
Free iOS app
Measure lip-sync from the lens of the camera with an easily accessible free app your reporter, presenter or talent can use.
Accurate alignment & synchronisation that can be demonstrated on departure to avoid potential disputes.
Going live in five?
Forget clapper boards and tapping mics. Check audio is in line and get lip-sync accuracy in seconds.
By Jennie Priestley | TVBEurope | Published in the Jan/Feb 2020 Edition
When broadcasting a huge awards show to an international audience it’s imperative to make sure both sound and vision are in sync. It’s even more important when the show features numerous musical acts because fans will be quick to take to social media if there are any issues with their favourite performers.
In order to ensure fans around the world got the best possible experience while watching November’s MTV EMA Awards in Seville, Spain, Viacom employed Hitomi Broadcast’s Glass and Matchbox products to ensure perfect synchronisation between the microphones and cameras at the venue, and in the international feeds. “The biggest crime in broadcast is to have your audio out of sync with your video,” explains Matt Okotie, lead engineer, Viacom International Media Networks. “We time that down to milliseconds or point zero of a millisecond. Hitomi gave us a box that could read that signal, and basically send the signal with picture from London to anywhere in the world.”
The Matchbox software gives the user a reading of how much the audio is out of sync with the video, and enables them to line it back up on all their equipment. “With global distribution on an event such as the MTV EMAs, we were sending eight different feeds, HD, UHD, mains and backups back to London, and that’s globally distributed into the US, Latin America, Asia Pac, all over Russia, all over Europe,” explains Okotie. “You need to make sure everything’s in sync every step of the chain so we use Hitomi boxes globally to sync up everybody together.”
Having used Matchbox previously, the EMAs was the first time Viacom had used Hitomi’s newest product, Glass. “We initially went to demonstrate our Matchbox product to Matt Okotie at the Viacom offices in Camden, North London. As a result, he hired a unit for the 2018 MTV EMA event,” explains Russell Johnson, Hitomi’s managing director. “In September 2019, we showed him Glass on our stand at IBC and he said he would love to field trial it for us. We had to say yes to such a great opportunity!”
“The one bit that was missing previously was linking the camera to the OB compound sync,” adds Okotie. “With Glass you hold up an iPad and then get sync from multiple cameras back to the OB compound. We had 14 cameras at the main venue in Seville and then multiple cameras around the other venues and on the red carpet. We needed to make sure all the cameras were synced so that as you cut to each one, they’re all in sync as well.”
Of course, when you're sending a feed around the world, there are bound to be latency issues depending on where it's going. It's likely to arrive in Manchester faster than it will in Moscow or Manhattan. How does the Viacom production team work around that? "We measure it at each point," explains Okotie. "We measure the latency from Seville to London, London to New York. From London we transmit to everywhere, so we do every section individually." To help with the synchronisation, Viacom used multiple Hitomi Matchboxes. "We had one in Seville, one in London, India and New York," says Okotie. "For UHD distribution we used the Hitomi technology and then we sort of down-converted it from UHD and distributed it as HD to countries that are using older kit. We also use the Hitomi boxes internally in our London studios quite a lot to sync up all our cameras there. The boxes work on any broadcast that needs sync."
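Okotie's approach of measuring "every section individually" amounts to summing the measured delay of each leg to get the end-to-end figure for any destination. A trivial sketch, with hypothetical hop names and numbers:

```python
# Hypothetical per-leg latency measurements (ms), one per distribution hop,
# in the spirit of measuring every section of the chain individually.
hops = [
    ("Seville", "London", 120.0),
    ("London", "New York", 70.0),
]

def path_delay_ms(hops):
    """End-to-end delay is the sum of the individually measured legs."""
    return sum(delay for _, _, delay in hops)

print(path_delay_ms(hops))  # 190.0
```

Measuring leg by leg rather than end to end means any one segment that drifts can be identified and re-aligned without re-measuring the whole chain.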
According to Okotie, the Hitomi products are "the most accurate on the market" and Viacom is already looking at using the kit on more events. "We're also looking to invest across our main data centres in New York and London over the next couple of quarters."
By Contributor | TVBEurope | Published 8th April 2020
Storage provider GB Labs and Ortana, the creator of Cubix, the asset orchestration, management and automation software, have come together to provide a unique customer experience.
Ortana Founder and CTO James Gibson said: “People talk about media asset management, but orchestration is what people are really interested in. MAM is just a by-product.
“But if an orchestrator can’t accurately understand the devices it’s talking to or what is taking place with a piece of technology at any moment in time, it’s not much use.”
To demonstrate interoperability, several years ago Ortana conducted a proactive Cubix integration project in conjunction with GB Labs storage as a best-of-breed exercise.
According to Gibson: “I worked with GB Labs for many years as a customer and have great respect for their expertise. One major benefit of working with them on various projects was their consistency in approach of establishing and ensuring a technical commonality across their product range.
“What that means is that although their products are designed to suit a wide range of needs, it is technically consistent. From a standalone LTO device to their high-end SSD storage and everything in between, all are driven with in-built intelligence anchored by their CORE.4 OS.”
GB Labs' new CORE.4 is a high-performance custom OS specifically designed to serve media files, with an additional intelligence layer that delivers ultimate stability and quality of service for every user. Moreover, its power-saving intelligence means that CORE.4 ensures consistent, reliable performance whilst using the fewest possible disks. Its expanded range of demonstrably useful features is engineered to further enhance users' ability to manage and improve online workflows.
Gibson added: “When it comes to integrating those storage systems with our asset orchestration technology, CORE.4 OS enables it to be done simultaneously and seamlessly. Establishing interoperability with the most crucial component needed for orchestration, i.e., storage, is painless with GB Labs.
“That synergy is because the modular approach to product development that both companies take is very similar, which benefits customers of both. In applications for which they are deployed, Cubix and GB Labs can work independently, but to achieve optimum performance, they benefit from working together. The manner in which they are individually architected means that both systems know exactly what is expected of the other to work in tandem, whether it’s ingest, content discovery, archive, workflow orchestration, tape ingest or one of many other tasks. They just ‘get each other’.”
Another key parallel, and benefit, is reusability.
Gibson said: “Many products these days have a working life that can easily exceed the life of the project for which they were purchased. Cubix orchestration software and GB Labs storage products, on the other hand, can be easily redeployed to address changing business requirements without having to justify and endure another round of CAPEX.”
And it's those differing needs that Ortana soon plans to address in conjunction with GB Labs by co-developing "Kiosk", an exciting new approach based on the concept of "bring your own storage".
It has long been a tenet of both Ortana and GB Labs that to use their respective technologies there is no need to rip out existing infrastructure. Both are able to sit as a layer on top of, and make better use of, what is already there.
Ortana has designed Kiosk to make managing media simpler and more affordable by wrapping orchestration around existing storage until the time comes to upgrade or expand.
Gibson said: “The concept of Kiosk is that, if you have legacy storage, or storage you are contracted to, you can reinvigorate it with an orchestrator that includes a fast way to find and retrieve assets or anything else that you specifically need it to do. GB Labs and Cubix are respectively renowned for enabling users to make use of what they already have. We’ve taken a page from what GB Labs has done with its award-winning Mosaic software. In a sense, Kiosk reimagines Mosaic for its own purposes.”
GB Labs' Mosaic is a combination of AI and intelligent storage that culminates in an automatic, vastly enriched way to track and find media assets. Kiosk is a complementary technology designed to fully examine the movement of media through an active workflow. In cases that include GB Labs storage, Kiosk and Mosaic work in concert to exploit the intelligence of both.
Kiosk is initially targeted at what have traditionally been smaller clients, and Gibson anticipates that Kiosk will help people understand that Ortana can layer orchestration on top of their existing storage, if that's what they prefer.
Gibson concludes: “We have a great relationship with GB Labs, but for those who are not quite ready to upgrade their storage speed and reliability, Kiosk can assume initial responsibility for an existing infrastructure and drive what they have, if that’s all they want for now.
“However, we work with more than 50 integration partners, and we all share a belief in each other’s products and a confidence that when we work together, we can deliver what we promise. The pairing of minds at Ortana and GB Labs is an ideal illustration of partners who know and trust one another.
“I have the greatest respect for the GB Labs team; our business model commonality; technical expertise; and like-minded approach to thinking differently about how to further improve life for our customers. In my view, it’s a perfect pairing of orchestration and storage that enables its users to thrive in a rapidly changing content creation, transmission and distribution market.”
Video Interview | KitPlus Daily | Published 7th May 2020
Chief Solutions Officer Duncan Beattie featured as a guest on the KitPlus Daily Show. Watch the video below to find out how our products and solutions are ideally suited to remote working, remain secure and are versatile enough for markets outside of the broadcast industry.
By Ben Pearce | TVBEurope | Published in the July/August 2020 Edition
Ben Pearce, CBO Asia and co-founder of GB Labs, talks to TVBEurope about how the industry is evolving and the knock-on effects on opex.
Until recently, operating expenses – opex to most of us – were defined as the expenses a company incurs through normal business such as rent, equipment, inventory, marketing, payroll, insurance, plus R&D.
It’s long been a central tenet of business, and broadcast in particular, to continually strike the right balance between keeping operating expenses in check, or reducing them, without significantly impacting a company’s ability to compete.
It’s obvious that the majority of capital expenditures have stalled for the time being, but opex carries on, although under increased scrutiny.
And that pressure in recent years is due to a wide range of reasons as the broadcast industry reinvents many aspects of itself; so much so that opex reduction has been forcibly recalibrated to include, “How do I sensibly mitigate my financial and operational risks but stay in business if disaster strikes?”
The industry was already heading that way, but has had a major fire lit under it that has accelerated the need to ensure operational security even if under unexpected pressure…and be able to ensure it from anywhere in the world.
An Asia-Pacific customer of GB Labs' regional dealer realised late last year that its disaster recovery system, often thought of as a 'nice to have', was costing it more money in maintenance and substandard performance than it was delivering, and in what turned out to be a prescient move, contacted GB Labs about installing CloakDR, which is the most complex system in our portfolio. Nevertheless, its installation is typically a straightforward process of working closely with the client to determine their specific requirements; configuring a system to suit those needs; and spending several days on-site with a small team of local engineers from the dealer and the customer to ensure the install goes smoothly.
But that's impossible when you suddenly find, between the point of ordering and the installation date, that you're not able to get within thousands of miles of each other, let alone into physical proximity. It's one thing to reduce opex, but this was not how anyone foresaw achieving it.
GB Labs is quite used to doing remote installs. Installing a standard storage system is pretty easy whether the customer is in the middle of the Sahara or the Arctic. But a sophisticated CloakDR system is a different closet of cloaks and would normally require several days on-site. In this case, ancillary components had to be ordered that, again, would normally be sourced on-site and integrated in-situ, but on-site sources, or a visit, weren’t options and the customer needed the system as soon as possible.
To mitigate any obstacles, we safely assembled the core system at our Berkshire HQ, including the ancillary components we ordered in. We then shipped the complete system to our dealer with each section clearly demarcated for connection. CloakDR requires two units to work together over a highly advanced networking system to provide full resilience across switches, storage, client connections, and a great many other devices and connections, which doesn’t lend itself to a ‘quick start’ process. And, because it relies heavily on seamless networking, it’s not usually something you would try to establish from the other side of the world, but we had no choice.
Once the kit arrived, the local engineers followed our instructions, overcoming considerable language barriers, with support provided by us remotely. We all worked together in challenging circumstances to get a necessary job done fast. It not only worked, but was achieved far more cheaply than would otherwise have been the case.
I say that because it’s interesting to note that the new disaster recovery system was up and running in only three days, which coincidentally is roughly the same amount of time it would have taken had we been physically on-site. That we’re able to do so much on such a complex system, and do it all from half way around the world, gives our local dealer and the end user comfort, and it saves us all a heck’uva lot of, let’s face it, often unnecessary travel.
I’m not saying that remote installation will be right for every scenario. There’s still no substitute for hands-on, face-to-face deployment, but if you have no other choice, it’s satisfying to know that remote installation, even for complicated projects, is not only highly doable, but may increasingly be seen as preferable.
So, have the multiple challenges of 2020 so far accelerated the inevitable, i.e., fast-tracked the adoption of new ways of working that drive down opex by enforcing more financially and environmentally efficient operations? Or are they fundamentally redefining what opex should really be about?
It’s too soon to tell, but I would not be surprised if opex and capex were soon replaced by acronyms to be defined later (ATBDL). And we all love our acronyms, don’t we?
By Jenny Priestley | TVBEurope | Published 23rd March 2020
How modern archive systems can help media companies make the most of their legacy content
As numerous linear TV schedules shudder to a halt with the cancellation of live sport, or postponement of production on continuing dramas, broadcasters are looking to alternative programming to fill the sizeable void left behind.
TVBEurope asks three experts how modern archive systems can help media companies make the most of their legacy content.
“If broadcasters already have them in place right now filled with the relevant metadata, this allows them instantly to call up content and build new schedules around themes or topics to captivate the audience,” explains Jan Weigner, CEO at Cinegy.
“Better yet, if archive systems were also used during production, new programming can be created out of the existing raw material. This is perfectly illustrated by the archived raw footage from BBC NHU’s Planet Earth series, which was ‘gold mined’ for dozens of other projects. The same can be done for documentaries in general, but also educational content, training, reality TV, other unscripted formats, as well as news and sports.
“This of course requires keeping more material in the archive than just what went into the first aired programme,” Weigner continues. “As storage is getting increasingly cheaper and video compression better, while the range of formats and platforms that need to be served increases, it is a wasted opportunity to not keep as much of the original raw footage as possible. Or to vary a popular phrase: one man’s B-roll is another man’s new programme!”
With many members of staff now working from home, how easy is it for production teams to access digitised archive content? According to Jeff Braunstein, director of product management at Spectra Logic, this is where the Cloud can shine: “The use of Cloud is becoming more and more a part of the storage and workflow landscape,” he explains. “Storage management software supports movement of files to popular Cloud platforms, be it for Cloud-based workflows or disaster recovery purposes.
“Broadcasting organisations can leverage this software to create a multi-tier private Cloud to store a copy of frequently accessed assets on online disk, and archive infrequently or rarely accessed assets indefinitely on the Perpetual Storage Tier (which can consist of Cloud, object storage disk, and tape). This enables sharing content globally by utilising the public Cloud’s inherent infrastructure to make content available to disparate users and sites worldwide.”
But what if a broadcaster doesn’t have content stored in the Cloud? Are there fast turnaround options/solutions for providers sitting on a hoard of legacy content but who lack, perhaps, a fit-for-purpose archive or MAM system?
“It depends on the type of content and where that content sits, but there are many fast turnaround options based on content discovery and media indexing tools,” says Julian Fernandez-Campon, CTO at Tedial. “These are modern MAM solutions that allow a quick scan and bulk ingest of content into the MAM without moving it while generating some basic information. It can also create a proxy to provide first access to content, which can be enriched later with AI or any other automatic analysis tools.”
“There are no miracles especially if the content is still on tape,” adds Weigner. “File-based material can be imported with the speed being largely scalable. Tapes on the other hand are normally ingested in real-time, but this can also be scaled to some extent. The bigger problem is that if the content is still on video tape then you are fighting a losing battle against technology extinction and availability of playback equipment to make this happen at all. In this regard celluloid-based content is the lesser problem, but of course needs to be telecined as well (and maybe in UHD this time around).”
Of course, any drastic changes to profitable scheduling will have a financial impact for the providers. What are the monetisation options for the distribution of archive content through linear and OTT channels to help media companies maintain revenue?
“There are many and they will depend on the target audience,” says Fernandez-Campon. “With new OTT platforms, it’s possible to monetise media by creating focused channels or a series of content for specific target audiences, that cannot find what they want on other platforms. We have seen this in some productions on platforms like Netflix where series or documentaries are published to engage a specific sector.”
“The monetisation options are almost binary. If you don’t know what you have and can’t access it in real-time you stand to make no money at all,” says Weigner. “If you have real-time access to all your content and it is all digital already, you can slice and dice and package and sell it in dozens of different ways immediately. If it just sits on some shelves, no matter whether it is on video tape, celluloid, data tape, or so on, it is just dead data.”
By Adrian Pennington | InBroadcast | Published 14th April 2020
A review of technologies enabling production companies and broadcasters to deliver high quality content to viewers while optimising costs, resources, and eliminating travel.
Whilst the world grapples with the coronavirus outbreak, we are seeing not only how people modify their behaviour but also how businesses must modify theirs. Events are being cancelled, and travel is being scaled back and replaced with teleconferencing. Many corporations have sent staff home to work where it is possible to do so.
This is all made possible because we as a society already have much of the technology to facilitate flexible working. Give your office-based staff a laptop and access to the internet, and they are ready to sit in their home office or at their kitchen table.
“What has changed in the last few weeks is that working remotely is no longer a work-life balance argument, or a nice-to-have, it is now a question of business continuity,” says Jan Weigner, CEO, Cinegy. “The crisis is forcing companies to reevaluate their ways of working and finally act upon it. The technological infrastructure is in place and we have the tools ready to go – from acquisition over production to distribution, all can be handled remotely and / or in the cloud.”
With bases in the UK, mainland Europe, the Middle East, Australia and North America, Never.no’s teams are able to service regional customers without the risk of the virus affecting workflows or production needs. Bee-On, its cloud-based audience engagement platform, runs on AWS for access anywhere with a web browser and internet connection, “so there is no need for production teams to be managed under one roof,” CEO Scott Davies says.
“Individual projects can be pre-planned and packaged with audience generated content and dynamic visualisations prior to delivery / broadcast of live or pre-recorded content. Viewers continue to watch, more so during a crisis, so content producers need to continue programming and deliver captivating content, with audience engagement a priority – Bee-On can help deliver this.”
He adds, “We’re seeing a need for packaged end-to-end solutions that utilise cloud production, seamlessly integrate ‘off-the-shelf’ graphics and offer compatibility with native broadcast graphics for a wide range of programming, such as news, live events and popular chat shows. Gone are the days when production is managed and delivered from one hub.”
Demand for Quicklink’s video call management system has never been higher, according to CEO Richard Rees. The firm is releasing a completely browser-based cloud supported workflow with automated Panasonic PTZ camera and lighting.
“A journalist could sit at home and interview someone located elsewhere live to air while a colleague edits the video online (in Adobe Premiere) and in realtime,” says Rees. “That edit could be passed to a control room for wider channel distribution. The whole environment is now virtualised. We believe this is the future.”
VSN has added new capabilities for remote interoperability to its VSN NewsConnect web plugin for news production. These were on the cards for an NAB release but recent events have made them more relevant.
VSN NewsConnect, which brings together a number of third party tools required for news production, now enables users to control multiple studios in different locations, even if the systems used in the studios are different.
“What this means is that a journalist can simply send a news item to any studio and NewsConnect will automatically ensure that the delivered content matches the format requirements of the receiving devices,” said Patricia Corral, marketing director. “This remote interoperability is very useful in enabling news to be repurposed to the requirements of local broadcasters without worrying about technical compatibility.”
Pixel Power’s work is currently mainly based around large projects for refurbishment or replacement of playout and production infrastructure; projects with long timescales, so the current viral outbreak isn’t yet causing any major changes in demand.
“Our technology can be virtualized and deployed in data centre or public cloud, with remote access operation from anywhere in the world,” explains James Gilbert, CEO. “This is not something that can be done as an impulse reaction to the current situation - this capability has to be architected and designed into the product from the beginning.”
Once the outbreak subsides, the evolution of remote, decentralised working practices is likely to accelerate. “The industry is already moving towards remote, decentralised working practices because of the ecological and economic benefits,” Gilbert says. “The ability of staff to work from any location is core to that concept and whilst it is an obvious advantage during the current outbreak where staff may be required to, or choose to, work from home, I do not feel the pace of change will be accelerated - there are already enough drivers for it.”
Collaborative workflows, whether with someone sitting next to you or on the opposite side of the world, are in the DNA of storage solutions specialist GB Labs.
“We’ve fostered cloud integration for years and therefore have always offered a remote workflow,” says Dominic Harland, CEO/CTO. “Obviously, there will be many other challenges with this ongoing situation, but GB Labs is confident that accessing content securely and quickly will not be one of them.”
He thinks current events will accelerate solutions to enable a faster response to any future crisis. “The next two/three months is not long enough to develop, test and bring to market anything exceptional, but we are definitely looking at developing new products and new solutions. Whether this becomes a real-world advantage that the customer will want to buy after the outbreak subsides, well, that’s a different question.”
Each Bridge Technologies product has transformative potential in the field of remote broadcast and production, but none more so than its Widglets API. This leverages the full value of data collected by its VB440 (video, audio and ancillary), not only for network performance monitoring but also for a multitude of other workflows and applications. Full motion, colour-accurate, ultra-low-latency video, for example, can be made available from any source to any application or user.
“Being browser based, all that is required is a laptop and a network connection,” explains Tim Langridge, head of marketing. “Each geographically dispersed user receives feeds from multiple cameras, with multiple waveform vectorscopes and streams, via a single HTML5 video monitor view. Not only does this result in incredible technical improvements in production and improved decision making, but it also logistically frees up immense amounts of room in OB vans or MCRs, making them more efficient, affordable and adaptable.”
Blackbird has seen a significant increase in sales enquiries since the containment phase began. “Enterprises need effective technology solutions to enable their workforces to operate efficiently whilst working at home or remotely,” says CEO, Ian McDonough. “Blackbird is a fully featured video editor available in any browser and can operate at low bandwidth. It's the perfect solution for the majority of live and file-based video production workflows.”
Essentially, Blackbird can be used by anyone, any time, anywhere, and this flexibility is enormously attractive to enterprises looking to drive massive productivity efficiencies through their operations. It also runs on bandwidth as low as 2Mb/s, which is ideal given the pressure on network traffic – a situation which has caused Netflix and YouTube to throttle back their bitrates.
“As teams become used to de-centralised video production and enterprises enjoy significant infrastructure savings together with a flexible globally distributed workforce untethered to source content, we anticipate an accelerated adoption of Blackbird,” McDonough adds.
For live sports workflows, there are few production partners more experienced than Gravity Media. In February it wrapped its 2,000th remote production, in this case a Pac-12 Networks broadcast of the USC Trojans’ 65-56 win over the Washington State Cougars.
This impressive number includes ‘At Home’ centralized productions undertaken under the Proshow Broadcast (acquired by Gravity Media in July 2018) and Gearhouse Broadcast brands.
The benefits of this remote approach are obvious, with REMIs offering a cost-efficient modern workflow that is operationally flexible and durable. By centralizing the control room, video switching, audio mixing, graphics, replays and show production can all be done ‘At Home’ in the broadcast centre. This means that smaller, more affordable purpose-built mobile units can be used at the venue. Only video and audio acquisition hardware such as engineered cameras, microphones and announcer headsets, as well as comms hardware, a transmission interface and engineering support are required on site.
Company president Michael Harabin, says, “The potential for creating quality programming at an attractive price has never been greater, and we now have over 2000 proof points that showcase its consistent effectiveness and our ability to deliver.”
Sweden’s Intinor specialises in helping companies overcome the challenges of remote production. “As we are currently in lock-down of travel for personnel, the benefits of remote production could be felt all the more keenly,” says Daniel Lundstedt, regional sales manager. “Instead of having to arrange for operators to travel on location, broadcasting companies could instead work with local talent, with equipment being all that needs to be shipped rather than staff members.”
Intinor is already able to make going live, from anywhere, very easy, without marshalling a small (but expensive) army to make it happen. It’s all down to the “supreme mobility” of its Direkt link remote production pack. With an Intinor Direkt receiver or router in a control room, captured audio and video from a camera or mixer connected to a backpack can be streamed over public internet to a Direkt router and then re-streamed using other protocols, transcoded or output to SDI or NDI.
Mobile Viewpoint has a heritage in remote production solutions, especially for live streaming. CEO Michel Bais says the company has a proven record of reducing costs for production companies by removing the need to send a wealth of resources to an event.
“As we see companies trying to reduce their carbon footprint, it has emerged that it is not only cost savings that are driving these innovations,” he tells InBroadcast. “In line with this philosophy, we have developed remote cameras that allow sports games to be live streamed but without the need for a camera crew or an onsite production team.”
With the IQ-Sports Producer, an entire field of play can be recorded with a single 4x4K camera, while AI is used to create a virtual zoom of the play by automatically following players and the ball. Games can be live streamed in real time and in different format versions depending on whether the output is for web streaming or for higher quality broadcasts requiring HD-SDI workflows, all at a fraction of the cost of an on-site production team.
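Mobile Viewpoint has not published how IQ-Sports Producer works internally, but the core idea of an AI virtual zoom — panning a smoothed crop window across a fixed wide shot so that it follows the tracked ball — can be sketched in a few lines. Every name, frame size and smoothing factor below is an illustrative assumption, not the vendor’s API.

```python
def virtual_zoom(ball_positions, frame_w=7680, frame_h=4320,
                 crop_w=1920, crop_h=1080, alpha=0.2):
    """Follow a tracked ball with a smoothed, clamped crop window.

    ball_positions: list of (x, y) ball detections, one per frame.
    alpha: exponential-smoothing factor (lower = steadier camera).
    Returns a list of (left, top) crop origins, one per frame.
    """
    crops = []
    cx, cy = ball_positions[0]            # start centred on first detection
    for bx, by in ball_positions:
        # Exponential smoothing keeps the virtual camera from jittering
        cx += alpha * (bx - cx)
        cy += alpha * (by - cy)
        # Clamp so the crop window stays inside the panoramic frame
        left = min(max(int(cx - crop_w / 2), 0), frame_w - crop_w)
        top = min(max(int(cy - crop_h / 2), 0), frame_h - crop_h)
        crops.append((left, top))
    return crops
```

Lowering `alpha` makes the virtual camera steadier at the cost of lagging fast play; a real system would also vary the crop size to simulate zooming in and out.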
vPilot is another AI-driven solution from Mobile Viewpoint that can be used for remote newsrooms. A combination of cameras using 3D sensors and audio cues means round-table discussions can be set up without the need for a camera team or an onsite director. “Both IQ-Sports Producer and vPilot can be managed remotely with cameras that can be semi-permanently installed to create quality and cost-effective programming,” Bais says.
Net Insight’s plug-and-play solution Nimbra extends the production workflow to reach remote venues anywhere on the globe, with the same ease of operation as for traditional in-house productions.
Nimbra is a high-quality multi-service media transport over IP platform supporting both native video and audio in addition to standard IP/Ethernet. Built-in video processing, low-latency JPEG 2000 and MPEG-4 encoding, as well as unique features for equipment control and synchronisation, make it a great choice for remote production. Users include SVT and TV2 Denmark.
“100 percent reliability is key for remote live production and our solution offers mechanisms to assure the content is delivered with perfect quality regardless of network issues,” the company states. “Enterprise customers can use the solution to deliver live video content to support internal communications and working remotely.”
All of Cinegy’s software solutions lend themselves to flexible working practices. “We have long been a proponent of virtualization and IP – and what is the cloud if not simply using someone else’s computer, hosted somewhere else?” says Weigner.
“Give your office-based staff a laptop, access to the internet and access to Cinegy software – locally or in the cloud – and they are ready to remotely produce content using Cinegy Desktop, remotely play out content with Cinegy Air, and remotely monitor channels with Cinegy Multiviewer. Whether our customer is at home or at another location and needs to set up a pop-up channel in the cloud doesn’t matter.
“Our customers who already embraced our workflows are more prepared and ready to deal with the new business practices that are emerging,” he argues. “Being ready for this business process change is markedly harder than being ready for a technology change. In this case, circumstances are dictating that there must be change. The barriers are being lowered and it is time to embrace it.”
By Adrian Pennington | Creative Planet Network | Published 13th April 2020
There’s no escaping the fact that 8K is four times as many pixels as 4K but recording 8K is easier and less expensive than you think.
For many, the idea of recording 8K video understandably conjures up images of unmanageable file sizes, long transfer times, huge piles of hard drives, and slow proxy workflows, not to mention a black hole in the budget.
What’s more, with the biggest showcase for 8K TV—the Tokyo Olympics—delayed, the demand for content delivered in 8K is likely to stay in the bottle a little longer.
Leaving aside for one moment the fact that HDR and HFR are far more valuable than resolution to the consumer’s eye, there are benefits to an 8K production which an increasing number of projects are taking advantage of.
Mank, directed by David Fincher and lensed by Erik Messerschmidt, ASC, which was acquired in 8K using the RED Monstro in monochrome, and Money Heist, the Netflix drama whose fourth season is shot at 7K to accommodate HDR in a 4K deliverable, are just two of the most recent examples.
You can’t sell productions made in less than 4K to Netflix and other streaming services now. One day soon, some may mandate 8K from the outset, and Netflix will have its fair share of 8K masters in the bank.
Even if the final output is only going to be 4K/UHD, shooting in 8K gives you many options in post that you do not have when starting in 4K. These include downscaling, cut/crop (pan/scan) or headroom for VFX.
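The reframing headroom is simple arithmetic: an 8K source is 7680 pixels wide and a UHD deliverable 3840, so you can punch in 2x before the crop drops below the deliverable’s native resolution. A minimal sketch (function names are illustrative):

```python
def max_punch_in(source_w, deliver_w):
    """Linear zoom factor available before the crop falls below
    the deliverable's native width."""
    return source_w / deliver_w

def crop_width_at_zoom(source_w, zoom):
    """Width, in source pixels, of the crop that fills the frame at a zoom."""
    return source_w / zoom

# 8K (7680) source, UHD (3840) deliverable: 2x punch-in, and the
# tightest usable crop is still a full 3840 source pixels wide.
```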
“Before making the decision to capture a project in 8K, producers and cinematographers need to consider the project’s long-term goals,” says Bryce Button, director of product marketing, AJA Video Systems. For instance, capturing in 8K makes a lot of sense if there will be future use for the material.
“And even if not currently working in 8K nor planning to move to 8K in the future, 8K acquisition can also be hugely beneficial for capturing background plates for VFX and for virtual sets in live broadcast,” Button continues. “Having a larger raster for the background gives producers the confidence that as they zoom, pan and tilt around the background video plate or set, they’ll be delivering the cleanest possible imagery.”
When selling your shots, for example to stock footage outlets, 8K still manages to command considerably higher prices and is much rarer, so there is a chance to sell more and make more money at the same time. 8K is still a unique selling point and as Barry Bassett, the MD at London-based camera rental house VMI puts it, “That means bragging rights.”
“If you can acquire in 8K, there is no good reason not to do it,” urges Jan Weigner, Co-Founder & CTO at broadcast and post software developer Cinegy. “This is the same question that we were supposed to ponder when the switch to 4K happened. Currently camera rental cost for 8K can be higher, but in terms of total production costs, your budget would have to be seriously constrained or require many simultaneous cameras to not be able to shoot in 8K.”
Producing in 8K is no different to 4K: the availability of hardware to capture, edit and store 8K makes the high-resolution format unavoidable. There are also now tools to serve every deliverable from HD SDR to 8K HDR, and everything in between.
“All the necessary parts of the 8K puzzle are in place,” says Atomos CMO Dan Chung.
All current NLEs handle 8K, at least if you are using the latest version.
The main costs that will hit your pocket are camera rental or purchase and the proper lenses to go with it. RED cameras are pretty much the only option for an 8K TV or feature workflow, but there should be healthy competition at the rental houses. Other options, such as the Sony, Ikegami and Sharp 8K TV cameras, might use the latest 8K Canon lenses, and that can be costly.
Canon’s announcement, in February, of an 8K DSLR was a game-changer in that respect. “Not so long ago if you wanted to shoot 8K anywhere near affordably you had to shoot RED,” Chung remarks. “Now you can do so on a prosumer camera. Canon has clearly laid down a marker that others are sure to follow.”
Details including price, release date and even sensor are sparse, but Canon says the full-frame EOS R5 will feature a blistering 20fps electronic shutter, dual memory card slots, and In-Body Image Stabilization to provide shake-free movies.
“There’s a misconception that 8K is vastly more expensive than it actually is,” says Button. “Generally, moving to 8K is an incremental cost, especially if you’re already working in 4K or have worked in stereo 3D. The biggest expense often comes with storage and moving large volumes of data, but the strides made by the industry to support 4K and S3D have provided a strong foundation to support the data needs that 8K workflows require.”
Recording and Monitoring Options
By nature, 8K is a massive, inherently data-intensive format. As such, in certain circumstances it may be advantageous to avoid shooting fully uncompressed 8K video and instead seek out codecs that keep data sizes manageable while preserving the balance between data size and perceived quality.
“As with any project, it’s crucial to always start with the end in mind,” advises Button. “If uncompressed footage is a necessity for everything from video effects needs to deep color work, uncompressed will always offer a range of advantages.
However, he notes, many projects – whether for broadcast or other delivery methods – may be better served using codecs specially designed for editing and grading, where media and workload savings on workstations can be incredibly advantageous.
“Apple ProRes, for example, has been tuned to specifically provide resolution details and color depth that are more than acceptable while providing the appropriate media bandwidth storage and minimizing CPU strain.”
In terms of monitoring, 8K displays are just beginning to surface and are still scarce, but as Weigner points out, so are inexpensive, cinema-quality, reference-grade HDR 4K screens.
“You could use UHD/4K monitors or TVs and just zoom in when necessary,” he says. “Brand name 8K TVs sized 65” or even 75” can be bought well below US$3,000 and they usually have a decent enough image that can be tuned manually to meet certain TV production demands.”
AJA offers audio and video I/O solutions like the KONA 5 to facilitate downconversion and monitoring of 8K content on 4K displays in real-time, whether for editing or other tasks. AJA says it is working very closely with major NLE and color grading companies to ensure that its Desktop Software and KONA I/O cards provide a seamless 8K creative experience whether working on macOS, Windows, or Linux workstations.
For many projects, the codec will be defined by what the camera produces, unless one uses an external recorder.
The Atomos Neon line of cinema monitors and recorders comes with a 4K master control unit, but the firm has additionally announced an 8K master control unit, which can upgrade every Neon to an 8K recorder. The unit allows for recording and monitoring 8K video at 60fps. Both ProRes and ProRes RAW are supported straight from the camera sensor.
“If you go 8K you need ProRes RAW since this allows you to get a manageable file size and all the benefits of working with raw data,” says Chung.
Users of RED cameras will be familiar with Redcode RAW, the proprietary recording format. Redcode is always compressed – there is no uncompressed version – but is claimed by Red to be visually lossless, and there’s no chroma subsampling or color space baked into the R3D RAW files. Visually lossless is usually good enough for any type of post-production, including green screen work.
For example, using a Weapon Helium or Monstro at 8K 24fps to a 240GB Red Mini-Mag would record on average 259MB/s and give just 16 minutes of record time (per mag). Upping the compression to 10:1 would double the record time and halve the data rate. At the highest compression of 22:1, the figures would be 59MB/s and 69 minutes. You can calculate your own figures from the Red website: https://www.red.com/recording-time
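Those figures are just mag capacity divided by write rate; a quick sketch (a rough approximation, not Red’s own calculator) reproduces them:

```python
def record_minutes(mag_gb, rate_mb_per_s):
    """Approximate record time for a mag: capacity (taking 1GB = 1024MB)
    divided by the average write rate, converted to minutes."""
    return mag_gb * 1024 / rate_mb_per_s / 60

# 240GB mag at 259MB/s -> about 16 minutes per mag
# 240GB mag at 59MB/s (22:1 compression) -> about 69 minutes per mag
```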
Netflix recommends a Redcode value of between 5:1 and 8:1. UK rental house Proactive has done some useful groundwork on recording 8K with newer Red cameras like the Monstro, Helium and Gemini.
It concludes that the majority of productions shooting Red use 8:1 as it offers “a fantastic balance between quality at the highest level, and practical data rates for the production to handle.”
The big surprise, though, finds Proactive, is that if you use the Monstro in 8K at 8:1 as your standard compression level, it actually becomes much more manageable than the raw formats from Red’s competition, even some ProRes formats. This becomes even more obvious when you go down to the 5K Gemini sensor. It found that at 8:1, the Gemini actually has smaller file sizes in 5K 16-bit RAW than the Sony Venice does in 4K XAVC-I, which isn’t a RAW format.
Cinegy’s codec, Daniel2, specifically targets 8K and higher resolution production. Weigner claims it is up to 20x faster than Apple ProRes or Avid DNxHR.
“With Daniel2, 8K production is as fast and easy as HD production, albeit requiring considerably more storage,” he asserts. “But since the days of HD we have also seen storage costs decrease massively while storage speed, thanks to the advent of SSDs, has increased dramatically. Putting these factors together allows 8K production on inexpensive laptops or computers costing well below $2000 with standard NLE software such as Adobe Premiere.”
Weigner says that he edits 8K on a three-year-old Dell laptop without any issues or speed problems. This, of course, uses the Daniel2 codec accelerated by GPU inside Adobe Premiere and exported using H.264 or HEVC for distribution using Cinegy’s GPU accelerated export plugin.
“This may not satisfy high-end workflows, but will be sufficient for the average news, sports, even documentary production,” he says. “Editing these long GOP formats is much tougher. But depending on the NLE, and with the use of on-the-fly proxies or render caches and hardware acceleration via graphics cards, this does not need to be the case.”
Arguably, making a production in 8K will future-proof it to mitigate any risk and make it more attractive for sale in the long term.
“In the end this all depends on the type of production and how many cameras are needed and how much you will shoot using which codec and so on,” Weigner says. “Making clever decisions to begin with will reduce a lot of pain, headaches and ultimately cost.”
With the loss of this summer’s Olympic Games, the move to 8K content for some broadcasters is now on hold. Jenny Priestley talks to Cinegy’s Jan Weigner about the future of adoption for the new resolution
2020 was supposed to be the year 8K would take the world by storm. Both NHK in Japan and RAI in Italy had announced plans to broadcast this summer’s Olympic Games in 8K, the first time the new resolution would be seen on TV in the home - at least in Europe. But then the coronavirus pandemic hit and the Games were postponed to 2021.
While some in the industry have been preparing for the gradual adoption of 8K, last year Cinegy announced that its entire software-defined television product range is 8K capable. In fact, the company’s CTO Jan Weigner says it’s been working with 8K size video and much bigger resolutions in earnest for more than five years. “We’ve been developing video codecs for almost 20 years, so we need to anticipate what is required in the future as this is not developed overnight,” adds Weigner. “Most broadcasters are not even UHD yet and many are still to go fully HD. The broadcast industry is not the pacesetter anymore and has not been for a very long time.
Cinegy prides itself on being very forward-thinking, having developed technologies that have been used in HPC (High Performance Computing) or Artificial Intelligence for a long time. The company takes those technologies and applies them to video processing, which enables it to achieve processing speeds that Weigner describes as “orders of magnitude faster than what the traditional vendors achieve. This makes our technology equally relevant for other sectors such as gaming, AR/VR, enterprise communication, GIS, medical and defence. We are happily serving not just the broadcast industry,” he adds.
While NHK and RAI are ready to be the leaders of the broadcast world, there remain many linear channels that are still unable to offer content in HD, let alone 4K or 8K. According to Weigner, it will be OTT platforms that lead the way in the move to the new resolution. “8K broadcasting outside Japan is not going to happen,” he argues. “Even the value of having a, or should I say one, live 8K broadcast via satellite is highly dubious. The bandwidth required at the moment consumes an entire satellite transponder. Other than for the sake of technical ‘showing off’ this is pointless.
“Via broadband internet or 5G, the 50-100 Mbit/s required for a very good quality HEVC, or soon AV1, encoded 8K stream is easily delivered. 8K delivery via OTT is possible today to tens of millions of homes around the globe. People watching Netflix in UHD could also be watching in 8K in terms of technical infrastructure required (and of course needing an 8K TV).”
Weigner also cites some broadcasters taking their time to adopt IP as another reason why we won’t see 8K content on linear channels for quite a while. “They are not going to touch 8K for many years other than maybe the odd proof-of-concept,” he says. “Most have not moved to UHD yet and they have equally not managed to go to IP successfully. For now, IP and 8K are mutually exclusive both technically and budgetarily. IP and UHD is difficult and also expensive at least when using SMPTE 2110.
“On the production side, things look a bit different,” he continues. “High-profile productions that are expected to have a long shelf life and high resell opportunity are already being future-proofed by being shot in 8K, even if the final output is just 4K/UHD. But the original footage exists in 8K and 8K versions can be made whenever required or a joint venture partner like NHK mandates.”
As stated at the beginning, this summer’s Olympics was supposed to be the showcase for both broadcast equipment and consumer technology to show off their years of work and preparation in terms of 8K. Weigner describes the postponement of the Games to 2021 as a “disaster of the highest order” for all those companies who were planning to debut their new tech ahead of the Games. “The double whammy of losing NAB 2020 and Tokyo 2020 being postponed means that we have seen no new 8K television cameras publicly launched.
“We can record, edit, mix and playout 8K without any problems, but some 8K TV cameras that are actually shipping for real and in numbers would be helpful too. As a result of this failure to launch the Tokyo 8K bonanza, both smartphones and gaming will take most of the 8K glory away from the Games by being there first.”
“The Samsung Galaxy S20 is the first publicly available smartphone that can record 8K video, I have one in my pocket,” Weigner adds, “and many other vendors will release their 8K capable phones this year. This will drive 8K adoption more than anything else during 2020.”
What does Weigner think the next 12-24 months hold for the future of 8K in terms of adoption by the broadcast industry? Again, he believes it will be smartphones that lead the way. “Consumers have started to record 8K by the millions with the release of the Samsung Galaxy S20 Plus and Ultra, the two S20 models that are 8K capable,” he explains. “Tens of millions will do 8K video recording with their smartphones before the end of 2020. Next year 8K recording will become a standard feature for mid-range smartphones.
“8K TVs are already the choice for anyone buying a top-range TV – or those buying a new TV for the coming 4-5 years,” he continues. “With prices continuing to drop, 8K TVs will start displacing top- and middle-range 4K TV offerings. This has nothing to do with what consumers will watch on those TVs. That will most likely still be HD and maybe some UHD. But why buy a 4K TV when the 8K TV is the same or similar price, but offers a future-proof investment? This is exactly the same situation we have today with 4K TVs. We all have them already but there is very little 4K TV content to watch – other than Netflix and Amazon, which will also be first to give us genuine 8K content to watch.”
Finally, does Weigner see a time when we move past 8K and start adopting 16K or even higher? “The ‘industry’ will adopt resolutions higher than 8K, that is for sure, but it will not be ‘broadcast’ as we know it,” he says.
“The top-range Samsung Galaxy S20 model has a 108MP CMOS sensor,” Weigner notes. “Sensors with even higher resolution have existed for quite some time, but they were not made for smartphones or video recording. 150MP sensors for use in smartphones are coming soon, which will allow 2x2 pixel binning for higher sensitivity in darkness whilst still maintaining full 8K resolution, but other than that, this is a real 16K sensor which will deliver good-looking 16K stills and video in daylight scenarios.
“For high-end VR recording, setups using six or more 8K cameras are being used,” says Weigner. “The stitched output video size easily exceeds 16K. For a fully immersive VR experience, the video resolution can never be high enough. 16K seems a lot, but for a 360-degree VR recording this equates to a little over 5K for the normal 120-degree field of view we humans have. For real immersive VR video recording with 8K resolution for these 120 degrees, we need 24K horizontal resolution. If you want stereo vision for the proper 3D effect – double it.
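The numbers in that quote are straight proportion — the share of a full 360-degree panorama that a given field of view occupies. A quick check (function names are illustrative):

```python
def pixels_in_fov(panorama_w, fov_deg):
    """Horizontal pixels a viewer actually sees of a 360-degree panorama."""
    return panorama_w * fov_deg / 360

def panorama_width_for(target_w, fov_deg):
    """Panorama width needed so the given FOV reaches a target resolution."""
    return target_w * 360 / fov_deg

# A 16K (~16,000px) panorama viewed over 120 degrees: ~5,333px, i.e.
# "a little over 5K". To see 8K (7680px) across 120 degrees you need
# a ~23,000px (~24K) panorama - and double that again for stereo 3D.
```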
“You think this is mad and can’t be done?” laughs Weigner. “Cinegy can do up to 64K today with our Daniel2 video codec - so bring it on!”
By Contributor | TVBEurope | Published 18th June 2020
TVBEurope recently featured an interview with senior staff at a major playout centre, talking about the “uberisation” of playout, and noting that they now had the capabilities to playout a broadcast channel using only software applications.
As a follow-up, we talk to three vendors who have been leaders in advocating virtualised software platforms, capable of running in the machine room or in the Cloud. Jan Weigner of Cinegy, Adam Leah of nxtedition and Ciáran Doran of Pixel Power, a Rohde & Schwarz Company, gave their views.
“Being a software company is the only thing Cinegy has ever done – we have been saying ‘SDI must die’ for years,” Weigner says. “We never considered being a hardware company. Our first systems used MPEG-2, because that was what was available when we started out in 2006. It just made sense to us to keep the content in MPEG-2 rather than continually converting back to baseband, because every conversion step degrades the signal.”
Doran adds that Pixel Power started out making hardware decades ago because it was the only way to get the performance its software products needed. “As soon as it was practical we moved away from being a heavy metal company. Pixel Power was the first to offer premium broadcast graphics on COTS hardware, and from there we became the first to develop software-only automation.”
Swedish vendor nxtedition started life as a systems integrator, and found that automation systems invariably had gaps between the supposedly fully-functional hardware products. “We wanted to provide our customers with a system that not only worked, but reduced complexity,” Leah explains. “So we developed the functionality in software, because that is the obvious way to do it.
“Systems should be easy to use, and easy to maintain,” he adds. “If you reduce complexity you do not need so many technicians, so you can employ more journalists and creative talent. And if the system is so intuitive you can learn it in a couple of hours, you can be more productive. Our technology is largely used in news production, and being first and fast with the news is always the primary driver.”
While the nxtedition platform is designed as a single-source solution, it does include APIs and the implementation of open standards. For Pixel Power, Doran emphasises that “open standards absolutely have to be the way to go.
“Broadcasters have always regarded themselves as different, wanting specific functionality for their unique operations. With ST-2110, they can continue to demand best of breed solutions.”
Weigner agrees that open standards are vital, but adds a note of caution. “ST-2110 is designed to be used within one facility – other standards exist for the long haul.
“DVB is an IP signal. UDP as a standard is 40 years old. All the building blocks for IP connectivity between facilities and functions have been in place for 25 years or more.”
All three agree that to achieve the necessary performance, software systems for broadcast need to be built on an architecture that minimises the processor demands by only using the precise functionality needed from moment to moment.
Microservices form the foundation of virtualisation, and virtualisation leads inevitably to discussion of the Cloud.
“Cloud is a conversation starter,” Doran says. “People want to talk Cloud, but the reality is that it is more secure and more cost-effective to do it on-premise. The business model of the Cloud is that it costs little or nothing to upload: the costs are in the download. So do the maths.”
Adam Leah of nxtedition adds, “Because video servers get very big, they need to be near at hand. Having them on premises works out significantly less expensive – we did the sums for one of our clients, and third-party server charges worked out at around three times the capital cost per year.”
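Doran’s “do the maths” point is easy to sketch: ingress is cheap or free, but a playout channel streams out continuously and pays egress on every byte. The per-gigabyte price below is an illustrative assumption, not any provider’s actual tariff:

```python
def monthly_egress_cost(bitrate_mbps, price_per_gb, hours=720):
    """Egress cost of streaming a channel out of the cloud for a month.

    bitrate_mbps: outbound stream rate in megabits per second.
    price_per_gb: assumed egress price in dollars per gigabyte.
    """
    gb_out = bitrate_mbps / 8 * 3600 * hours / 1000  # Mbit/s -> GB per month
    return gb_out * price_per_gb

# One 10 Mbit/s channel at an assumed $0.09/GB: ~3.2TB out per month,
# roughly $290 in egress alone, before any compute costs.
```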
He also says that latency is a critical issue. “Broadcasting is very hungry: we need a new frame every few milliseconds. But the Cloud is not about synchronous delivery, it is about scale. It really doesn’t matter if it takes 100 milliseconds or 220 milliseconds to authorise a credit card transaction, but these delays can be problematic in delivering video.”
“What people really want is virtualisation,” emphasises Cinegy’s Weigner. “The Cloud is just virtualisation running on someone else’s computer.” For an application like broadcast, where processes are pretty constant, you do not need the elasticity, so why pay someone else to provide a service you could do yourself?
One area where elastic scale is a positive benefit is in disaster recovery. “We have been preparing for the wrong sort of disaster,” Doran says. Planning for business continuity has traditionally been based on a lack of access to the primary facility because of fire or flood, so all the staff get in cars to drive to a replica installation somewhere else.
Covid-19 has brought a different sort of disaster: the staff cannot get to any sort of facility, at least not in the usual numbers. So the ability to access playout from anywhere becomes very desirable.
“German broadcasters, for instance, are looking into a common, shared playout facility,” adds Doran. “If you can access a playout installation in one region, why can’t you access it from home? You only need KVM, and IP KVM has negligible latency.”
Weigner makes the point that the Cloud business model of very low cost uploads plays into this disaster recovery application, as you can have all the content and software ready and waiting for only hundreds of dollars a year, and spin-up playout channels very quickly should it become necessary.
“It is not only about CPUs,” he says. “One of Cinegy’s early projects was about accelerating video using GPUs – we have 20 years’ experience. GPU virtualisation in the cloud tremendously reduces footprint. You only need one CPU core to run an HD channel if you have GPU acceleration. So you can run an HD channel for maybe 20 cents an hour.”
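That 20-cents figure is simple division: take an assumed hourly price for a GPU-equipped cloud instance and spread it across the HD channels it can host when each channel needs only one CPU core. The instance size and price here are assumptions for illustration:

```python
def cost_per_channel_hour(instance_price_per_hour, channels_per_instance):
    """Hourly cost of one playout channel sharing a GPU-equipped instance."""
    return instance_price_per_hour / channels_per_instance

# An assumed $1.60/hour instance with 8 CPU cores, one HD channel per
# core thanks to GPU acceleration: $0.20 per channel-hour.
```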