

By Jenny Priestley | TVBEurope | Published 23rd March 2020

How modern archive systems can help media companies make the most of their legacy content


As numerous linear TV schedules shudder to a halt with the cancellation of live sport, or postponement of production on continuing dramas, broadcasters are looking to alternative programming to fill the sizeable void left behind.

TVBEurope asks three experts how modern archive systems can help media companies make the most of their legacy content.

“If broadcasters already have them in place right now filled with the relevant metadata, this allows them instantly to call up content and build new schedules around themes or topics to captivate the audience,” explains Jan Weigner, CEO at Cinegy.

“Better yet, if archive systems were also used during production, new programming can be created out of the existing raw material. This is perfectly illustrated by the archived raw footage of BBC NHU’s Planet Earth series, which was ‘gold mined’ for dozens of other projects. The same can be done for documentaries in general, but also educational content, training, reality TV, other unscripted formats, as well as news and sports.

“This of course requires keeping more material in the archive than just what went into the first aired programme,” Weigner continues. “As storage gets ever cheaper and video compression better, while the range of formats and platforms that need to be served increases, it is a wasted opportunity not to keep as much of the original raw footage as possible. Or to vary a popular phrase: one man’s B-roll is another man’s new programme!”

With many members of staff now working from home, how easy is it for production teams to access digitised archive content? According to Jeff Braunstein, director of product management at Spectra Logic, this is where the Cloud can shine: “The use of Cloud is becoming more and more a part of the storage and workflow landscape,” he explains. “Storage management software supports movement of files to popular Cloud platforms, be it for Cloud-based workflows or disaster recovery purposes.

“Broadcasting organisations can leverage this software to create a multi-tier private Cloud to store a copy of frequently accessed assets on online disk, and archive infrequently or rarely accessed assets indefinitely on the Perpetual Storage Tier (which can consist of Cloud, object storage disk, and tape). This enables sharing content globally by utilising the public Cloud’s inherent infrastructure to make content available to disparate users and sites worldwide.”
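The placement rule behind such a multi-tier setup can be sketched minimally as follows. The tier names and the 30-day threshold here are our assumptions for illustration, not Spectra Logic's actual product logic:

```python
from datetime import datetime, timedelta

def placement_tier(last_accessed, now=None):
    """Choose a storage tier from how recently an asset was accessed.

    Illustrative only: real policies are configurable per organisation.
    """
    now = now or datetime.now()
    if now - last_accessed < timedelta(days=30):
        return "online-disk"   # frequently accessed: keep a copy on fast disk
    return "perpetual"         # rarely accessed: cloud / object storage / tape

now = datetime(2020, 4, 1)
print(placement_tier(datetime(2020, 3, 20), now))  # online-disk
print(placement_tier(datetime(2019, 1, 1), now))   # perpetual
```

In a real deployment the storage management software applies a rule like this continuously, so assets migrate between tiers as their access patterns change.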

But what if a broadcaster doesn’t have content stored in the Cloud? Are there fast turnaround options/solutions for providers sitting on a hoard of legacy content but who lack, perhaps, a fit-for-purpose archive or MAM system?

“It depends on the type of content and where that content sits, but there are many fast turnaround options based on content discovery and media indexing tools,” says Julian Fernandez-Campon, CTO at Tedial. “These are modern MAM solutions that allow a quick scan and bulk ingest of content into the MAM without moving it while generating some basic information. It can also create a proxy to provide first access to content, which can be enriched later with AI or any other automatic analysis tools.”
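The scan-in-place approach Fernandez-Campon describes can be sketched roughly as follows — walk the storage tree and index basic technical metadata without moving any media. The paths, fields and file extensions here are hypothetical, not Tedial's actual API:

```python
import os
import hashlib

MEDIA_EXTENSIONS = {".mxf", ".mov", ".mp4", ".avi"}  # illustrative set

def fingerprint(path, chunk=1024 * 1024):
    """Hash the first megabyte of a file, e.g. for duplicate detection."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read(chunk)).hexdigest()

def index_in_place(root):
    """Scan a storage tree and build a basic index without moving any media."""
    index = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() not in MEDIA_EXTENSIONS:
                continue
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            index.append({
                "path": path,                  # content stays where it is
                "size_bytes": stat.st_size,
                "modified": stat.st_mtime,
                "fingerprint": fingerprint(path),
            })
    return index
```

A real MAM would also extract container and codec information and generate proxies; this only shows the "index without moving it" idea that makes bulk ingest fast.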

“There are no miracles especially if the content is still on tape,” adds Weigner. “File-based material can be imported with the speed being largely scalable. Tapes on the other hand are normally ingested in real-time, but this can also be scaled to some extent. The bigger problem is that if the content is still on video tape then you are fighting a losing battle against technology extinction and availability of playback equipment to make this happen at all. In this regard celluloid-based content is the lesser problem, but of course needs to be telecined as well (and maybe in UHD this time around).”

Of course, any drastic changes to profitable scheduling will have a financial impact for the providers. What are the monetisation options for the distribution of archive content through linear and OTT channels to help media companies maintain revenue?

“There are many, and they will depend on the target audience,” says Fernandez-Campon. “With new OTT platforms, it’s possible to monetise media by creating focused channels or a series of content for specific target audiences that cannot find what they want on other platforms. We have seen this in some productions on platforms like Netflix, where series or documentaries are published to engage a specific sector.”

“The monetisation options are almost binary. If you don’t know what you have and can’t access it in real-time you stand to make no money at all,” says Weigner. “If you have real-time access to all your content and it is all digital already, you can slice and dice and package and sell it in dozens of different ways immediately. If it just sits on some shelves, no matter whether it is on video tape, celluloid, data tape, or so on, it is just dead data.”



By Adrian Pennington | InBroadcast | Published 14th April 2020

A review of technologies enabling production companies and broadcasters to deliver high-quality content to viewers while optimising costs and resources, and eliminating travel.

Whilst the world grapples with the coronavirus outbreak, we are seeing not only how people modify their behaviour, but also how businesses must modify theirs. Events are being cancelled, and travel is being scaled back and replaced with teleconferencing. Many corporations have sent staff home to work where it is possible to do so.

This is all made possible because we as a society already have much of the technology to facilitate flexible working. Give your office-based staff a laptop and access to the internet, and they are ready to sit in their home office or at their kitchen table.

“What has changed in the last few weeks is that working remotely is no longer a work-life balance argument, or a nice-to-have, it is now a question of business continuity,” says Jan Weigner, CEO, Cinegy. “The crisis is forcing companies to reevaluate their ways of working and finally act upon it. The technological infrastructure is in place and we have the tools ready to go – from acquisition through production to distribution, all can be handled remotely and/or in the cloud.”

With bases in the UK, mainland Europe, the Middle East, Australia and North America, the company’s teams are able to service regional customers without the risk of the virus affecting workflows or production needs. Bee-On, its cloud-based audience engagement platform, runs on AWS for access anywhere with a web browser and internet connection, “so there is no need for production teams to be managed under one roof,” CEO Scott Davies says.

“Individual projects can be pre-planned and packaged with audience generated content and dynamic visualisations prior to delivery/broadcast of live or pre-recorded content. Viewers continue to watch, more so during a crisis, so content producers need to continue programming and deliver captivating content, with audience engagement a priority – Bee-On can help deliver this.”

He adds, “We’re seeing a need for packaged end-to-end solutions that utilise cloud production and seamlessly integrate ‘off-the-shelf’ graphics and compatibility with native broadcast graphics for a wide range of programming, such as news, live events and popular chat shows. Gone are the days when production is managed and delivered from one hub.”

Demand for Quicklink’s video call management system has never been higher, according to CEO Richard Rees. The firm is releasing a completely browser-based, cloud-supported workflow with automated Panasonic PTZ camera and lighting.

“A journalist could sit at home and interview someone located elsewhere live to air while a colleague edits the video online (in Adobe Premiere) and in real time,” says Rees. “That edit could be passed to a control room for wider channel distribution. The whole environment is now virtualised. We believe this is the future.”

VSN has added new capabilities for remote interoperability to its VSN NewsConnect web plugin for news production. These were on the cards for a NAB release, but recent events have made them more relevant.

VSN NewsConnect, which brings together a number of third party tools required for news production, now enables users to control multiple studios in different locations, even if the systems used in the studios are different.

“What this means is that a journalist can simply send a news item to any studio and NewsConnect will automatically ensure that the delivered content matches the format requirements of the receiving devices,” said Patricia Corral, marketing director. “This remote interoperability is very useful in enabling news to be repurposed to the requirements of local broadcasters without worrying about technical compatibility.”

Pixel Power’s work is currently mainly based around large projects for refurbishment or replacement of playout and production infrastructure; projects with long timescales, so the current viral outbreak isn’t yet causing any major changes in demand.

“Our technology can be virtualized and deployed in data centre or public cloud, with remote access operation from anywhere in the world,” explains James Gilbert, CEO. “This is not something that can be done as an impulse reaction to the current situation - this capability has to be architected and designed into the product from the beginning.”

Once the outbreak subsides, the evolution of remote, decentralised working practices is likely to accelerate. “The industry is already moving towards remote, decentralised working practices because of the ecological and economic benefits,” Gilbert says. “The ability of staff to work from any location is core to that concept and whilst it is an obvious advantage during the current outbreak where staff may be required to, or choose to, work from home, I do not feel the pace of change will be accelerated - there are already enough drivers for it.”

Collaborative workflows – whether with someone sitting next to you or on the opposite side of the world – are in the DNA of storage solutions specialist GB Labs.

“We’ve fostered cloud integration for years and therefore have always offered a remote workflow,” says Dominic Harland, CEO/CTO. “Obviously, there will be many other challenges with this ongoing situation, but GB Labs is confident that accessing content securely and quickly will not be one of them.”

He thinks current events will accelerate solutions to enable a faster response to any future crisis. “The next two/three months is not long enough to develop, test and bring to market anything exceptional, but we are definitely looking at developing new products and new solutions. Whether this becomes a real-world advantage that the customer will want to buy after the outbreak subsides, well, that’s a different question.”

Each Bridge Technologies product has transformative potential in the field of remote broadcast and production, but none more so than its Widglets API. This leverages the full value of data collected by its VB440 - video, audio and ancillary - not only for network performance monitoring but also for a multitude of other workflows and applications. Full motion, colour-accurate, ultra-low-latency video, for example, can be made available from any source to any application or user.

“Being browser based, all that is required is a laptop and a network connection,” explains Tim Langridge, head of marketing. “Each geographically dispersed user receives feeds from multiple cameras with multiple waveform vectorscopes and streams via a single HTML5 video monitor view. Not only does this result in incredible technical improvements in production and improved decision making, but it also logistically frees up immense amounts of room in OB vans or MCRs – making them more efficient, affordable and adaptable.”

Blackbird has seen a significant increase in sales enquiries since the containment phase began. “Enterprises need effective technology solutions to enable their workforces to operate efficiently whilst working at home or remotely,” says CEO, Ian McDonough. “Blackbird is a fully featured video editor available in any browser and can operate at low bandwidth. It's the perfect solution for the majority of live and file-based video production workflows.”

Essentially, Blackbird can be used by anyone, any time, anywhere, and this flexibility is enormously attractive to enterprises looking to drive massive productivity efficiencies through their operations. It also runs on bandwidth as low as 2Mb/s, which is ideal given the pressure on network traffic – a situation that has caused Netflix and YouTube to throttle back their bitrates.

“As teams become used to de-centralised video production and enterprises enjoy significant infrastructure savings together with a flexible globally distributed workforce untethered to source content, we anticipate an accelerated adoption of Blackbird,” McDonough adds.

For live sports workflows, there are few production partners more experienced than Gravity Media. In February it wrapped its 2,000th remote production, in this case a Pac-12 Networks broadcast of the USC Trojans’ 65-56 win over the Washington State Cougars.

This impressive number includes ‘At Home’ centralized productions that were undertaken under the Proshow Broadcast (acquired by Gravity Media in July 2018) and Gearhouse Broadcast brands.

The benefits of this remote approach are obvious, with REMIs offering a cost-efficient modern workflow that is operationally flexible and durable. By centralizing the control room, video switching, audio mixing, graphics, replays and show production can all be done ‘At Home’ in the broadcast centre. This means that smaller, more affordable purpose-built mobile units can be used at the venue. Only video and audio acquisition hardware such as engineered cameras, microphones and announcer headsets, as well as comms hardware, a transmission interface and engineering support are required on site.

Company president Michael Harabin says, “The potential for creating quality programming at an attractive price has never been greater, and we now have over 2,000 proof points that showcase its consistent effectiveness and our ability to deliver.”

Sweden’s Intinor specialises in helping companies overcome the challenges of remote production. “As we are currently in a travel lockdown for personnel, the benefits of remote production could be felt all the more keenly,” says Daniel Lundstedt, regional sales manager. “Instead of having to arrange for operators to travel on location, broadcasting companies could instead work with local talent, with equipment being all that needs to be shipped rather than staff members.”

Intinor is already able to make going live, from anywhere, very easy, without marshalling a small (but expensive) army to make it happen. It’s all down to the “supreme mobility” of its Direkt link remote production pack. With an Intinor Direkt receiver or router in a control room, captured audio and video from a camera or mixer connected to a backpack can be streamed over public internet to a Direkt router and then re-streamed using other protocols, transcoded or output to SDI or NDI.

Mobile Viewpoint has a heritage in remote production solutions, especially for live streaming. CEO Michel Bais says the company has a proven record of reducing costs for production companies, which no longer have to send a wealth of resources to an event.

“As we see companies trying to reduce their carbon footprint, it has emerged that it is not only cost savings that are driving these innovations,” he tells InBroadcast. “In line with this philosophy, we have developed remote cameras that allow sports games to be live streamed but without the need for a camera crew or an onsite production team.”

With the IQ-Sports Producer, an entire field of play can be recorded with a single 4x4K camera, while AI is used to create a virtual zoom of the play by automatically following players and the ball. Games can be live streamed in real time and in different format versions, depending on whether it is for web streaming or for higher quality broadcasts requiring HD-SDI workflows, all at a fraction of the cost of an on-site production team.

vPilot is another AI driven solution from Mobile Viewpoint that can be used for remote newsrooms. A combination of cameras using 3D sensors and audio cues means round-table discussions can be set up without the need for a camera team or an onsite director. “Both IQ-Sports Producer and vPilot can be managed remotely with cameras that can be semi-permanently installed to create quality and cost-effective programming,” Bais says.

Net Insight’s plug-and-play solution Nimbra extends the production workflow to reach remote venues anywhere on the globe, with the same ease of operations as for traditional in-house productions.

Nimbra is a high-quality multi-service media transport over IP platform supporting both native video and audio in addition to standard IP/Ethernet. Built-in video processing, low-latency JPEG 2000 and MPEG-4 encoding, as well as unique features for equipment control and synchronisation, make it a great choice for remote production. Users include SVT and TV2 Denmark.

“100 percent reliability is key for remote live production and our solution offers mechanisms to assure the content is delivered with perfect quality regardless of network issues,” the company states. “Enterprise customers can use the solution to deliver live video content to support internal communications and working remotely.”

All of Cinegy’s software solutions lend themselves to flexible working practices. “We have long been a proponent of virtualization and IP – and what is the cloud if not just using someone else’s computer, hosted somewhere else?” says Weigner.

“Give your office-based staff a laptop, access to the internet and access to Cinegy software – locally or in the cloud – and they are ready to remotely produce content using Cinegy Desktop, remotely play out content with Cinegy Air, and remotely monitor channels with Cinegy Multiviewer. Whether our customer is at home or at another location and needs to set up a pop-up channel in the cloud doesn’t matter.

“Our customers who already embraced our workflows are more prepared and ready to deal with the new business practices that are emerging,” he argues. “Being ready for this business process change is markedly harder than being ready for a technology change. In this case, circumstances are dictating that there must be change. The barriers are being lowered and it is time to embrace it.”


By Adrian Pennington | Creative Planet Network | Published 13th April 2020

There’s no escaping the fact that 8K is four times as many pixels as 4K but recording 8K is easier and less expensive than you think.

For many, the idea of recording 8K video understandably conjures up images of unmanageable file sizes, long transfer times, huge piles of hard drives and slow proxy workflows, not to mention a black hole in the budget.

What’s more, with the biggest showcase for 8K TV—the Tokyo Olympics—delayed, the demand for content delivered in 8K is likely to stay in the bottle a little longer.

Leaving aside for one moment the fact that HDR and HFR are far more valuable than resolution to the consumer’s eye, there are benefits to an 8K production which an increasing number of projects are taking advantage of.

Mank, directed by David Fincher and lensed by Erik Messerschmidt, ASC, acquired in 8K using the RED Monstro in monochrome, and Money Heist, the Netflix drama which in season four is shot at 7K to accommodate HDR in a 4K deliverable, are just two of the most recent.

You can’t sell productions made in less than 4K to Netflix and other streaming services now. One day soon, some will mandate 8K from the outset, and Netflix will already have its fair share in the bank.

Even if the final output is only going to be 4K/UHD, shooting in 8K gives you many options in post that you do not have when starting in 4K. These include downscaling, cut/crop (pan/scan) or headroom for VFX.

“Before making the decision to capture a project in 8K, producers and cinematographers need to consider the project’s long-term goals,” says Bryce Button, director of product marketing, AJA Video Systems. For instance, capturing in 8K makes a lot of sense if there will be future use for the material.

“And even if not currently working in 8K nor planning to move to 8K in the future, 8K acquisition can also be hugely beneficial for capturing background plates for VFX and for virtual sets in live broadcast,” Button continues. “Having a larger raster for the background gives producers the confidence that as they zoom, pan and tilt around the background video plate or set, they’ll be delivering the cleanest possible imagery.”

When selling your shots, for example to stock footage outlets, 8K still manages to command considerably higher prices and is much rarer, so there is a chance to sell more and make more money at the same time. 8K is still a unique selling point and as Barry Bassett, the MD at London-based camera rental house VMI puts it, “That means bragging rights.”

Acquisition Options

“If you can acquire in 8K, there is no good reason not to do it,” urges Jan Weigner, Co-Founder & CTO at broadcast and post software developer Cinegy. “This is the same question that we were supposed to ponder when the switch to 4K happened. Currently camera rental cost for 8K can be higher, but in terms of total production costs, your budget would have to be seriously constrained or require many simultaneous cameras to not be able to shoot in 8K.”

Producing in 8K is no different to 4K: The availability of hardware to capture, edit and store 8K makes the high-resolution format unavoidable. There are also now tools to answer the demand from HD SDR to 8K HDR, and everything in between.

“All the necessary parts of the 8K puzzle are in place,” says Atomos CMO Dan Chung.

All current NLEs handle 8K, at least if you are using the latest version.

The main costs that will hit your pocket are camera rental or purchase and the proper lenses to go with it. RED cameras are pretty much the only option for an 8K TV or feature workflow, but there should be healthy competition at the rental houses. Other options, such as Sony, Ikegami and Sharp 8K TV cameras, might use the latest 8K Canon lenses, and that can be costly.

Canon’s announcement, in February, of an 8K DSLR was a game-changer in that respect. “Not so long ago if you wanted to shoot 8K anywhere near affordably you had to shoot RED,” Chung remarks. “Now you can do so on a prosumer camera. Canon has clearly laid down a marker that others are sure to follow.”

Details including price, release date and even sensor are sparse, but Canon says the full-frame EOS R5 will feature a blistering 20fps electronic shutter, dual memory card slots, and in-body image stabilisation to provide shake-free movies.

“There’s a misconception that 8K is vastly more expensive than it actually is,” says Button. “Generally, moving to 8K is an incremental cost, especially if you’re already working in 4K or have worked in stereo 3D. The biggest expense often comes with storage and moving large volumes of data, but the strides made by the industry to support 4K and S3D have provided a strong foundation to support the data needs that 8K workflows require.”

Recording and Monitoring Options

By nature, 8K is a massive format and is therefore inherently data-intensive. As such, in certain circumstances, it may be advantageous to avoid shooting a fully uncompressed 8K video and instead seek out codecs that keep data sizes manageable where the balance between data size and perceived quality is preserved.
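A back-of-the-envelope calculation shows why uncompressed 8K is so rarely practical. The sampling assumptions here (10-bit 4:2:2 at 24fps) are ours for illustration; actual camera formats vary:

```python
# Uncompressed 8K data rate under assumed 10-bit 4:2:2 sampling.
WIDTH, HEIGHT, FPS = 7680, 4320, 24
BITS_PER_PIXEL = 20  # 10-bit luma + ~10 bits of chroma per pixel (4:2:2)

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
gbits_per_second = bits_per_frame * FPS / 1e9
print(f"{gbits_per_second:.1f} Gbit/s")  # ≈ 15.9 Gbit/s uncompressed
```

At roughly 2GB per second, even short uncompressed takes swamp storage and transfer budgets, which is why the codec trade-offs discussed below matter so much.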

“As with any project, it’s crucial to always start with the end in mind,” advises Button. “If uncompressed footage is a necessity for everything from video effects needs to deep color work, uncompressed will always offer a range of advantages.

However, he notes, many projects – whether for broadcast or other delivery methods – may be better served using codecs specially designed for editing and grading, where media and workload savings on workstations can be incredibly advantageous.

“Apple ProRes, for example, has been tuned to specifically provide resolution details and color depth that are more than acceptable while providing the appropriate media bandwidth storage and minimizing CPU strain.”

In terms of monitoring, 8K displays are just beginning to surface and are still scarce – but, as Weigner points out, so are inexpensive, cinema-quality, reference-grade HDR 4K screens.

“You could use UHD/4K monitors or TVs and just zoom in when necessary,” he says. “Brand name 8K TVs sized 65” or even 75” can be bought well below US$3,000, and they usually have a decent enough image that can be tuned manually to meet certain TV production demands.”

AJA offers audio and video I/O solutions like the KONA 5 to facilitate downconversion and monitoring of 8K content on 4K displays in real-time, whether for editing or other tasks. AJA says it is working very closely with major NLE and color grading companies to ensure that its Desktop Software and KONA I/O cards provide a seamless 8K creative experience whether working on macOS, Windows, or Linux workstations.

For many projects, the codec will be defined by what the camera produces, unless one uses an external recorder.

The Atomos Neon line of cinema monitors and recorders comes with a 4K master control unit, but the firm has additionally announced an 8K master control unit, which can upgrade every Neon to an 8K recorder. The unit allows for recording and monitoring 8K video at 60fps. Both ProRes and ProRes RAW are supported straight from the camera sensor.

“If you go 8K you need ProRes RAW since this allows you to get a manageable file size and all the benefits of working with raw data,” says Chung.

Shooting RED

Users of RED cameras will be familiar with Redcode RAW, the proprietary recording format. Redcode is always compressed – there is no uncompressed version – and is claimed by RED to be visually lossless; there is no chroma subsampling or colour space attached to the R3D RAW files. Visually lossless is usually good enough for any type of post-production, including green screen work.

For example, using a Weapon Helium or Monstro at 8K 24fps with a 240GB Red Mini-Mag would record at an average 259MB/s, giving just 16 minutes of record time per mag. Upping the compression to 10:1 would double the record time and halve the data rate. At the highest compression of 22:1, the figures would be 59MB/s and 69 minutes. You can calculate your own figures from the Red website.
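The record-time arithmetic is simple to reproduce; the figures are the article's, and only the helper function itself is ours:

```python
def record_minutes(mag_gb, data_rate_mb_per_s):
    """Minutes of record time for a mag of the given size at a given data rate."""
    return mag_gb * 1000 / data_rate_mb_per_s / 60  # GB -> MB, then seconds -> minutes

# 8K 24fps on a 240GB Red Mini-Mag:
print(f"{record_minutes(240, 259):.0f} min")  # ~15-16 min at 259MB/s
print(f"{record_minutes(240, 59):.0f} min")   # ~68 min at 59MB/s (22:1 compression)
```

The same helper makes it easy to budget mags for a shoot day once you have settled on a compression ratio.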

Netflix recommend a Redcode value of between 5:1 and 8:1. UK rental house Proactive has done some useful groundwork on recording 8K with newer Red cameras like the Monstro, Helium and Gemini.

It concludes that the majority of productions shooting Red use 8:1 as it offers “a fantastic balance between quality at the highest level, and practical data rates for the production to handle.”

The big surprise, though, Proactive finds, is that if you use the Monstro in 8K at 8:1 as your standard compression level, it actually becomes much more manageable than the raw formats from RED’s competition, even some ProRes formats. This becomes even more obvious when you go down to the 5K Gemini sensor: at 8:1, the Gemini actually has smaller file sizes in 5K 16-bit RAW than the Sony Venice does in 4K XAVC-I, which isn’t a RAW format.

The Codecs

Cinegy’s codec, Daniel2, specifically targets 8K and higher resolution production. Weigner claims it is up to 20x faster than Apple ProRes or Avid DNxHR.

“With Daniel2, 8K production is as fast and easy as HD production, albeit requiring considerably more storage,” he asserts. “But since the days of HD we have also seen storage costs decrease massively while storage speed, thanks to the advent of SSDs, has increased dramatically. Putting these factors together allows 8K production on inexpensive laptops or computers costing well below $2,000 with standard NLE software such as Adobe Premiere.”

Weigner says that he edits 8K on a three-year-old Dell laptop without any issues or speed problems. This, of course, uses the Daniel2 codec accelerated by GPU inside Adobe Premiere and exported using H.264 or HEVC for distribution using Cinegy’s GPU accelerated export plugin.

“This may not satisfy high-end workflows, but will be sufficient for the average news, sports, even documentary production,” he says. “Editing these long GOP formats is much tougher. But depending on the NLE, the use of on-the-fly proxies or render caches and hardware acceleration by using graphics cards this does not need to be the case.”

Arguably, making a production in 8K will future-proof it to mitigate any risk and make it more attractive for sale in the long term.

“In the end this all depends on the type of production and how many cameras are needed and how much you will shoot using which codec and so on,” Weigner says. “Making clever decisions to begin with will reduce a lot of pain, headaches and ultimately cost.”



By Jenny Priestley | TVBEurope | May 2020 Edition

With the loss of this summer’s Olympic Games, the move to 8K content for some broadcasters is now on hold. Jenny Priestley talks to Cinegy’s Jan Weigner about the future of adoption for the new resolution

2020 was supposed to be the year 8K would take the world by storm. Both NHK in Japan and RAI in Italy had announced plans to broadcast this summer’s Olympic Games in 8K, the first time the new resolution would be seen on TV in the home - at least in Europe. But then the coronavirus pandemic hit and the Games were postponed to 2021.

While some in the industry have been preparing for the gradual adoption of 8K, last year Cinegy announced that its entire software-defined television product range is 8K capable. In fact, the company’s CEO Jan Weigner says it’s been working with 8K-size video and much bigger resolutions in earnest for more than five years. “We’ve been developing video codecs for almost 20 years, so we need to anticipate what is required in the future as this is not developed overnight,” adds Weigner. “Most broadcasters are not even UHD yet and many are still to go fully HD. The broadcast industry is not the pacesetter anymore and has not been for a very long time.”

Cinegy prides itself on being very forward-thinking, having developed technologies that have been used in HPC (High Performance Computing) or Artificial Intelligence for a long time. The company takes those technologies and applies them to video processing, which enables it to achieve processing speeds that Weigner describes as “orders of magnitude faster than what the traditional vendors achieve. This makes our technology equally relevant for other sectors such as gaming, AR/VR, enterprise communication, GIS, medical and defence. We are happily serving not just the broadcast industry,” he adds.

While NHK and RAI are ready to be the leaders of the broadcast world, there remain many linear channels that are still unable to offer content in HD, let alone 4K or 8K. According to Weigner, it will be OTT platforms that lead the way in the move to the new resolution. “8K broadcasting outside Japan is not going to happen,” he argues. “Even the value of having a, or should I say one, live 8K broadcast via satellite is highly dubious. The bandwidth required at the moment consumes an entire satellite transponder. Other than for the sake of technical ‘showing off’, this is pointless.

“Via broadband internet or 5G, the 50-100 Mbit/s required for a very good quality HEVC, or soon AV1, encoded 8K stream is easily delivered. 8K delivery via OTT is possible today to tens of millions of homes around the globe. People watching Netflix in UHD could also be watching in 8K in terms of technical infrastructure required (and of course needing an 8K TV).”
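Weigner's delivery figures are easy to sanity-check: at his quoted bitrates, an hour of 8K OTT viewing lands in the tens of gigabytes. The arithmetic (ours, not his) is:

```python
def gb_per_hour(mbit_per_s):
    """Data consumed by one hour of streaming at a given bitrate."""
    return mbit_per_s * 3600 / 8 / 1000  # Mbit/s over 3600s, bits -> bytes, MB -> GB

print(f"{gb_per_hour(50):.1f} GB/h")   # 22.5 GB/h at 50 Mbit/s
print(f"{gb_per_hour(100):.1f} GB/h")  # 45.0 GB/h at 100 Mbit/s
```

That is heavy but within reach of fibre and 5G connections, which is why OTT delivery of 8K is plausible where a satellite transponder per channel is not.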

Weigner also cites some broadcasters taking their time to adopt IP as another reason why we won’t see 8K content on linear channels for quite a while. “They are not going to touch 8K for many years, other than maybe the odd proof-of-concept,” he says. “Most have not moved to UHD yet and they have equally not managed to go to IP successfully. For now, IP and 8K are mutually exclusive, both technically and budgetarily. IP and UHD are difficult and expensive, at least when using SMPTE 2110.

“On the production side, things look a bit different,” he continues. “High-profile productions that are expected to have a long shelf life and high resell opportunity are already being future-proofed by being shot in 8K, even if the final output is just 4K/UHD. But the original footage exists in 8K, and 8K versions can be made whenever required or whenever a joint venture partner like NHK mandates it.”

As stated at the beginning, this summer’s Olympics were supposed to be the showcase for both broadcast equipment and consumer technology to show off their years of work and preparation in terms of 8K. Weigner describes the postponement of the Games to 2021 as a “disaster of the highest order” for all those companies that were planning to debut their new tech ahead of the Games. “The double whammy of losing NAB 2020 and Tokyo 2020 being postponed means that we have seen no new 8K television cameras publicly launched.

“We can record, edit, mix and playout 8K without any problems, but some 8K TV cameras that are actually shipping for real and in numbers would be helpful too. As a result of this failure to launch the Tokyo 8K bonanza, both smartphones and gaming will take most of the 8K glory away from the Games by being there first.

“The Samsung Galaxy S20 is the first publicly available smartphone that can record 8K video, I have one in my pocket,” Weigner adds, “and many other vendors will release their 8K capable phones this year. This will drive 8K adoption more than anything else during 2020.”

What does Weigner think the next 12-24 months hold for the future of 8K in terms of adoption by the broadcast industry? Again, he believes it will be smartphones that lead the way. “Consumers have started to record 8K by the millions with the release of the Samsung Galaxy S20 Plus and Ultra, the two S20 models that are 8K capable,” he explains. “Tens of millions will do 8K video recording with their smartphones before the end of 2020. Next year 8K recording will become a standard feature for mid-range smartphones.

“8K TVs are already the choice for anyone buying a top-range TV – or those buying a new TV for the coming 4-5 years,” he continues. “With prices continuing to drop, 8K TVs will start displacing top- and middle-range 4K TV offerings. This has nothing to do with what consumers will watch on those TVs. That will most likely still be HD and maybe some UHD. But why buy a 4K TV when the 8K TV is the same or similar price, but offers a future-proof investment? This is exactly the same situation we have today with 4K TVs. We all have them already but there is very little 4K TV content to watch – other than Netflix and Amazon, which will also be first to give us genuine 8K content to watch.”

Finally, does Weigner see a time when we move past 8K and start adopting 16K or even higher? “The ‘industry’ will adopt resolutions higher than 8K, that is for sure, but it will not be ‘broadcast’ as we know it,” he says.

“The top-range Samsung Galaxy S20 model has a 108MP CMOS sensor,” Weigner notes. “CMOS sensors with even higher resolution have existed for quite some time, but they were not made for smartphones or video recording. 150MP CMOS sensors for use in smartphones are coming soon, which will allow 2x2 pixel binning for higher sensitivity in darkness whilst still maintaining full 8K resolution. Beyond that, this is a real 16K sensor which will deliver good-looking 16K stills and video in daylight scenarios.
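The binning arithmetic checks out. A small sketch, taking 8K and 16K as 7680x4320 and 15360x8640 and the sensor resolutions as quoted:

```python
def megapixels(width: int, height: int) -> float:
    """Frame or sensor resolution in megapixels."""
    return width * height / 1e6

MP_8K = megapixels(7680, 4320)     # ~33.2 MP per 8K frame
MP_16K = megapixels(15360, 8640)   # ~132.7 MP per 16K frame

# A 150 MP sensor binned 2x2 (four photosites per output pixel) still
# delivers 37.5 MP, comfortably above what an 8K frame requires, while
# the unbinned sensor slightly exceeds a full 16K frame.
binned_mp = 150 / 4
print(binned_mp, binned_mp > MP_8K, 150 > MP_16K)
```

The same maths shows why today’s 108MP sensor stops short: binned 2x2 it yields 27 MP, below the 33 MP an 8K frame needs.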

“For high-end VR recording, setups using six or more 8K cameras are being used,” says Weigner. “The stitched output video size easily exceeds 16K. For a fully immersive VR experience, the video resolution can never be high enough. 16K seems a lot, but for a 360-degree VR recording this equates to a little over 5K for the normal 120-degree field of view we humans have. For truly immersive VR video recording with 8K resolution across those 120 degrees, we need 24K horizontal resolution. If you want stereo vision for the proper 3D effect – double it.
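The field-of-view arithmetic in that quote can be sketched directly, using the ~120-degree human field of view Weigner cites:

```python
def horizontal_pixels_needed(pixels_per_view: int, fov_degrees: float = 120) -> int:
    """Horizontal resolution a 360-degree recording needs so that the
    viewer's field of view receives pixels_per_view pixels."""
    return round(pixels_per_view * 360 / fov_degrees)

# A 16K-wide (15360 px) 360-degree recording leaves a 120-degree view
# with only a third of that: a little over 5K.
print(15360 * 120 // 360)                  # 5120

# True 8K (7680 px) across 120 degrees needs ~24K around the sphere,
# and stereo vision for the 3D effect doubles it again.
print(horizontal_pixels_needed(7680))      # 23040
print(horizontal_pixels_needed(7680) * 2)  # 46080
```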

“You think this is mad and can’t be done?” laughs Weigner. “Cinegy can do up to 64K today with our Daniel2 video codec - so bring it on!” 


By Contributor | TVBEurope | Published 18th June 2020

TVBEurope recently featured an interview with senior staff at a major playout centre, talking about the “uberisation” of playout, and noting that they now had the capability to play out a broadcast channel using only software applications.

As a follow-up, we talk to three vendors who have been leaders in advocating virtualised software platforms, capable of running in the machine room or in the Cloud. Jan Weigner of Cinegy, Adam Leah of nxtedition and Ciáran Doran of Pixel Power, a Rohde & Schwarz Company, gave their views.

“Being a software company is the only thing Cinegy has ever done – we have been saying ‘SDI must die’ for years,” Weigner says. “We never considered being a hardware company. Our first systems used MPEG-2, because that was what was available when we started out in 2006. It just made sense to us to keep the content in MPEG-2 rather than continually converting back to baseband, because every conversion step degrades the signal.”

Doran adds that Pixel Power started out making hardware decades ago because it was the only way to get the performance its software products needed. “As soon as it was practical we moved away from being a heavy metal company. Pixel Power was the first to offer premium broadcast graphics on COTS hardware, and from there we became the first to develop software-only automation.”

Swedish vendor nxtedition started life as a systems integrator, and found that automation systems invariably had gaps between the supposedly fully-functional hardware products. “We wanted to provide our customers with a system that not only worked, but reduced complexity,” Leah explains. “So we developed the functionality in software, because that is the obvious way to do it.

“Systems should be easy to use, and easy to maintain,” he adds. “If you reduce complexity you do not need so many technicians, so you can employ more journalists and creative talent. And if the system is so intuitive you can learn it in a couple of hours, you can be more productive. Our technology is largely used in news production, and being first and fast with the news is always the primary driver.”

While the nxtedition platform is designed as a single-source solution, it does include APIs and the implementation of open standards. For Pixel Power, Doran emphasises that “open standards absolutely have to be the way to go.

“Broadcasters have always regarded themselves as different, wanting specific functionality for their unique operations. With ST-2110, they can continue to demand best of breed solutions.”

Weigner agrees that open standards are vital, but adds a note of caution. “ST-2110 is designed to be used within one facility – other standards exist for the long haul.

“DVB is an IP signal. UDP as a standard is 40 years old. All the building blocks for IP connectivity between facilities and functions have been in place for 25 years or more.”

All three agree that to achieve the necessary performance, software systems for broadcast need to be built on an architecture that minimises the processor demands by only using the precise functionality needed from moment to moment.

Microservices form the foundation of virtualisation, and virtualisation leads inevitably to discussion of the Cloud.

“Cloud is a conversation starter,” Doran says. “People want to talk Cloud, but the reality is that it is more secure and more cost-effective to do it on-premise. The business model of the Cloud is that it costs little or nothing to upload: the costs are in the download. So do the maths.”
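Doran’s maths is easy to sketch. The numbers below are assumptions for illustration only: an 8 Mbit/s HD channel delivered 24/7, and a public-cloud egress price of $0.09/GB (real tariffs vary by provider, region and volume):

```python
def monthly_egress(bitrate_mbps: float, price_per_gb: float = 0.09,
                   hours: float = 730) -> tuple:
    """Gigabytes leaving the cloud per month for a continuous stream,
    and their cost at an assumed per-GB egress price."""
    gb = bitrate_mbps * 1e6 * 3600 * hours / 8 / 1e9
    return gb, gb * price_per_gb

# One HD channel, streamed around the clock for a ~730-hour month:
gb, cost = monthly_egress(8)
print(f"{gb:.0f} GB out per month, roughly ${cost:.0f} in egress alone")
```

Around 2.6 TB and a few hundred dollars a month per channel, before compute or storage, which is the asymmetry Doran is pointing at: uploads cost little, downloads add up.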

Adam Leah of nxtedition adds, “Because video servers get very big, they need to be near at hand. Having them on premises works out significantly less expensive – we did the sums for one of our clients, and one third party’s server charges worked out at around three times the capital cost per year.”

He also says that latency is a critical issue. “Broadcasting is very hungry: we need a new frame every few milliseconds. But the Cloud is not about synchronous delivery, it is about scale. It really doesn’t matter if it takes 100 milliseconds or 220 milliseconds to authorise a credit card transaction, but those delays can be problematic in delivering video.”

“What people really want is virtualisation,” emphasises Cinegy’s Weigner. “The Cloud is just virtualisation running on someone else’s computer.” For an application like broadcast, where processes are pretty constant, you do not need the elasticity, so why pay someone else to provide a service you could run yourself?

One area where elastic scale is a positive benefit is in disaster recovery. “We have been preparing for the wrong sort of disaster,” Doran says. Planning for business continuity has traditionally been based on a lack of access to the primary facility because of fire or flood, so all the staff get in cars to drive to a replica installation somewhere else.

Covid-19 has brought a different sort of disaster: the staff cannot get to any sort of facility, at least not in the usual numbers. So the ability to access playout from anywhere becomes very desirable.

“German broadcasters, for instance, are looking into a common, shared playout facility,” adds Doran. “If you can access a playout installation in one region, why can’t you access it from home? You only need KVM, and IP KVM has negligible latency.”

Weigner makes the point that the Cloud business model of very low-cost uploads plays into this disaster recovery application, as you can have all the content and software ready and waiting for only hundreds of dollars a year, and spin up playout channels very quickly should it become necessary.

“It is not only about CPUs,” he says. “One of Cinegy’s early projects was about accelerating video using GPUs – we have 20 years’ experience. GPU virtualisation in the cloud tremendously reduces footprint. You only need one CPU core to run an HD channel if you have GPU acceleration. So you can run an HD channel for maybe 20 cents an hour.”


By Lewis Kirkaldie | Cinegy | Published 6th July 2020

As the ramifications of the Coronavirus pandemic descended on all our lives, those of us who could made the shift to working from home wherever possible.

While in our industry a great many of us were already fairly used to doing at least some work from home on occasion, we weren’t normally doing it while simultaneously wondering whether we should invest in the next trade show, where we would find our next egg or bag of flour, or if we’d be able to earn enough to buy it if we did.

Some, like Cinegy, implemented business continuity plans and simultaneously took advantage of the period of cocooning for some deep introspection, focus, and idea generation. It also gave us and many others a chance to work out how to prepare for what we expect a new normal to be like (which in our view is basically to bring forward by a year or two what it was going to look like anyway). It has to be said that although it was a bitter pill to swallow, not having to prepare for, engage in, and follow up major trade shows has, at least in the short term, had its benefits.

Many of us have been forced – not unwillingly – to learn a lot more, in a compressed space of time, about how to work remotely and successfully, without leaving our current, often home-based, workspaces; gaps that used to be bridged by a daily commute now have to be filled from home. I’ve missed the smell of a whiteboard pen…

In broadcast, one of the first things to go while working remotely is cables – running SDI leads to staff houses isn’t going to be viable (or will just create some truly epic trip hazards). Enter the alternative – the Secure Reliable Transport (SRT) protocol. SRT demonstrates its value with two key strengths:

  • Secure - you can use SRT for AES 256-bit audio and video stream encryption, which is critical in times like these with widely dispersed workforces and distribution channels. Circuits that were once under physical lock and key at a broadcast or production facility now must travel back and forth to someone’s home or workspace, wherever in the world that might be.
  • Reliable - if someone is operating a channel or conducting an interoperability test from home, as a great many are right now, they must have a reliable stream to ensure that what they think they are transmitting is actually happening. It’s also possible that they could be up-link contributing or simply carrying on as best they can to fulfil deployment obligations. In any case, it’s imperative that people can have the confidence that what they believe is happening is reliably taking place.
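One practical knob behind that reliability is SRT’s receiver latency, the buffer that gives lost packets time to be retransmitted. A commonly cited rule of thumb from SRT deployment guidance – used here as an assumption for illustration, not a fixed formula – is to set it to roughly four times the round-trip time, with SRT’s 120 ms default as a floor:

```python
def srt_latency_ms(rtt_ms: float, multiplier: float = 4, floor_ms: int = 120) -> int:
    """Suggested SRT receiver latency: ~4x RTT, never below the default."""
    return max(round(rtt_ms * multiplier), floor_ms)

# A contributor working from home, 60 ms round trip to the facility:
print(srt_latency_ms(60))   # 240 ms of headroom for retransmissions
# On a local network the 120 ms default already dominates:
print(srt_latency_ms(5))    # 120
```

Lossier links call for a larger multiplier; the point is that reliability over the public internet is bought with a small, tunable amount of delay.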

SRT provides that security and confidence. To describe SRT as a safety net below a high wire would be a poor metaphor. SRT laughs at the safety net, uses that high wire to drop some civil engineers at the far end, and throws up a four-lane suspension bridge. Then it sets off fireworks from the support struts for New Year and loans out selfie-sticks for tourists. SRT is a bit of a show-off.

And recently, SRT was put through its paces in the second of a series of global SRT interop plugfests hosted by Haivision. Over three days, vendors from around the world joined forces to list and provide SRT-enabled streams so that participants could test the interoperability of their respective technologies, with SRT acting as a truly open interoperability enabler. The midst of a pandemic, truly awful as it is, proved an ideal opportunity to focus on the kind of widespread and highly detailed remote testing that many organisations cannot always do as thoroughly as they would like during their usual course of operations.

You can only do that kind of testing, especially in current conditions, with some form of formal, cross-industry collaboration – in this case the SRT Alliance, an industry-wide open-source initiative dedicated to overcoming the challenges of low-latency video streaming, which now has more than 350 member companies.

And 350 members and counting is pretty close to saturation point for companies with a vested interest in video streaming – a level of membership almost unheard of in any field of interest. Collectively, these companies have seized upon the vision of SRT Alliance founders Haivision, plus early adopters such as Cinegy and strong supporters such as Microsoft and Avid, among many others. It is actually far easier today to name the handful of companies that aren’t members of the SRT Alliance, and that’s great for everyone involved and the industry at large.

SRT has changed the way companies work. Following the initial paralysis of the pandemic, many companies realised they should have moved their disaster recovery plan further up the company agenda and took immediate steps to bring it to the fore. Those and other plans are now being implemented, albeit while working under a number of understandable constraints.

One of the upshots of this is that it has shone a more positive light on both the viability of home working for some and the benefits of cloud computing. Why sit in the same physical location as the technology your broadcast backend runs on? The concept of distributed platforms is finally getting the traction it deserves – platforms that require no operational human visits suddenly look like a silver bullet. If something is amiss in London, you can fix it by spinning up a new virtual machine from your sofa in Singapore.

As a result, cloud-based operations have carried on throughout recent events with little or no disruption. Those who have such an operation now appreciate it even more, and those who don’t have started looking far more seriously at how they might migrate some or all of their relevant operations to that model.

In short, remote production has gone from being the catchphrase of the moment to a proven, fully legitimate working practice that now also encompasses the expanding possibilities of many other forms of remote working, including from home.

SRT has been, and is, one of multiple catalysts enabling these shifts, the predominantly positive workplace and even cultural ramifications of which will continue to emerge in the coming months, years and perhaps decades, as the shape of content delivery continues to redefine – or create – “normal”.

The ability to jointly and/or independently confirm SRT interoperability greatly accelerates its deployment and implementation, which in turn streamlines the delivery of high-quality, low-latency video across the public internet – which, in layman’s terms, translates as “one less thing to worry about”.

And one less thing to worry about, at home or the office, is a universal “yes please” these days.



By Jenny Priestley | TVBEurope | Published 30th July 2020

Cinegy's Jan Weigner on why it's time lawmakers mandated media archives.

Being asked to write an opinion piece about thoughts concerning MAM, archive and storage and all the related technologies is a great opportunity for reflection on the last 20 years I have been in this industry. Another aspect that plays into this introspection is the global pandemic and the even more recent #BlackLivesMatter protests, which have ignited a whole other debate regarding our not-too-distant history and how it is embodied in public monuments such as statues of past dignitaries or, closer to our industry, in old TV programmes such as episodes of Fawlty Towers or films like Gone with the Wind.

Starting on an emotional level, the immediate response is an overwhelming feeling of frustration and a sense of self-entitled “I told you so”-ness, however helpful that is.

But the fact is that, after all these years, too many broadcasters and media companies still do not have an overarching technical archive strategy.

In many cases the situation is even worse than it ever was, as tapes and other media have deteriorated and are gone for good. Moving to digital “workflows” has not made things any better. Silos, departmental islands, regional islands, different buckets of storage anywhere you look. Many, many databases, all leading their independent lives or dying quietly when projects or shows end. Each of these silos has its own social media “strategy”, publishing its bits to YouTube, Twitter, Facebook, TikTok or whatever the respective country’s local flavour of these is.

On the one hand, if you still have not digitised your SD video tapes, don’t bother; save us all the trouble of attempting to do so today. That ship has sailed, and a number of decades of television are lost for good. On the other hand, there is not much hope for the future either, unless publishing clips to YouTube is the archive strategy. But does gifting potentially valuable content to the social media quasi-monopolies absolve you of your legal, historical and ethical responsibilities to maintain a properly managed archive? Seemingly, yes.

But that is not the answer. It can’t be. There is no guarantee that any of these social media platforms will persist. Nothing is constant other than change. Remember MySpace – the fabulous platform once backed by Murdoch? Google+ anyone? A company does not even need to disappear into obscurity. It is enough for it to decide that this is something it doesn’t do anymore.

Archiving what gets published to social media platforms is of historical importance. The Trump presidency should have made that abundantly clear. How can anyone look back 50 years from now and not examine how social media was far more instrumental than television? The endless televised press conferences are important historic records too, and they need to be kept.

Who do we rely on to do this? Who chooses what is kept and what is not? That choice alone shapes future views of these events.

The United States at least has the Library of Congress, but it is equally unprepared to deal with the avalanche of data brought on by all the social media platforms. It preserves books, magazines, television and film for future generations. But a lot of our public lives take place online. Does not every YouTube video with more than one million hits also deserve eternal preservation in the Library of Congress? Or only those with US political relevance? No to Canadian clips? Or British ones? No space for Brexit recordings?

Where is the Library of Congress for social media platforms in all the respective countries? Answer: there isn’t one. If the rotting tape archive of your organisation is a real problem, the digital black hole we have created is an infinitely bigger one.

The likes of Google, Facebook and all the many others will not willingly guarantee preservation and archiving of all the content they harbour. No one is safe from bankruptcy or “business changes”.

Anyone who publishes to the airwaves, streams live or puts content on social media in a professional capacity must be obliged to archive it for at least 10 years. Think of a media Sarbanes-Oxley Act that requires media professionals and companies to archive. While we are at it, I would also mandate a standardised set of metadata, plus tamper-proof digital fingerprinting to prevent alteration.

Oh no – the cost, the additional work, and all the other blah blah blah people will come up with! The financial industry survived the Sarbanes-Oxley Act quite well.

There is no excuse. There has never been one. This never has been about technology. Not for decades.

Storage costs? 1,000TB or 1PB of disk storage can be had for less than $40K. That would hold approximately 40K hours of XDCAM HD422. 40K hours for $40K – or one dollar per hour, easy enough to remember. This gets cheaper all the time. There should be no business that can’t factor this into its business model. Also, the “nuisance” of a mandated archive will help protect the business and will act as legal evidence in case of disputes.
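That dollar-an-hour figure holds up to a quick check against XDCAM HD422’s 50 Mbit/s bitrate, ignoring audio overhead, proxies and filesystem losses:

```python
def hours_of_video(capacity_tb: float, bitrate_mbps: float) -> float:
    """Hours of video at a given bitrate that fit in a given store."""
    return capacity_tb * 1e12 * 8 / (bitrate_mbps * 1e6) / 3600

hours = hours_of_video(1000, 50)          # 1 PB of XDCAM HD422 at 50 Mbit/s
print(f"{hours:,.0f} hours")              # ~44,000 hours
print(f"${40_000 / hours:.2f} per hour")  # ~$0.90, i.e. roughly a dollar
```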

It is not the money. Seemingly archive is not sexy. Worse, most managers see it purely as a cost factor with no or little upside to it.

I could now start my diatribe on how short-sighted this is, and on how a corporation-wide archive that makes all assets immediately available to anyone planning or producing news, sports, drama, documentaries, children’s programming and even reality immediately pays back in spades. By “available” I mean during production too, including rushes, and not just the select bits that end up being broadcast or published to streaming services or social media.

But the reality is that especially with larger organisations this falls on deaf ears on many levels.

The questions managers seem to ask themselves, consciously or subconsciously, are: a) Will this get me promoted? b) Will I still be there when this is all done and dusted and I stand to take credit for it? c) Will this show up positively on this or next quarter’s bottom line? d) Do I get to go to the cool parties for doing this? As the answers are mostly negative, so are the chances of pulling off the big picture.

Yes, in the silos we will find Production Asset Management systems or news systems with PAM, maybe some with an attached MAM, mostly used to migrate storage. This all falls into the category of production workflow acceleration, but it is not aimed at strategic, long-term archiving.

Ultimately, lawmakers need to mandate that media companies and professionals maintain archives, and to specify exactly what and how – including social media. Again, the Sarbanes-Oxley Act comes to mind. Without this, no adoption of long-term archive strategies will occur, and those who do maintain archives will shape and define future generations’ perceptions of this period in time. Who do you want that to be?


By Adrian Pennington | IBC 365 | Published 6th August 2020

The industry isn’t stopping at 8K. All bets are off in a million megapixel-plus future where massive digital screens, VR and lightfields take centre stage.

Ultra High Definition is far from the last word in TV resolution. Though not yet widespread, the industry is going beyond 8K UHD and entering the era of “Super Resolution” where there are no limits to what can be achieved.

Common broadcast systems may not reach, let alone exceed, 8K any time soon, but Super Resolution technologies enabling new creative options and visual experiences will eventually consign even 4K imaging to a blur.

“The moment when the resolution ceases to matter and we can cover 12K or 16K resolution per eye for VR (virtual reality) - which requires 36K or 48K respectively - we are getting somewhere,” says Jan Weigner, Cinegy co-Founder & CTO. “Then there is volumetric video. 12K just gets us warmed up.” 


Reading, UK, 19 August 2020: Starfish Technologies, a pioneer in transport stream processing and advertising insertion, has been granted European and US patents that cover a range of techniques employed within its TS Splicer product. The TS Splicer is a software-based solution for clean switching of encoded transport streams and is used for media switching and content replacement for advertising insertion and slating.

Starfish Marketing Director Peter Blatchford said, “We are incredibly proud to have been granted patents that cover a number of techniques that were invented by our engineering team. The benefits of these techniques enable the TS Splicer to offer clean switching of encoded streams across different modes of operation. Our customers have long recognised the benefits in the performance of our product and we are delighted that this intellectual property is now formally protected.”


Contact us

Please contact us with any questions.

Manor Marketing, 4 Olympus House, Calleva Park, Aldermaston, RG7 8SA