Cinegy has appointed ElectronicVerve as its exclusive master reseller for India. ElectronicVerve is a specialist in helping broadcasters and media organizations to design and implement IP workflows, ensuring a smooth transition as customers migrate from their legacy infrastructures. With its IP expertise and its strong relationships in the broadcast, gaming and corporate sectors, ElectronicVerve is strongly placed to help Cinegy meet the needs of a dynamic Indian market.
Founded by Arjun Dhawan, ElectronicVerve carries forward the legacy of the late Naresh Dhawan, Managing Director of long-time Cinegy distributor Setron India P. Ltd., and continues a strong history of supporting Cinegy customers in the Indian market.
“We have now irrevocably moved towards a work-from-anywhere model in our industry, and customers increasingly want – and need – the flexibility to leverage the best talent, regardless of their location, and implement ultra-agile workflows that take collaboration and creativity to the next level,” said ElectronicVerve’s Founder and Managing Director Arjun Dhawan. “IP is the most efficient way to meet these needs, and Cinegy is widely recognized by the market as the premium system. Cinegy delivers an unmatched combination of price and functionality: the entire workflow, from acquisition to delivery, is based on standard IT hardware and is highly customizable. Additionally, this is a simple plug-and-play solution that anyone with good basic knowledge can set up and configure.”
With a raft of customer wins in the last year, Cinegy’s Air PRO software-based system for HD and UHD 4K playout automation is establishing itself as the go-to choice for small and medium IP deployments, as well as large enterprise roll-outs. The system currently supports a wide range of applications in India, including general entertainment channels, travel and lifestyle channels, YouTube channels, news channels, cable and distribution hubs, and monitoring solutions.
Cinegy Managing Director Daniella Weigner said, “We are delighted to continue our long relationship with Arjun and the team at ElectronicVerve. India is an important market for us and with its rising demand for flexible, scalable IP-based workflows, their IT and IP expertise will be central to our efforts to best serve our customers as they transition from traditional workflows and established ways of working to the intelligent workflows that IP enables.”
ElectronicVerve is currently working on several broadcast projects where Cinegy solutions will feature.
iSIZE, the AI-tech company that specializes in deep learning for intelligent and sustainable video delivery, is exhibiting as part of Connected Media|IP at NAB 2021 and will show the latest version of BitSave, its fully artificial intelligence (AI)-powered technology. This first-of-a-kind solution leverages deep perceptual optimization to improve video quality while delivering significant bitrate savings for all video encoding standards.
iSIZE has also been selected to present a paper – entitled Perceptually-Optimized AI-Based Video Delivery Over Existing Standards – on Sunday, October 10 at 15:20, presented by the company’s CTO and Co-Founder, Yiannis Andreopoulos. Additionally, Andreopoulos is participating in a panel session entitled Meeting the Demand for a Content Everywhere Future.
The latest updates to BitSave mean it is now capable of achieving even greater bitrate savings, along with a leap forward in video quality and optimization. iSIZE is also demonstrating a new AI-based video codec de-noising technology and a generative video compression innovation at NAB 2021. A server-side preprocessing enhancement, BitSave gives customers an elegant way to integrate AI with existing systems and encoders to achieve these savings without moving to new codec standards.
iSIZE CEO Sergio Grce said, “The accelerating demand for higher quality video streaming and the need to deliver their video content as cost efficiently as possible are challenges that everyone who streams media now faces. We know that the amount of video content is only going to increase and that the demand for higher resolution content is on an upward trend – Cisco forecasts that 56.8 percent of the global IP video traffic will be HD and 22.3 percent will be Ultra HD by 2022. Studies also show that the growing carbon footprint of video streaming has overtaken that of the airline industry. BitSave addresses all three of these major challenges in a cost-efficient way without compromising the end viewer experience.”
iSIZE’s patent-pending technology is fully codec independent, increasing the efficiency and performance of all the latest codec standards, including AVC/H.264, HEVC/H.265, and VP9. This ensures seamless integration with existing media workflows, without breaking any standards or requiring any changes in the streaming process or the client devices.
Intinor Technology, Sweden’s leading developer of products and solutions for high-quality video over the internet, will be showcasing its flagship Direkt series of remote production solutions at NAB 2021. The company will also demonstrate the recently released ultra-low latency feature, which delivers less than half a second of end-to-end latency for complex contribution links. Visitors to the show will be able to see first-hand why the Direkt series has proved itself invaluable for enabling hybrid events in recent times.
Also being highlighted at the Intinor booth is the newly added synchronised streams feature, which makes the Intinor Direkt series even more plug and play simple. This brings real, practical benefits for broadcasters looking for a way to support remote production without compromising operational flexibility or on-screen quality. Thanks to Direkt, two-way interviews are made easy; the solution eliminates the issue of people talking over each other and ensures that live sport really is live.
Intinor CEO Roland Axelsson commented: “Our Direkt series pulls together all the elements that a production needs for a seamless remote operation. We can now offer our customers the same low latency they get with a Zoom call, but with full broadcast quality audio and video. With the latest enhancements to Direkt we are giving broadcasters a simple way to handle all the practicalities of a hybrid production without having to change the way they work.”
Direkt is a highly robust, professional-quality dedicated encoder for contribution circuits, presented in an easy-to-use environment. Intinor has designed the system to be straightforward for operators and technicians to set up, and has made operation easy enough that on-screen talent can work the system from a home studio without breaking isolation.
Cinegy GmbH, the premier provider of software technology for digital video processing, asset management, video compression and automation, and playout, has appointed Mikhail Efimov as regional sales manager for Eastern Europe, Russia and CIS (the former Soviet states). He joins Cinegy from Perspectiva, and brings with him many years’ technical and commercial experience in the broadcast industry.
Efimov, a Russian native, is a graduate of the St Petersburg University of Cinema and Television, where he specialised in the acoustics of studios and control rooms. He went on to do postgraduate work in digital video processing as well as audio. He added business experience with his most recent job, at Perspectiva, which works to develop localised systems for the television industry.
“I am looking forward to bringing my technical background to customers’ projects,” Efimov said, “and to developing strong relationships with our partners and users, to help them make the most of Cinegy’s software-based products and services for the broadcast industry.”
Daniella Weigner, CEO of Cinegy, said “Russia and Eastern Europe have long been a strategic market for us, and we have had an exciting 18 years of successful business driven by our long-standing partners and clients. Mikhail’s technical, commercial and communications skills are going to push relationships even further forward. He will help our partners and our customers use our innovative technologies to develop world-beating solutions, delivering business advantages reliably and cost-effectively.”
Based in Moscow, he joins Cinegy on September 1, 2021.
Mo-Sys VP Pro 4.27 to release simultaneously with Epic Games’ release of Unreal Engine 4.27
Mo-Sys™ Engineering (www.mo-sys.com), a world leader in virtual production and image robotics, today announces the launch of Mo-Sys VP Pro 4.27 with a raft of new features that will support Epic Games’ Unreal Engine 4.27. During the preview period of Unreal Engine 4.27, Mo-Sys has used the new update to complete a full multi-camera shoot with Amazon. The update was also put through a comprehensive, multi-camera technical rehearsal for a major Netflix production.
Mo-Sys has been working on expanding the feature set and capabilities of VP Pro, which runs inside Unreal Engine Editor, for some time now and will launch the new version on the same day as Epic Games’ Unreal Engine 4.27. The VP Pro 4.27 upgrade brings four key new features: an improved compositor, NearTime® rendering, an online lens distortion library, and remote control capability.
Michael Geissler, CEO Mo-Sys Engineering Ltd., said, “Virtual production is seeing a real surge, but it does bring challenges. Typically, directors and cinematographers must make continuous calculations to make the real and virtual elements match in post-production. This is often a costly and repetitive process. Mo-Sys is a pioneer in providing a platform for graphics rendering as well as camera and lens tracking to create higher quality virtual productions. With VP Pro 4.27 we have made it even easier to produce seamless, high-end productions and we are also unique in making our 4.27 update available on the same day as Epic Games launch Unreal Engine 4.27.”
Mo-Sys has completely overhauled its internal compositing system, changing the way compositing is handled across all modes. The latest update of VP Pro now provides improved support for high-end graphics features, such as refraction, and delivers a 15% performance improvement. Other features include:
· Support for reflection and refraction of video in CG objects
· Improved support for advanced ray-tracing features
· Support for fur and groom
· Advanced controls for CG shadows falling onto video
The updated VP Pro 4.27 also widens the Beta program for the new NearTime® rendering workflow, giving access to more users. NearTime is an automated, cloud-based re-render that dramatically increases and homogenises visual quality and allows for higher resolutions (UHD, 4K, 8K) without the performance restrictions of real-time rendering. The moment a shot is complete, a high-quality, high-resolution, cloud-based NearTime render starts, with no compromise on performance or visual quality. With NearTime, users can “turn up all the dials” for unrestricted quality at a fraction of the cost of completing the process in post-production.
As part of the 4.27 update, Mo-Sys is also launching its online lens distortion library giving users access to a wide selection of calibration tools and allowing them to tweak their lenses on-set in a highly cost-effective way.
New for VP Pro 4.27, the VP Remote iPad interface is fully customizable and supports control of multiple engines, and camera chains, from a single control panel. The Beta version was trialled at the technical rehearsal with Netflix.
Mo-Sys™ Engineering recently launched its new Onset VP Services office based in Los Angeles to provide an out-sourced solution for production companies new to virtual production. This has enabled US creative and production specialists Papertown to quickly and accurately create an ambitious virtual production project featuring a passenger plane in a hangar.
Papertown is an experienced creative production agency specializing in computer graphics (CGI). Its client, Business Made Simple, one of the most successful management coaching organizations in North America, had the concept of a management training course based around the analogy that every business should run like an airplane, so a plane in a hangar was chosen for the video concept.
Renting and preparing an aircraft for shooting inside a hangar would have been expensive and impractical, so Papertown proposed shooting the speaker against a green screen and creating the plane, hangar and other items in CGI. This is where Mo-Sys Onset VP Services came into play, bringing in the virtual production expertise required.
Having just opened a branch in LA offering virtual production services to assist with this new way of filming, the Mo-Sys Onset VP Services team were able to help Papertown set up a virtual production studio to merge the CGI and real-world elements together. This arrangement enabled Papertown to focus on imaging and storytelling without the time-drain of organizing their own virtual production workflow or needing advanced knowledge of the latest technology.
Mo-Sys, world leader in camera tracking, image robotics and virtual production solutions, created StarTracker to provide precision 6-axis camera tracking, enabling 3D movement in a virtual production scene. Combined with highly accurate real-time lens data, the full StarTracker data set via Mo-Sys VP Pro drives the Unreal Engine’s virtual graphics so they accurately emulate the real camera shot, delivering convincing virtual scenes.
Using StarTracker’s ability to lock the virtual graphics to the real world, not only could the cinematographer frame each shot easily because the composite image was available in the viewfinder, but the production team were able to record the finished output, eliminating the need for time-consuming post-production compositing. In fact, Papertown was able to shoot the entire three-hour video series, capturing dozens of terabytes of premium cinematic footage, in just two studio days.
“We had no experience of Mo-Sys products before this shoot, but we had all the support we needed to quickly put our creative to work,” said Papertown founder and Executive Creative Director Julian Smith. “Being able to use StarTracker, VP Pro and Unreal together extended an insane amount of value to our clients. It took our photo-real CGI to a whole new level. This is a game changer. Papertown is always pushing the boundaries with new ideas and technology and we are grateful to have partnered with a team like Mo-Sys that does the same.”
Michael Geissler, CEO of Mo-Sys, added “Whether for television, movies or corporate, producers are looking for the highest possible quality at the maximum productivity. Shooting final pixel like this is a tremendous boost. But it can only realistically be done with precision camera and lens tracking, real-time compositing, and a virtual production operator who is familiar with the Unreal Engine.”
StarTracker and VP Pro are available now. Mo-Sys has just released the VP Pro XR server solution specifically aimed at LED volumes.
SEQUOIA, an R&D project partnership between iSIZE, BBC R&D and Queen Mary University of London (QMUL), has been awarded £700k from Innovate UK following a competitive grant submission and a rigorous review and award process (<5% acceptance rate). Innovate UK is part of UKRI, the national funding agency investing in science and industrial research in the UK.
SEQUOIA looks at the way new technology, including artificial intelligence, can fundamentally change the way we distribute video content. It is a response to the pressing need for video streaming to become more sustainable. It addresses the challenges faced by the media sector in tackling the surge in online media consumption, which is placing unprecedented stress on network infrastructures worldwide. As well as imposing content delivery bottlenecks, this massive load on the internet infrastructure affects how content can be distributed efficiently to larger numbers of viewers, and contributes to its environmental footprint.
The project recognises that innovation in video streaming is urgently required. It is looking at perceptual optimisation of video streams as a way of making significant reductions in the bandwidth required for equal quality. This is at the heart of iSIZE’s work, and the company has built extensive IP and expertise in this domain. This will be combined with innovations in encoding technologies and optimisation pursued by BBC R&D and QMUL.
“The problems facing video streaming are real and represent significant environmental issues,” said Sergio Grce, CEO of iSIZE. “The increase in video encoding complexity is outpacing Moore’s Law, and some respected researchers suggest that the carbon footprint of the internet is greater than that of aviation. So this is an issue that must be addressed."
“We are very excited to be working with the BBC and QMUL on this project,” he added. “SEQUOIA brings us together with BBC and QMUL to advance video streaming, incorporating our expertise in deep perceptual optimisation and the latest cutting-edge AI innovation. This project will deliver significant financial and environmental improvements for video streaming.”
Disruptive innovation for video streaming is urgently needed: new pre- and post-processing, encoding and delivery tools that are device-aware and cross-codec compatible. This is vital to meet the growing demand for online video while reducing processing, energy and storage requirements.
This project will make an impact at every stage in the media distribution chain, demonstrating its results on operational and portable encoder designs, applicable both to video on demand and live streams. This will lead to benefits for the whole sector, demonstrating technology to enable sustainable distribution of Ultra High Definition content, while limiting the impact of video on internet traffic and reducing distribution costs. Extending beyond the commercial benefits, project outcomes will be devised to support environmentally conscious solutions by monitoring and proactively reducing energy consumption at all stages within the media value chain.
The partnership of iSIZE, BBC and QMUL brings unique expertise in AI, video coding standardisation, adaptive video pre/post-processing and streaming, perceptual optimisation and interoperable software architectures to work collaboratively towards these challenging objectives.
Mo-Sys™ Engineering, world leader in virtual production and image robotics, has supplied the brand-new Mo-Sys VP Pro XR extended reality server as well as StarTracker camera tracking to ARRI for its new studio in Uxbridge, west of London, making it the first user of this innovative technology.
The new facility is one of the largest permanent LED volume studios in Europe. ARRI has chosen the Mo-Sys technology, which is designed specifically for real-time final pixel shooting, to deliver cinematic-quality imagery as well as precise 3D camera tracking, allowing directors to see the combined real and virtual images during filming and eliminating lengthy post-production compositing.
The new ARRI studio offers 708 square metres of floor space, bounded by 343 square metres of LED wall, including curves and a ceiling. The LEDs extend 360˚ around the studio, allowing the light from the virtual elements to fall naturally onto the real objects and actors, providing realistic soft edge lighting effects.
The facilities will also make full use of the latest VP Pro XR feature, Cinematic XR Focus, a unique and intuitive solution which allows focus pullers to pull focus between the real and virtual environments, using the same lens focussing controllers they are used to.
The LED volume was designed and installed by Creative Technology in collaboration with ARRI. Head of technical services at Creative Technology Tom Burford said, “We are thrilled to showcase this exciting new solution, bringing together best-in-class technology. This is a space designed specifically for mixed reality productions of all kinds, carefully considered to produce the most flexible shooting environment possible.”
Jannie van Wyk, Managing Director of ARRI Rental, which manages bookings for the studio, added: “Mo-Sys’ commitment to ongoing development and integration with ARRI camera systems was the reason for our decision to use the Mo-Sys StarTracker and VP Pro XR system for our mixed reality studio at Uxbridge."
“Producers love the concept of final pixel shooting because of the creative freedom it gives directors plus the time and cost saved during post-production,” said Michael Geissler, CEO of Mo-Sys. “In the past, though, there have been concerns about image quality, shooting options, and workflow. We are solving these issues with VP Pro XR and StarTracker, in a way that fits naturally into today’s production workflows without adding complications or time penalties.
“We are delighted to have provided this ground-breaking solution, and this new ARRI studio is a world-class showcase for what can be achieved,” he added.
The UK ARRI Studio is now open for bookings.
Mo-Sys VP Pro XR is available now.
Mo-Sys™ Engineering is introducing on-set virtual production services for the Los Angeles market. Mo-Sys On-set VP Services provides an out-sourced solution for production companies new to virtual production.
This is a unique Mo-Sys concept designed to empower Cinematographers and Directors who can now focus solely on imaging and storytelling without the time-drain of organising their own virtual production workflow or needing advanced knowledge of the latest technology.
A key element of the service is that all bookings are supported by experienced on-set virtual production technicians who remain with the system for the duration of the shoot. Several of these technicians joined the LA On-set VP team earlier this year, and have been on intensive product training with Mo-Sys specialists since then. Additional LA team members and further roll-out of the solution to other cities, such as London, are planned for the near future.
The news follows the recent announcement of the launch of Mo-Sys VP Pro XR, a new XR server solution for LED volumes, meeting the demands of final pixel XR production for film and TV. Mo-Sys VP Pro XR comprises a hardware and software solution combining multi-node nDisplay architecture, real-time VP Pro compositor/synchroniser and a new Cinematic XR toolset containing unique features such as Cinematic XR Focus.
“Mo-Sys’ new On-set VP Services enable production companies to shoot whilst learning the techniques and processes of virtual production, until they’re comfortable doing it themselves. We are boosting access to advanced production capabilities, and expanding the knowledge pool. We want to help our clients try new things with virtual production, irrespective of the screen technology, workflow or type of virtual production chosen.”
Mo-Sys On-set VP Services are available now with full details available here.
• Investment will enable iSIZE to accelerate its traction and to continue strengthening its technical team and patent portfolio
• iSIZE has already secured licensing agreements with leading technology and streaming companies
iSIZE, a deep-tech company that applies deep learning to optimize video streaming and delivery, today announces that it has raised a further $6.3 million in funding as it seeks to make streaming more environmentally friendly without reducing quality.
The round was led by Octopus Ventures, with participation from existing investors including TD Veen and Patrick Pichette, Chairman of Twitter and ex-CFO of Google. This brings the total funding raised by the company to $8.2 million.
The amount of video streamed over the internet is at an all-time high, a trend which has been accelerated by the pandemic and the shift to working from home. At the same time, streaming and content companies are facing pressure from users and advertisers to deliver ever-increasing video quality. With forecasts projecting video to reach 82% of total global internet traffic by 2022, there is also growing awareness of its carbon footprint, with research indicating that it already contributes more than 1% of global emissions.
As a result, streaming and content providers are increasingly turning to technology to address the challenge of delivering a reliable and high-quality experience while managing the financial and environmental costs of doing so.
To help solve this problem, iSIZE has pioneered deep-learning solutions that optimize video streaming quality while reducing bitrate requirements, allowing for a significant reduction in data and energy consumption.
The potential impact of its technology is huge, and iSIZE has already attracted attention from some of the world’s largest technology companies, to whom it has already licensed its BitSave technology.
Headquartered in London, iSIZE was founded by Sergio Grce and Dr. Yiannis Andreopoulos who saw an opportunity to tackle the challenges caused by the explosion of video streaming. The founding team combines many years of research in machine learning, neural networks and video signal processing, evidenced by dozens of research publications. The company is also a graduate of the Creative Destruction Lab Oxford 2019-2020 programme where it received advice and investment from expert mentors.
iSIZE intends to use the funding raised to accelerate its traction in the U.S. and to further strengthen its technical team and patent portfolio to continue improving the results and innovations it delivers to its customers.
Sergio Grce, Founder and CEO of iSIZE, commented: “Today there are more people streaming more video than ever before. Our customers recognize both the commercial opportunity and their social responsibility to optimize their video delivery pipelines with our pioneering technology. We are excited to partner with Octopus Ventures to tap into their network and expertise in building world-changing companies.”
Simon King, Partner and deep tech investor at Octopus Ventures, said: “The technology iSIZE has created is pioneering and is already being used by some of the world’s largest companies to reduce the costs and energy used in streaming. Consumer demand for high quality video is only going to increase as our devices are upgraded, so it’s vital that we find new ways to reduce the environmental impact. We are very familiar with this space having been an investor in Magic Pony and Sergio is one of those visionary founders who we believe can build something truly special.”
iSIZE’s leading product is a proprietary AI-trained, deep perceptual optimizer that is trained to ‘see with the human eye’ in order to optimize video quality and deliver significant bitrate savings. Its technology has applications across VoD, live streaming, gaming and IoT, and bolts on to the existing conventional video delivery pipeline while integrating with all video encoding standards (including AVC, HEVC and AV1) – all without requiring changes to the streaming process or to end-users’ devices. This allows its customers to improve the end-user experience and reduce costs without breaking standards and with minimal deployment risk.