IBC Trend 5: Digital Watermarking

This is the fifth report in Steve Ahern’s series on new trends at IBC24 in Amsterdam.

 

Digital watermarking has been around for a while. It has been used alongside digital rights management (DRM) techniques to track pirated copies of music and video since CDs were introduced and content piracy proliferated across the internet.

Watermarking techniques range from visible marks, such as transparent logos in films and TV shows, to invisible codes embedded in the metadata of audio tracks and video files. There is even a new blockchain-based cryptographic watermark developed by South African company Custos, using Amazon Web Services, that alerts users to piracy and offers them a bitcoin reward for reporting it.
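
To give a sense of how an invisible watermark can ride along inside ordinary media, here is a minimal illustrative Python sketch (not any vendor's actual scheme, and the identifier string is just a placeholder) that hides a short payload in the least significant bits of 16-bit PCM audio samples. Production watermarks are cryptographically keyed, perceptually shaped and designed to survive re-encoding; this only shows the basic embed-and-extract idea.

```python
# Illustrative only: a naive least-significant-bit (LSB) watermark for
# 16-bit PCM audio samples. Real watermarking systems are far more robust;
# this sketch just demonstrates the basic embed/extract concept.

def embed_watermark(samples: list[int], payload: bytes) -> list[int]:
    """Hide the payload bits in the least significant bit of successive samples."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(samples):
        raise ValueError("payload too large for this audio clip")
    marked = list(samples)
    for n, bit in enumerate(bits):
        marked[n] = (marked[n] & ~1) | bit   # overwrite the sample's LSB
    return marked

def extract_watermark(samples: list[int], length: int) -> bytes:
    """Read `length` bytes back out of the sample LSBs."""
    out = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (samples[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

if __name__ == "__main__":
    audio = [1000, -2000, 3000, -4000] * 20      # stand-in PCM samples
    marked = embed_watermark(audio, b"ID:42")    # hypothetical identifier
    print(extract_watermark(marked, 5))          # b'ID:42'
```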

AI-generated content has raised the stakes for content creators in three main areas:

  1. Tracking the original content that has been used to train AI.
  2. Identifying content created by AI.
  3. Verifying the authenticity of news content.

Content creators will want to track how much of their original work has been used by AI to create new pictures, videos, songs, news reports and books, with a view to earning some revenue for the part their work played in training the AI.

Authentication will become increasingly necessary as more and more content on the internet is created automatically by AI. Inaccuracies and hallucinations will spread across the web and be fed back into more AI tools, which will in turn generate more inaccurate content. Evaluating whether something is true will become harder without the ability to authenticate sources and trace claims back to their origin.

With deepfake editing now so easy, it takes very little effort to hijack a TV news report and alter it with deepfake pictures and synthetic voices. A credible report can be manipulated to spread misinformation, so verifying whether a report has been changed will be crucial to maintaining trust in responsible media publishers.

These trends may undermine the financial viability and the credibility of responsible media companies if regulations and technological tools don’t keep up.

At IBC, a range of technical and policy papers tackled the issue of verification, especially through different types of watermarking techniques.

Technology companies such as Google, one of the major players developing AI systems, are planning to embed watermarking technology into AI-created music tracks so that AI-generated and original content can be distinguished.

Tracks made with YouTube’s impressive new Lyria generative music tool will be watermarked.

DeepMind says: “Our team is also pioneering responsible deployment of our technologies with best-in-class tools for watermarking and identifying synthetically generated content. Any content published by our Lyria model will be watermarked with SynthID, the same technology toolkit we’re using for identifying images generated by Imagen on Google Cloud’s Vertex AI.”

 

The need to verify the authenticity of news content was brought home to delegates by Laura Ellis, the BBC’s Head of Technology Forecasting (main picture), who described an ‘Aha moment’ at the BBC.

In April 2022 a video styled as a BBC news report claimed that Ukraine was behind a missile attack on a railway station in Donbas that killed 57 people. The video opened with a BBC logo and carried the broadcaster’s watermark in the corner. It was a fake, as a BBC Verify journalist pointed out on X, but it was also a wake-up call for the broadcaster to do something about rising deepfake disinformation.

“Everyone was horrified to see the fake video but the only thing we could do was tweet denials. For some it was the ‘Aha!’ moment when they fully realised we needed to do more.” Read a full report here.


John Simmons and Joseph Winograd of Verance Corporation explained in a Technical Paper on interoperable provenance authentication:

Any attempt to address false information on the web must proceed from an understanding of how people come to place trust in information.

The prevalence of information ‘bubbles’ demonstrates that people primarily place trust in specific sources of information. If information appears unaltered and from a trusted source, we often consider that information to be factual. In other words, most of us judge what is factual based on the provenance and authenticity of the information, where provenance refers to the origin, history, and chain of custody of a piece of audio-video content, and authenticity refers to whether the content has been manipulated or altered in a way out of the control of the trusted source of the information.

There are two general methods for conveying provenance and authenticity metadata in association with audio-video content. Metadata can be cryptographically bound to the audio-video content, perhaps stored at the audio-video container level. Metadata can also be embedded as a watermark in the audio-video elementary stream. Read the full paper here (free registration required).
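
As a rough illustration of the first of those approaches, cryptographically binding metadata to the content itself, the sketch below hashes a piece of content and signs the hash together with a few provenance fields. The key, file contents and metadata fields are placeholders, and the shared-secret HMAC is a simplification; real provenance systems use public-key certificates and standardised manifest formats.

```python
# Illustrative only: bind provenance metadata to a piece of content by
# hashing the content and signing hash + metadata together. Key, content
# and metadata fields below are placeholders, not any standard's format.
import hashlib, hmac, json

SIGNING_KEY = b"replace-with-a-real-key"   # placeholder secret

def bind_metadata(content: bytes, metadata: dict) -> dict:
    """Return a manifest that ties the metadata to this exact content."""
    record = {"content_sha256": hashlib.sha256(content).hexdigest(), **metadata}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(content: bytes, manifest: dict) -> bool:
    """Check that the metadata is untampered and matches this content."""
    claimed = dict(manifest)
    signature = claimed.pop("signature", "")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(signature, expected)
            and claimed.get("content_sha256") == hashlib.sha256(content).hexdigest())

if __name__ == "__main__":
    video = b"...raw audio-video bytes..."   # stand-in for real content
    manifest = bind_metadata(video, {"publisher": "Example Broadcaster",
                                     "captured": "2024-09-13"})
    print(verify(video, manifest))                 # True
    print(verify(video + b"tampered", manifest))   # False: content no longer matches
```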

 

Blocking content is another way of limiting piracy. Reporting on blocking trends for IBC, David Davies wrote:

“The rise of blocking is more recent and has attracted some controversy because of perceived drawbacks like the possible inadvertent obstruction of legitimate services. For instance, the recently launched Piracy Shield project in Italy, which aims to protect the big sports rights holders – such as DAZN, Sky, Prime Video and Infinity – by blocking unauthorised viewing of live events, has attracted criticism from some ISPs, VPNs and consumers. At the least, further refinements would seem to be inevitable.”

“Pirates are really well-organised now, and there is potentially a lot of money for them to make – as well as a lot for the media industry to lose in terms of attacks on its revenue. That’s why you really need solutions such as watermarking, in particular to protect against the live redistribution of content,” said Mélanie Langlois, Product Manager, Anti-Piracy Services at Viaccess-Orca. Read the full report here.

 

As many thought leaders and technology suppliers told me, “it’s a game of whack-a-mole… an arms race… we find a way to watermark or verify, then the bad actors think of something else.”

The race continues.

 

Related IBC Trends Articles

IBC Trends 1: Artificial Intelligence

IBC Trends 2: The Cloud

IBC Trends 3: Automated Content Detection

IBC Trends 4: Using AI to make talkback and reporting more efficient

 

Subscribe to the Radioinfo YouTube Channel to get the latest videos, conference reports and awards event videos.

 
