Synthetic media is a term I hadn’t come across before hearing about Google’s paper on fighting disinformation, published this week. It describes those eerily realistic images and videos generated through artificial intelligence (AI) or machine learning (ML) techniques.
You might have seen, as an example of this, the video of the Jennifer Lawrence press conference in which she has been gifted Steve Buscemi’s face.
Deep fakes are what we more commonly call these.
Google says it is sharing datasets of synthetic media so that others can use them to develop systems that can spot deep fakes. However, it admits that there are limits to what tech can do – always slightly galling for a tech giant – and says it will also need to work with “researchers, policymakers, civil society, and journalists around the world”. None of these watchdogs has enjoyed complete success dealing with disinformation generated by non-silicon-based actors, to coin a phrase. This last-ditch defence against the tidal wave of fakes has been breached many times before.
Check your anti-tech moral panic
While we tremble at the prospect of future synthetic media horrors, we might also remember that manipulating and misleading through media is something humans have been doing without the support of AI for as long as anyone can recall. See, for example, the decades-long campaign of disinformation about the European Union waged by The Daily Mail and other UK tabloids.
The Economist recently provided a helpful infographic of disinformation spread by the British media since the early 90s, sadly including stories published by non-tabloids such as the BBC and The Times.
Source: The Economist
The eyes! The eyes!
Another deep fake demo that came to my attention this week was a website called This Person Does Not Exist, which claims to showcase images of human faces generated entirely by an algorithm – something that until recently was hard to achieve convincingly.
Hit refresh and another fake face pops up. Admittedly helped by foreknowledge, I thought that many of these would survive a passing glance but still teetered on the edge of the uncanny valley.
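For the technically curious: faces like these come out of a generative adversarial network (reportedly NVIDIA’s StyleGAN in this case), in which a generator network decodes a random “latent” vector into an image. The toy PyTorch sketch below is not that model – the architecture, layer sizes, and names are all illustrative assumptions – but it shows the basic mechanism: each fresh random vector decodes to a fresh, never-photographed image.

```python
# Toy DCGAN-style generator: maps a random latent vector to a 64x64 RGB image.
# This is NOT the model behind the site -- just a minimal sketch of the idea
# that an image is decoded from random noise by a (trained) network.
import torch
import torch.nn as nn

class ToyGenerator(nn.Module):
    def __init__(self, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            # latent_dim x 1 x 1 -> 256 x 4 x 4
            nn.ConvTranspose2d(latent_dim, 256, kernel_size=4, stride=1, padding=0),
            nn.BatchNorm2d(256), nn.ReLU(),
            # 256 x 4 x 4 -> 128 x 8 x 8
            nn.ConvTranspose2d(256, 128, kernel_size=4, stride=2, padding=1),
            nn.BatchNorm2d(128), nn.ReLU(),
            # 128 x 8 x 8 -> 64 x 16 x 16
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(),
            # 64 x 16 x 16 -> 32 x 32 x 32
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.BatchNorm2d(32), nn.ReLU(),
            # 32 x 32 x 32 -> 3 x 64 x 64, pixel values in [-1, 1]
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, z):
        # Reshape the flat latent vector into a 1x1 "image" with latent_dim channels.
        return self.net(z.view(z.size(0), -1, 1, 1))

# "Hit refresh": each new random z decodes to a new image
# (here the network is untrained, so the output is just coloured noise).
z = torch.randn(1, 128)
fake_image = ToyGenerator()(z)   # shape: (1, 3, 64, 64)
```

In the real thing the generator has been trained against a discriminator on a huge set of photographs, which is what pushes its output from noise towards plausible faces – plausible, that is, apart from the occasional detail.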
The giveaway – when there is one – always seems to be the eyes. Sometimes more obviously than others…
Although other things can go wrong too…
I’m not sure about the provenance of this website. I’m aware that I may be enjoying it because it plays on whatever cognitive biases I have that make me want to believe that machines will never be able to completely fool me. In fact, they probably already have.