The Deepfakes Analysis Unit (DAU) examined a video in which Bollywood actor Aamir Khan apparently warns the public against political parties that make empty promises. Toward the end of the video, after his apparent monologue ends, another male voice can be heard seeking votes for the Indian National Congress (INC) party. After putting the video through A.I. detection tools and escalating it to a lab at the University of California, Berkeley, we concluded that the video has been manipulated using fake audio.
The 31-second Hindi video clip was forwarded to the DAU tipline multiple times for assessment. We noticed some inconsistencies with Mr. Khan’s lip-sync between the eight-second and 11-second mark in the video, and also between the 19-second and 23-second mark. At the 27-second mark a very faint chant of “Satyamev Jayate” can be heard in the background; a television show by the same name was hosted by the actor several years ago. Those words helped us track down the original video on the official YouTube channel for the show.
The backdrop, clothing, and posture of the actor in the original video, posted on YouTube on August 30, 2016, are identical to those in the manipulated video; however, the audio in the two videos is not the same. We put the video through various A.I. detection tools to diagnose the nature of the manipulation. First, we used Hive AI’s deepfake detection tool, which indicated that the video we were analysing had been manipulated at two separate points across the 31-second clip.
Next, we used TrueMedia’s deepfake detector, which overall categorised the video as “highly suspicious”. However, it only gave a seven percent confidence score to deepfake face detection, which means that the tool found very little evidence of the actor’s face having been recreated using generative A.I.
We also ran the video through the audio detection tool of AI or Not to get an analysis of the audio in the video clip. The free version of AI or Not that we accessed only detects A.I. use in images and audio. The results suggested that the probability of the audio being A.I. generated was 60 percent.
To get an expert view on the video under investigation, we escalated it for analysis to a lab run by the team of Dr. Hany Farid, a professor of computer science at the University of California, Berkeley, who specialises in digital forensics and A.I. detection. They noted that the video stream showed no signs of manipulation, and that the video and audio streams were desynchronised.
Dr. Farid’s team analysed the audio in a couple of ways, including separating the music from the speech; they are confident that the voice in the audio track of this video is fake. They added that it appeared to be a case where the audio track was simply swapped in, without any attempt to sync it with the actor’s mouth movements.
Based on all our findings and the expert analysis, we can conclude that the words being attributed to the actor were never uttered by him and that his video was doctored using fake audio.
(Written by Debraj Sarkar and edited by Pamposh Raina.)
Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.
You can read below the fact-checks related to this piece published by our partners:
Congress Functionaries Share A.I. Voice Clone Of Aamir Khan Targeting PM Modi
Aamir Khan Warning Against 'Jumlas'? No, Edited Video of Actor Goes Viral Ahead Of Lok Sabha Polls
Clip of actor Aamir Khan slamming BJP, supporting Congress uses deepfake audio