Video Featuring Ambani and Kohli Promoting a Gaming App Is Fabricated

July 2, 2024
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video that shows Mukesh Ambani, chairperson of Reliance Industries, and Indian cricketer Virat Kohli apparently promoting a gaming app. After putting the video through A.I. detection tools and escalating it to our expert partners, we were able to conclude that the fabricated video was produced using a mosaic of unrelated video clips, graphics, and an A.I.-generated audio track.

The 33-second video in English was discovered by the DAU on Facebook during media monitoring; it has received more than 327,000 views so far. The video opens with an anchor from a leading Indian television channel in a studio setting, briefly introducing an app supposedly launched by Mr. Ambani. The channel logo is visible in the top right corner of the video frame, and superimposed text at the bottom provides additional information about that app.

The visual elements of the video comprise a clip featuring Ambani and another featuring Mr. Kohli, interspersed with visuals of wads of cash and a graphical representation of the multiplied returns on investment from the supposed new gaming mobile app. A logo bearing resemblance to RuPay, an Indian payment service system, is also flashed along with the graphics.

The audio track has a female voice with the anchor’s visuals and two distinct male voices with the visuals of Ambani and Kohli, both recommending the app as a way of earning money quickly.

The lip-sync of the anchor, featured for barely three seconds, seems consistent; however, there is an evident inconsistency in the lip-sync in the frames featuring Ambani and Kohli. The lack of inflection in the pitch and tone of the male and female speech heard in the video seems uncharacteristic of the usual speaking style of the people associated with those voices.

We ran a reverse image search using screenshots from the video under review to locate the original videos from which short clips had apparently been lifted.
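Reverse image search engines commonly match frames by comparing compact perceptual hashes rather than raw pixels. The following is a minimal, illustrative average-hash (aHash) sketch in pure Python — it is an assumption about how such matching works in general, not a description of the DAU's actual tooling, and the sample "frames" are made-up grayscale grids.

```python
# Illustrative average-hash (aHash) sketch: one common perceptual-hashing
# technique behind reverse image search. Not the DAU's actual tooling.

def average_hash(pixels):
    """Hash a 2D grid of grayscale values (0-255).

    Each output bit is 1 if the pixel is brighter than the grid's
    mean brightness, else 0.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests visually similar frames."""
    return sum(a != b for a, b in zip(h1, h2))

# Two hypothetical "frames": the second is a slightly brightened copy of the first.
frame_a = [[10, 200, 30], [220, 40, 180], [60, 240, 20]]
frame_b = [[12, 202, 33], [221, 44, 183], [63, 242, 25]]

ha, hb = average_hash(frame_a), average_hash(frame_b)
print(hamming_distance(ha, hb))  # near-duplicate frames hash to a small distance
```

Because the hash captures coarse brightness structure, a lifted clip remains findable even after mild re-encoding or a slightly edited backdrop, which is consistent with how the source clips described below were traced.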

The anchor’s clip was taken from this video about politics, published on May 12, 2024, on the official YouTube channel of Aaj Tak, a Hindi news channel. The original video does not mention anything about Ambani or Kohli.

The logo seen in the fabricated video is not Aaj Tak’s, and the anchor is speaking in Hindi in the original video, not English. The anchor’s clothes and body language are identical in both videos; however, the backdrop in the doctored video is slightly edited.

Ambani’s clip is from an interview he gave on GPS, a television show hosted by Fareed Zakaria on CNN, published on the network’s official website on Feb. 22, 2016. The CNN website does not play the interview recording; however, the same interview was posted on the official website of Reliance Industries. Ambani mentions nothing about any app in the interview.

Ambani’s clothes in the doctored video and the original are identical, though the backdrop is slightly edited. His head and hand movements and his posture, as well as the camera angle, for a few seconds around the 1:47 mark in the original video match the footage seen in the doctored video.

The screenshots of the visuals of Kohli led us to two separate videos. One was this video published on the official YouTube channel of Puma on May 26, 2023; the other was this reel posted from his official Instagram account on Dec. 10, 2023. In neither of these clips does he mention anything about the supposed app.

About four seconds of footage from the Puma video has been lifted from the 10 seconds following the 10-minute-30-second mark; in that segment the camera angle, backdrop, body language, and clothing of the cricketer are the same as in the video we analysed. A few seconds of footage from the Instagram reel has been used in the doctored video, but without the brand logo that is visible in the top right corner of the reel.

To discern if A.I. was used to manipulate the audio and visual elements in the video, we put it through A.I. detection tools. 

Hive AI’s deepfake video detection tool indicated A.I. manipulation in the segments featuring both Ambani and Kohli, while their audio tool caught A.I. tampering in the audio toward the end of the video, which features Kohli.

Screenshot of the analysis from Hive AI’s deepfake video detection tool


Screenshot of the analysis from Hive AI’s deepfake video detection tool

We also used TrueMedia’s deepfake detector, which found substantial evidence of manipulation, suggesting a high probability of A.I. use in the video.

It gave a 100 percent confidence score to “face manipulation” and an 87 percent confidence score to “generative convolutional vision transformer”; both subcategories indicate the use of A.I. to manipulate the faces of the subjects featured in the video. The tool also gave a 100 percent confidence score to “AI-generated audio detection”, a subcategory pointing to the use of A.I. to generate the audio track.

Screenshot of the audio analysis from TrueMedia’s deepfake detection tool
Screenshot of the analysis from TrueMedia’s deepfake detection tool

We further reached out to ElevenLabs for an analysis of the audio. They told the DAU that the audio track used in the video is synthetic, implying the use of A.I. to generate the audio. They added that the user who broke their “terms of use” by generating the synthetic audio was identified as a bad actor through their in-house automated moderation system and has been blocked from their platform.

While we were able to ascertain that the audio track in the video was synthetic, we specifically wanted to learn about the anchor’s lip synchronisation in her three-second appearance, as it seemed more convincing than that of Ambani and Kohli. We escalated the video to a lab run by the team of Dr. Hany Farid, a professor of computer science at the University of California, Berkeley, who specialises in digital forensics and A.I. detection.

They noted that the video quality was bad, and that there had been little attempt to synchronise the audio and the video tracks in the frames featuring the anchor.

Their observation underscores the point that poor video quality makes it difficult to discern lip-sync inconsistencies with the naked eye.

On the basis of our findings and analyses from experts, we can conclude that synthetic audio was used over a series of clips snipped from unrelated videos to produce a false narrative in video format.

(Written by Debopriya Bhattacharya with inputs from Areeba Falak, and edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.