Actress Sonakshi Sinha Targeted Through Deepfake Video

July 9, 2024
Screengrabs of the video analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video that appears to feature Indian actress Sonakshi Sinha in a fashion show. After putting the video through A.I. detection tools and escalating it to our forensic and detection partners, we were able to establish that Ms. Sinha’s face had been swapped with the face of another woman walking the ramp in a swimsuit, to create a deepfake of the actress.

An Instagram link to the 11-second video featuring the likeness of Sinha was sent to the DAU tipline for assessment. The video was posted as a reel on June 18, 2024, from an Instagram account that displays no profile information. There is no watermark or disclaimer by the uploader indicating that A.I. was used in producing the video, which could not be found on any of Sinha’s official social media accounts.

We reviewed the facial features of the actress’s likeness and noticed that between the four-second and seven-second mark, a lump-like protrusion appears on the left side of her chin, after which it seemingly blends into her jawline. At around the nine-second mark, her cheekbones suddenly change shape and become more pronounced as she lowers her head.

We ran a reverse image search, using screenshots from the clip, to trace the original video. We were able to locate a reel on Instagram posted on Dec. 13, 2023, featuring a model who was clad the same way as Sinha’s likeness and walked the ramp in the exact same setting as seen in the reel we reviewed. The body language of the subjects in the two reels was also identical; however, their faces were different.

The facial inconsistencies that we noticed in the video received on the tipline were absent from the video featuring the model. The background music in the model’s video was a song in English, while the music in the reel we reviewed was a song in an Indian language. The doctored video also carried graphics resembling musical notes and a flame, which were not there in the model’s video.

Several actresses have been targeted through deepfake videos since last year. The DAU recently debunked a video featuring the likeness of actress Rashmika Mandanna; in that video, too, the actress’s face had been swapped with that of a model, leading to the creation of non-consensual sexual imagery.

We ran the video featuring Sinha’s likeness through A.I. detection tools to assess the nature of the manipulation. Hive AI’s deepfake detection tool indicated that the video had been manipulated using A.I.

Screenshot of the analysis from Hive AI’s deepfake video detection tool

We also used TrueMedia’s deepfake detector, which overall categorised the video as having “substantial evidence of manipulation”. In a further breakdown of the analysis, the tool gave an 87 percent confidence score to the subcategory of “generative convolutional vision transformer”, which analyses video frames for unusual patterns in facial features. The tool also detected the use of A.I. in the audio; however, that could be because of the background music, as audio detection tools do not give accurate results when there is music or noise in the audio.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the audio and video analysis from TrueMedia’s deepfake detection tool

To get an expert view, we escalated the video to Dr. Hany Farid, co-founder of GetReal Labs, and his team, who specialise in digital forensics and A.I. detection. They pointed to visual artifacts in the video, specifically in the right eyebrow and the left eye of Sinha’s likeness, noting that both visual markers are consistent with synthesis and suggestive of a face-swap deepfake.

To get another analyst to weigh in, we reached out to our partners at RIT’s DeFake Project. Saniat Sohrawardi from the project told us that the video showed signs of a face-swap as well as inconsistencies between frames, especially when the actress’s likeness turns her head.

Mr. Sohrawardi noted that the most popular way to create face-swaps is using tools such as DeepFaceLab or face-swap code repositories for which many tutorials are readily available online. He added that the biggest weakness in these tools is that they create deepfakes frame-by-frame, rendering each frame susceptible to inaccurate face extraction and alignment issues.

He stated that the cues to look out for when assessing potential face-swap deepfake videos are jumps in the angles of the subject’s face as it turns in any direction, or errors in perspective.
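The frame-level cue described above, abrupt jumps between consecutive frames, can be sketched as a simple heuristic. The landmark coordinates, the threshold, and the function below are illustrative assumptions for this sketch, not part of any detection tool mentioned in this piece; in a real pipeline, the per-frame positions would come from a facial landmark detector.

```python
# Minimal sketch: flag abrupt frame-to-frame jumps in a tracked facial
# landmark, one heuristic analysts use to spot face-swap artifacts.
# The (x, y) pixel coordinates below are synthetic, for illustration only.

def flag_landmark_jumps(positions, threshold=10.0):
    """Return indices of frames where the landmark moved more than
    `threshold` pixels since the previous frame."""
    flagged = []
    for i in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[i - 1], positions[i]
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if dist > threshold:
            flagged.append(i)
    return flagged

# Smooth motion with one abrupt jump at frame 3 (synthetic data).
track = [(100, 200), (102, 201), (104, 202), (140, 230), (142, 231)]
print(flag_landmark_jumps(track))  # frame 3 shows a suspicious jump
```

A real detector would track many landmarks at once and account for genuine fast head motion, but the core idea, comparing consecutive frames for implausible discontinuities, is the same.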

On the basis of our investigation and expert analyses, we can conclude that the video featuring Sinha is a deepfake produced using face-swap technology.

(Written by Debraj Sarkar and edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

Fact Check: Deepfake Video of Sonakshi Sinha Doing a Ramp Walk in a Bikini Goes Viral

Fact Check: Viral Clip Showing Sonakshi Sinha Walking The Ramp In Bikini Is A Deepfake

Viral clip of Indian actor Sonakshi Sinha 'walking the ramp in a golden bikini' is a deepfake

Edited Images, Deepfake of Sonakshi Sinha Viral on Social Media After Wedding