Actress Rashmika Mandanna Targeted Through Deepfake Video, Again

May 30, 2024
Screengrabs of the clips analysed by the DAU

The Deepfakes Analysis Unit (DAU) analysed a video that appears to feature Indian actress Rashmika Mandanna in a photoshoot by a waterfall. After putting the video through A.I. detection tools and escalating it to our forensic and detection partners, we were able to establish that her face had been swapped with the face of another woman, posing in a bikini by a waterfall, to create a deepfake.

The eight-second video featuring the likeness of Ms. Mandanna was sent to the DAU tipline for assessment multiple times. In each of the reels, sent as Facebook posts, a song from the Telugu movie Pushpa, starring Mandanna, can be heard in the background, and an Instagram handle is mentioned at the top of the video. The video was posted from that Instagram handle on May 17, 2024, and has garnered more than 48,000 likes and about 200 comments.

We noticed that the same handle has posted a whole host of suggestive videos featuring women, including the likenesses of Bollywood actresses. There is no bio for the person behind this handle, nor does the handle carry a watermark on the videos created using A.I., or any disclaimer to that effect.

We checked Mandanna’s official social media handles but could not find this video posted on any of them. We reviewed the video closely and noticed that at around the four-second mark, the eyes, the nose contour, and the lips of the actress’s likeness seem to change as her hand moves over those portions of the face. Right after that, the eyes remain static for a split second, and her left eye appears positioned at an awkward angle relative to the rest of the face.

We ran a reverse image search to trace the original video, or at least similar videos, since we had noticed the inconsistencies and because the same actress had been targeted by a deepfake video late last year. The initial search did not yield satisfactory results; however, a fact-checking partner shared a video in which everything was identical to the video we were reviewing except for the face.

The identical video was posted on April 19, 2024, from the Instagram account of someone who identifies themselves on their profile as a content creator and model based in Colombia. We cannot say with certainty whether that is the original video used to create Mandanna’s fake video.

We ran Mandanna’s video through A.I. detection tools to assess the nature of manipulation in the video. Hive AI’s deepfake detection tool indicated that the video was indeed manipulated using A.I.

Screenshot of the analysis from Hive AI's deepfake video detection tool

We also used TrueMedia’s deepfake detector, which overall categorised the video as “highly suspicious”. In a further breakdown of the analysis, the tool gave a 99 percent confidence score to the subcategory of “deepfake face detection” and a 93 percent score to “generative convolutional vision transformer”; the latter analyses the video for visual artifacts to detect A.I. manipulation.

Screenshot of the overall analysis from TrueMedia’s deepfake detection tool
Screenshot of the analysis from TrueMedia’s face detection tool

We further escalated the video to experts to seek their analysis.

Dr. Hany Farid, a professor of computer science at the University of California, Berkeley, who specialises in digital forensics and A.I. detection, said that the video is a face-swap deepfake. He confirmed our observations about the video by noting that at the four-second mark, when the actress’s likeness moves her hand over her face, “the identity of the face changes as the deepfake face tracking loses the face for just a brief moment.”

Several women, particularly celebrities, including Mandanna, have been victims of deepfakes in the recent past.

“Reverting back to its roots, deepfakes continue to be used to create non-consensual sexual imagery (NCSI),” Dr. Farid told the DAU. “While it used to be that you needed hundreds to thousands of images of a person to create a face-swap deepfake, now this can be done from only a single image,” he said.  

He added that, “as a result, anyone with even a single image of themselves online could become a victim of NCSI.”

ConTrails AI, a Bangalore-based startup that has its own A.I. tools for detecting audio and video spoofs, also concluded through their analysis that this video is a deepfake created using face-swap techniques.

Screenshot of the analysis from Contrails AI’s detection tool

Their analysis also corroborated the inconsistencies that we observed in the video. They mentioned that tools such as DeepFaceLab, Roop or FaceFusion could have been used to create this deepfake.

Likely original video in which model’s face remains unchanged
Rashmika Mandanna’s face-swap failing due to arm movement

They noted that a face-swap can remain consistent only when there is no obstruction over the target face; when the hand obstructs the face, the algorithm fails to produce the desired result.
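For readers curious about the mechanics, the failure mode the experts describe can be sketched in a toy Python example: a typical face-swap pipeline detects the face frame by frame and applies the swap only where detection succeeds, so an occlusion such as a hand briefly lets the original face show through. This is purely an illustrative sketch, not the code of any tool mentioned above; the function names and frame structure are invented for demonstration.

```python
# Toy illustration (not any real tool's code) of why a face-swap
# "loses" the face when a hand passes over it: the swap is applied
# only on frames where the face detector still finds a face.

def detect_face(frame):
    # Stand-in detector: returns None when the face is occluded.
    return None if frame.get("occluded") else frame["face"]

def face_swap(frames, target_face):
    output = []
    for frame in frames:
        face = detect_face(frame)
        if face is not None:
            output.append(target_face)    # swap succeeds on a clear face
        else:
            output.append(frame["face"])  # tracking lost: original face leaks through
    return output

frames = [
    {"face": "model", "occluded": False},
    {"face": "model", "occluded": True},   # hand over the face, as at the four-second mark
    {"face": "model", "occluded": False},
]
print(face_swap(frames, "swapped"))  # ['swapped', 'model', 'swapped']
```

The middle frame shows the glitch the DAU and the experts observed: for the occluded frame the swapped identity momentarily gives way to the underlying face.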

On the basis of our investigation and expert analyses, we can conclude that the video featuring Mandanna is a deepfake produced using face-swap technology.

(Written by Debraj Sarkar with inputs from Areeba Falak and edited by Pamposh Raina.)

Kindly Note: The manipulated audio/video files that we receive on our tipline are not embedded in our assessment reports because we do not intend to contribute to their virality.

You can read below the fact-checks related to this piece published by our partners:

Fact Check: Rashmika Mandanna Falls Victim to a Deepfake Again, This Time Shown in a Bikini

Video showing actor Rashmika Mandanna in a red bikini is a deepfake

Another Deepfake Video of Rashmika Mandanna in a Bikini Goes Viral!

Fact Check: Is That Rashmika Mandanna In Red Bikini Or Another Deepfake? Know The Truth