During the sixth quarter of its operations, from July 1, 2025 to Sept. 30, 2025, the Deepfakes Analysis Unit (DAU) analysed audio and video content submitted to its dedicated tipline, along with escalations from its fact-checking partners in India and from global fact-checking organisations that are IFCN signatories.
This report captures trends from the content analysed during that period; some insights that stood out are highlighted below:
- Most of the videos the DAU debunked this quarter were A.I.-manipulated clips that surfaced in the wake of "Operation Sindoor".
- The individuals apparently featured in these videos were senior Indian defence officials, such as the chiefs of the army, navy, and air force, as well as top Indian ministers, including Foreign Minister Dr. S. Jaishankar.
- The narratives in most of these videos peddled misinformation about India having lost combat equipment or defence personnel to Pakistan.
- Our expert partners from RIT’s DeFake Project pointed to generation techniques likely used in some of these videos, such as image-to-video generation, in which screenshots from a source video are used to generate clips that closely resemble it. In some cases this resulted in body movements that looked different between the source video and the manipulated version; a minimal sketch of this technique appears after this list.
- In addition to the more obvious manipulations of the mouth and face of the apparent subjects in these videos, common techniques included tampering with the insignia and name tags on the defence officials' uniforms.
- Another running theme in these videos was the use of distorted logos of media houses. The logos looked similar to the genuine ones but were not identical; the discrepancies included different font sizes or typefaces.
- Most videos used voice clones or synthetic voices that sounded somewhat similar to the people apparently featured in them. In some cases, words from the original audio track were reused in the synthetic audio. We also saw a few examples of audio splicing, in which parts of the original audio were stitched together with synthetic audio; the second sketch after this list illustrates the idea.
- Financial scams accounted for only a small percentage of the videos the DAU debunked this quarter. The narratives in these scam videos focussed on fraudulent income-generating platforms supposedly powered by A.I.; in some cases the videos claimed that the Indian government or senior ministers were backing these get-rich-quick schemes.
- These videos were similar to the financial scam videos previously debunked by the DAU in both content and packaging, which made them appear like public announcements or news-segment interviews. All of them used synthetic audio tracks with original footage, sometimes of politicians, tech leaders, or news anchors; the manipulations made it appear that these figures were promoting the schemes and urging people to invest.
- India's Finance Minister Nirmala Sitharaman and Road Transport and Highways Minister Nitin Gadkari were among the politicians linked to such scam content. Television journalist Rajdeep Sardesai and Sundar Pichai, chief executive of Google and Alphabet, were among the non-political public figures purported to be endorsing these scams.
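To make the image-to-video technique mentioned above more concrete, here is a minimal sketch using the open-source diffusers library with a publicly available image-to-video model (Stable Video Diffusion). The model choice, file names, and parameters are our own illustrative assumptions; the DAU has no visibility into which specific tools were used to create the analysed videos.

```python
# Illustrative sketch only: animating a single screenshot into a short clip
# with an off-the-shelf image-to-video model. The model, paths, and settings
# are assumptions for demonstration, not the tools used in the debunked videos.
import torch
from diffusers import StableVideoDiffusionPipeline
from diffusers.utils import load_image, export_to_video

pipe = StableVideoDiffusionPipeline.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid-xt",
    torch_dtype=torch.float16,
    variant="fp16",
)
pipe = pipe.to("cuda")

# A single screenshot taken from a source video (hypothetical file name).
image = load_image("source_frame.png").resize((1024, 576))

# Generate a short clip conditioned only on that still frame; body movements
# in the output can diverge from the original footage, which is one of the
# inconsistencies our expert partners flagged.
frames = pipe(image, decode_chunk_size=8).frames[0]
export_to_video(frames, "generated_clip.mp4", fps=7)
```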
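Similarly, the audio splicing pattern described above can be illustrated with a minimal sketch using the pydub library; the file names and cut points are hypothetical and chosen only to show how genuine and synthetic segments can be stitched into a single continuous track.

```python
# Illustrative sketch of audio splicing: short runs of genuine speech are
# stitched together with synthetic (voice-cloned) segments so the resulting
# track sounds continuous. All file names and timings here are hypothetical.
from pydub import AudioSegment

original = AudioSegment.from_file("original_speech.wav")   # genuine recording (assumed)
synthetic = AudioSegment.from_file("cloned_speech.wav")    # voice-cloned audio (assumed)

# Keep the first four seconds of genuine speech, splice in six seconds of the
# synthetic voice, then return to the genuine track; pydub slices in milliseconds.
spliced = original[:4000] + synthetic[:6000] + original[4000:]

spliced.export("spliced_output.wav", format="wav")
```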
You can read the full report here.