Rashmika Mandanna Speaks Out on Viral Deepfake Video!
A video surfaced online on November 5, supposedly showing actor Rashmika Mandanna entering an elevator. The video quickly gained popularity on social media, but it was later discovered that the person in the video was not Mandanna at all. Instead, it was a skillfully crafted deepfake video, in which the actor’s face was digitally imposed onto that of a British-Indian influencer. This video has sparked a significant debate regarding the potential risks of AI-powered technology, the methods to identify and reduce misinformation, and the measures individuals can take to safeguard themselves from impersonation.
Before we jump into the case, we should know what a deepfake is. A deepfake is a piece of media, such as a photo, video, or audio clip, that has been manipulated with artificial intelligence techniques to look convincingly real. Mandanna has become the victim of the latest such attack.
The Rashmika Mandanna deepfake video
A six-second clip of the actor, whose original uploader is unknown, was shared online and quickly went viral. In the video, Mandanna appears to enter an elevator. Soon after, AltNews journalist Abhishek Kumar posted on X, highlighting that it was a deepfake. In several posts, he said, “There is an urgent need for a legal and regulatory framework to deal with deepfake in India. You might have seen this viral video of actress Rashmika Mandanna on Instagram. But wait, this is a Deepfake video of Zara Patel.”
“The original video is of Zara Patel, a British-Indian girl who has 415,000 followers on Instagram. She uploaded this video on Instagram on October 9… From a deepfake POV, the viral video is perfect enough for ordinary social media users,” he added.
Amitabh Bachchan, Rajeev Chandrasekhar react to the video
As soon as the video was revealed to be a deepfake, many celebrities and prominent leaders began reacting to the situation. One of the first among them was actor Amitabh Bachchan, who co-starred with Mandanna in the film Goodbye. He posted on X: “Yes this is a strong case in terms of the law”.
Union Minister Rajeev Chandrasekhar also posted on X, stressing that “the government is committed to ensuring the safety and confidence of all Digital Nagriks using the Internet.” He called deepfakes the latest and most dangerous and damaging form of misinformation, explaining that “platforms have to deal with it.”
Mandanna herself posted on X, saying: “I feel really hurt sharing this and have to talk about the deepfake video of me circulating online. Something like this is honestly extremely scary, not only for me, but also for each one of us who today is vulnerable to so much harm due to misuse of technology.”
Patel, the woman whose video was faked by bad actors, posted a statement on her Instagram account saying, “It has come to my attention that someone created a deepfake video using my body and the faces of popular Bollywood actors. I had nothing to do with the deepfake video and I am deeply shocked and appalled by what is happening. I am concerned for the future of women and girls who now have to be even more afraid of posting themselves on social media. Take a step back and fact check what you see on the internet. Not everything on the internet is true.”
How to recognize deepfakes and protect yourself from them
The Massachusetts Institute of Technology (MIT), which has its own AI and ML research department, has published some useful tips to help people distinguish deep fakes from real videos. A few of them are listed below.
1. Pay attention to the face. High-end deepfake manipulations are almost always facial transformations.
2. Pay attention to the blinking. Does the person blink too little or too much?
3. Pay attention to the lip movements. Some deepfakes rely on lip syncing. Do the lip movements look natural?
In the Mandanna/Patel deepfake video, all three of these problems are present, and even in a six-second video, you can spot them with careful observation.
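The blinking check above can even be sketched in code. The snippet below is a hypothetical illustration, not a real detector: it assumes some upstream tool has already produced a per-frame “eye aspect ratio” (a standard measure of how open the eyes are), and the threshold and “normal” blink range are rough illustrative values, not measured constants.

```python
# Hypothetical sketch: flagging unnatural blink rates in a clip.
# Assumes a per-frame eye-aspect-ratio (EAR) series from some upstream
# face-landmark tool; the threshold and normal range are rough guesses.

BLINK_THRESHOLD = 0.21           # EAR below this counts as eyes closed
NORMAL_BLINKS_PER_MIN = (8, 30)  # rough range for natural human blinking

def count_blinks(ear_series):
    """Count closed-then-reopened transitions in a per-frame EAR series."""
    blinks, eyes_closed = 0, False
    for ear in ear_series:
        if ear < BLINK_THRESHOLD and not eyes_closed:
            eyes_closed = True   # eyes just closed
        elif ear >= BLINK_THRESHOLD and eyes_closed:
            eyes_closed = False
            blinks += 1          # eyes reopened: one full blink
    return blinks

def blink_rate_suspicious(ear_series, fps):
    """Flag a clip whose blink rate falls outside the natural range."""
    minutes = len(ear_series) / fps / 60
    rate = count_blinks(ear_series) / minutes
    low, high = NORMAL_BLINKS_PER_MIN
    return rate < low or rate > high

# Example: a 10-second clip at 30 fps with no blinks at all is suspicious.
no_blinks = [0.3] * 300
print(blink_rate_suspicious(no_blinks, fps=30))  # True
```

Real detectors are far more sophisticated, but the principle is the same: deepfakes often fail to reproduce the small involuntary behaviors, like blinking, that humans produce without thinking.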
It has also become important to protect against deepfake scams, as some scammers have started using the technology to trick victims into believing that the person they are talking to on a video or voice call is someone they know.
To protect yourself:
1. Ask the person to wave their hand in front of their face. Deepfake videos made with current technology generally cannot handle that kind of obstruction and will visibly glitch.
2. Never send money to anyone on a whim after receiving video calls from purported friends or relatives. Always call them on another number, or call another family member, to confirm first.
3. Ask them for something personal to confirm who they claim to be.
Most people need not fear being deepfaked, because the training data needed to create such overlays is substantial. Unless there are many photos and videos of you online, it is hard for an AI model to create a convincing deepfake, especially one that holds up when your face is seen from the side.