Deepfakes are videos or images that have been manipulated using artificial intelligence. They have spread across India for a variety of purposes; most notably, these altered images have made their way into the Indian music and film industries to create realistic special effects.
In this article, we will discuss how deepfakes influence pop culture in India.
Deepfakes Making Their Way Through Bollywood
Once a deepfake is released, it’s usually debunked as quickly as it’s shared. However, ExpressVPN’s blog piece shows how deepfakes can alter people’s memories and their perceived reality of events. Psychologists have named this cognitive bias the ‘Mandela Effect.’ Most people aren’t aware that such videos can have this memory-altering effect on certain individuals.
In Indian pop culture, this distortion of perceived reality is a particular concern, as deepfakes have already raised questions of ethics and consent within the industry.
For example, India Times reported that after rapper Sidhu Moose Wala’s death, his unreleased songs were published using deepfake technology. One of them became the fastest Indian music video to reach a million likes. Although these videos won the hearts of Moose Wala’s fans, they also raised concerns about the ethics of such measures.
Furthermore, Outlook India noted how Cadbury superimposed Bollywood actor Shah Rukh Khan into one of its advertisements for free. The video convinced its audience until it was found to have used AI imagery.
Given this, the film industry has raised concerns that virtual actors could put human performers out of work. Why pay a large sum for a Bollywood actor if you could superimpose them into a video for free?
What The Indian Government Is Doing
In response to these negative uses, the Indian government has taken steps to regulate deepfakes.
In 2019, the Ministry of Electronics and Information Technology released a draft bill that would make the creation and distribution of deepfakes illegal without the depicted person’s consent.
How To Spot Them
Despite the Indian government’s attempts to regulate the circulation of deepfakes in Bollywood, more needs to be done to fend off the deepfake frenzy.
That’s why it’s important to take our own steps to understand what’s reality and what’s a deepfake.
So how do we recognize a deepfake?
Spotting them can be challenging, as the technology used to create deepfakes is becoming increasingly sophisticated. Some general tips that can help you identify a deepfake include:
- Check the source: Deepfakes are often shared on social media or other online platforms. If the source of the video or image is questionable or comes from an unverified source, it could be a sign that it is a deepfake.
- Look for inconsistencies: Deepfakes may have subtle inconsistencies that give away their artificial nature. These could include changes in shadows or reflections inconsistent with the surroundings or glitches in movement.
- Use online tools: Several online tools can help detect deepfakes. These tools use machine learning algorithms to analyze the content of the video or image and determine whether it is a deepfake (see the sketch after this list).
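For readers who want to try the third tip programmatically, the snippet below is a minimal sketch of running an image through a machine-learning classifier. It assumes a Python environment with the Hugging Face `transformers` library installed; the model identifier and file name are placeholders for illustration, not a specific recommended tool.

```python
# Minimal sketch: score an image with a pretrained deepfake-detection classifier.
# Assumes `transformers` and `Pillow` are installed (pip install transformers pillow).
from transformers import pipeline

# Placeholder model id - substitute a real deepfake-detection checkpoint
# published on the Hugging Face Hub.
MODEL_ID = "some-org/deepfake-image-detector"

detector = pipeline("image-classification", model=MODEL_ID)

# Path (or URL) to the image you want to check - placeholder file name.
results = detector("suspicious_photo.jpg")

# The pipeline returns labels with confidence scores, e.g.
# [{"label": "fake", "score": 0.91}, {"label": "real", "score": 0.09}]
for result in results:
    print(f"{result['label']}: {result['score']:.2f}")
```

A high "fake" score is a signal to investigate further, not proof on its own; combine it with the source and consistency checks above.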
It is important to note that while these tips can be helpful, they are not foolproof. As deepfake technology continues to improve in India and around the rest of the world, it may become increasingly difficult to spot them.
If you’re ever unsure, seek out additional resources to verify the information before sharing it online.
Stay Aware
As deepfake technology continues to evolve, individuals must remain vigilant to protect Bollywood and India’s music industry.
By staying informed and taking proactive steps to understand what a deepfake is, we can help put a stop to the use of deepfakes in Indian pop culture and beyond.