Women, Blockchain, and Protecting Democracy

One of the hallmarks of a modern democracy is that everyone can participate in free and fair political processes. The Women, Peace and Security agenda is one of the many tools that decision-makers have at their disposal both to increase the participation of women in high-level decision-making and to ensure their protection from violence that prevents them from participating in public life. Sadly, it is no surprise that women are the primary targets of online political violence, which increasingly takes the form of deepfakes.

Deepfakes, lifelike fake videos created with artificial intelligence (AI), can mislead people and undermine trust. AI-generated disinformation is a significant problem, especially for women, with female politicians targeted at a disproportionately high rate. Attacks often involve sexually explicit deepfakes designed to intimidate victims, damage their reputations, and discourage other women from entering public office.

The Problem

Deepfakes aimed at women are part of a wider tactic of weaponizing AI and disinformation to push women out of political spaces, creating what amounts to a chilling effect on democracy. This is one of the many challenges that Our Secure Future has documented in a recent report on the intersection between technology, women's participation, and security.

Some high-profile examples include:

- Azma Bukhari (Pakistan): The Punjab Information Minister was the victim of a deepfake video that superimposed her face onto an actor's body in a sexually explicit context.
- Hillary Clinton (United States): A deepfake video of the former U.S. presidential candidate circulated in 2023, falsely showing her endorsing Republican Governor Ron DeSantis during his presidential bid.
- Angela Rayner (United Kingdom): The Deputy Leader of the Labour Party was among several female UK politicians targeted with sexually explicit deepfake pornography in 2024.

Even celebrities like Taylor Swift are subject to false narratives and harmful misrepresentations. While Swift is not a politician, she is a powerful global influencer.

Deepfakes not only create a hostile digital environment but also increase insecurity for women both online and in real life. Studies have shown that targeted gendered attacks discourage women from running for office or taking prominent roles like leading local organizations. Deepfakes are used specifically to prevent women and girls from entering the political sphere, where they could, as elected officials, make significant economic and political decisions for a country. They also erode public trust by making it easier for perpetrators to dismiss authentic evidence; as a Center for Strategic and International Studies (CSIS) report states, they "[allow] politicians to claim that a real video or audio recording of them doing something problematic is actually a fake." To further complicate matters, while social media companies have tried to ban deepfakes, their enforcement is inconsistent, and the laws, policies, and regulations that protect people from this kind of disinformation are few and far between.

A Solution

Many have found ways to combat deepfakes and other forms of disinformation through legal channels, regulation, increasing digital literacy, or collaboration between civil society organizations, big tech companies, and governments. However, there may be another solution available based on blockchain that could address deepfakes at the core of the online environment.

News agencies were fighting disinformation and misinformation campaigns even before the advent of the internet. Today, the widespread use of social media has made it possible for anyone to publish "fake news" or deepfake videos and disseminate them globally, with no easy way for readers to verify whether the information is real. According to the 2024 Edelman Trust Barometer, 64% of people believe journalists and reporters are purposely trying to mislead people by saying things they know are false or gross exaggerations. To combat this problem, media outlets like the Italian agency ANSA and the American media company Fox News are using blockchain to verify information with an automated tool that helps authenticate whether the facts, stories, or news reported are real. Fox's tool is called "Verify," and it is being built on the Polygon blockchain.

The consulting group EY provides a detailed description of how this works for the Italian ANSA agency: "A blockchain is essentially a series of unchangeable records of data, each time-stamped and displayed as a shared ledger. This gives readers a way to trace the origin of ANSA's stories and follow each update or repost across the online world. When an ANSA editor publishes a news story, it is automatically fed into the OpsChain system, with its ID and publication details contained in a block that is made immutable through 'notarization' on blockchain. When a story is published online, it is given an 'ANSAcheck' icon which readers can click on to see who has written it and, potentially, republished it. If any changes or corrections are made after the story has been published, the information about the revisions will also appear on the relevant ANSAcheck page."
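The workflow EY describes can be sketched in a few lines of Python. This is a hypothetical illustration, not ANSA's or EY's actual OpsChain code: the names (`notarize_story`, `history`) are invented, and a plain Python list stands in for the real blockchain.

```python
import hashlib
import json

chain: list[dict] = []  # stand-in for the notarization blockchain

def notarize_story(story_id: str, text: str, note: str = "original") -> dict:
    """Record a story's ID, a hash of its content, and a link to the
    previous block, so every version is ordered and tamper-evident."""
    record = {
        "story_id": story_id,
        "note": note,
        "content_hash": hashlib.sha256(text.encode()).hexdigest(),
        "prev": chain[-1]["block_hash"] if chain else "0" * 64,
    }
    record["block_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return record

def history(story_id: str) -> list[dict]:
    """What an ANSAcheck-style page would display: every notarized version."""
    return [block for block in chain if block["story_id"] == story_id]

notarize_story("story-001", "Minister announces reform.")
notarize_story("story-001", "Minister announces tax reform.", note="correction")
print(len(history("story-001")))  # 2: the original and its correction
```

Because each block embeds the previous block's hash, altering an earlier record would invalidate every block after it, which is what makes the published history "immutable" in practice.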

How does this process work? When a document or image is notarized on a blockchain, it is first converted into a fixed-length string of characters, known as a hash, generated by a cryptographic function. For all practical purposes, the hash is unique to the document: any change to the content produces a completely different hash. Only the hash is placed on the chain as part of a transaction, not the whole document. This also preserves privacy, since the original document or image cannot be reconstructed from the blockchain. The hash is how the document or image gets notarized. This is essentially a digital version of provenance for a piece of art, or a car title that passes from one owner to the next.
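The key properties of a hash can be demonstrated with Python's standard-library `hashlib` module; the photo bytes below are a placeholder, not a real image.

```python
import hashlib

photo = b"raw bytes of an official campaign photo"

h1 = hashlib.sha256(photo).hexdigest()
h2 = hashlib.sha256(photo).hexdigest()         # same input, same hash
h3 = hashlib.sha256(photo + b"!").hexdigest()  # one changed byte, new hash

print(h1 == h2)  # True: hashing is deterministic
print(h1 == h3)  # False: any alteration is detectable
print(len(h1))   # 64 hex characters (256 bits), regardless of file size
```

Note that the function runs in one direction only: computing the hash from the photo is trivial, but recovering the photo from its 64-character hash is computationally infeasible, which is why storing the hash on a public chain does not expose the document.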

Could a similar blockchain technology work for combating deepfake videos that target women? It is possible. Political campaigns, social media platforms, and traditional media outlets could use this process to notarize authentic images of female political candidates and public figures. When a deepfake video is published, it could be immediately identifiable as inauthentic, since the deepfake would not match any image with documented notarization.
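That notarize-then-verify flow might look like the following sketch. The names (`notarize`, `is_authentic`) are invented for illustration, and a Python set stands in for the hashes recorded on-chain.

```python
import hashlib

ledger: set[str] = set()  # stand-in for hashes notarized on a blockchain

def notarize(media: bytes) -> str:
    """A campaign or outlet records only the hash of an authentic image."""
    digest = hashlib.sha256(media).hexdigest()
    ledger.add(digest)
    return digest

def is_authentic(media: bytes) -> bool:
    """A published file checks out only if its hash was notarized earlier."""
    return hashlib.sha256(media).hexdigest() in ledger

official_photo = b"raw bytes of the candidate's official photo"
notarize(official_photo)

print(is_authentic(official_photo))           # True
print(is_authentic(b"deepfake video frame"))  # False: no notarized record
```

One design caveat: this scheme proves an exact file was notarized, so even a benign re-encode of an authentic photo would fail the check. A real deployment would need to notarize each officially released version.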

Skeptics may counter that even if blockchain technology can combat fake news, it is not a silver bullet for the deepfakes that plague women.

Other new technologies are being tested against deepfakes as well. For example, the Netherlands Forensic Institute (NFI) has developed a detection method that analyzes subtle color differences in the face. However, AI detection algorithms are notoriously unreliable across racial, ethnic, and even sex-based differences in images. Given these limits, blockchain offers a different kind of verification: one that does not attempt to interpret the image but records its origin. Why not try using blockchain with an eye toward the goals of Women, Peace and Security? It is worth applying this emerging technology to real-world problems, to protect the things that humans value, such as democracy and freedom.

Author bio: Sahana Dharmapuri is the Vice President of Our Secure Future: Women Make the Difference, an organization dedicated to Women, Peace and Security.

ⓒ 2025 TECHTIMES.com All rights reserved. Do not reproduce without permission.
