Six months ago, pilot Hana Khan saw her photo on an app that appeared to be “auctioning” dozens of Muslim women in India. The app was quickly taken down, no one was charged, and the issue was shelved – until a similar app appeared on New Year’s Day.
Khan was not featured on the new app, called Bulli Bai – an insult to Muslim women – which offered activists, journalists, an actor, politicians and Nobel laureate Malala Yousafzai for “sale” as maids.
Amid growing outrage, the app was shut down and four suspects were arrested last week.
The bogus auctions, which were shared widely on social media, are just the latest examples of how technology is being used – often easily, quickly and inexpensively – to put women at risk through online abuse, invasion of privacy or sexual exploitation.
For Muslim women in India, who are frequently abused online, it is a daily risk, even as they use social media to speak out against hatred and discrimination toward their minority community.
“When I saw my photo on the app, my world shook. I was upset and angry that someone could do this to me, and I became angrier still when I realized that this nameless person was getting away with it,” said Khan, who filed a complaint with the police against the first app, Sulli Deals, another derogatory term for Muslim women.
“This time I felt so much terror and despair that it was happening again to my friends, Muslim women like me. I don’t know how to stop it,” Khan, a professional pilot in her thirties, told the Thomson Reuters Foundation.
Mumbai Police said they were investigating whether the Bulli Bai app was “part of a larger plot.”
A spokesperson for GitHub, which hosted the two apps, said it had “long-standing policies against content and behavior involving harassment, discrimination and incitement to violence.
“We have suspended a user account as a result of investigating reports of such activity, which violate all of our policies.”
Advances in technology have increased the risks for women around the world, whether through trolling, doxxing that exposes their personal information, surveillance cameras, location tracking, or fake pornographic videos made with forged images.
Deepfakes – synthetic media generated by artificial intelligence – are used to create pornography, with apps that allow users to digitally undress women or superimpose their faces onto explicit videos.
Digital abuse of women is pervasive because “everyone has a device and a digital presence,” said Adam Dodge, CEO of EndTAB, a US-based nonprofit that fights technology-enabled abuse.
“Violence has become easier to perpetrate because you can assault someone anywhere in the world. The scale of the harm is also greater because you can upload something and show it to the world in seconds,” he said.
“And there is permanence because that photo or video exists online forever,” he added.
The emotional and psychological effect of such abuse is “just as excruciating” as the physical abuse, with effects compounded by the virality, the public nature and the permanence of the content online, said Noelle Martin, an activist from Australia.
At 17, Martin discovered that her image had been digitally altered into pornographic images and distributed. Her campaign against image-based abuse helped change the law in Australia.
But victims are struggling to be heard, she said.
“There is a dangerous misconception that the harms of technology-facilitated abuse are not as real, serious or potentially fatal as abuse with a physical element,” she said.
“For victims, this misconception makes speaking out, seeking support and accessing justice much more difficult.”
Tracking lone creators and rogue coders is difficult, and tech platforms tend to protect anonymous users who can easily create a fake email or social media profile.
Even lawmakers are not spared: In November, the US House of Representatives censured Republican Paul Gosar over a digitally altered animated video that showed him killing Democrat Alexandria Ocasio-Cortez. He later retweeted the video.
“With any new technology, we should immediately think about how and when it will be misused and weaponized to harm girls and women online,” Dodge said.
“Technology platforms have created a very lopsided atmosphere for victims of online abuse, and the traditional ways of seeking help when we are hurt in the physical world are not as available when the abuse occurs online,” he said.
Some tech companies are taking action.
Following reports that its AirTags – tracking devices that can be attached to keys and wallets – were being used to track women, Apple launched an app to help users protect their privacy.
In India, the women targeted by the auction apps are still shaken.
Ismat Ara, a reporter featured on Bulli Bai, called it “nothing less than online harassment”.
It was “violent, threatening and intended to create a sense of fear and shame in my mind, as well as in the minds of women in general and the Muslim community,” Ara said in a police complaint that she posted on social media.
Arfa Khanum Sherwani, also listed for “sale,” wrote on Twitter: “The auction may be bogus, but the persecution is real.”