Misogyny, Surveillance, and Sexualisation: How Big Tech Is Making Women Afraid To Go Out In Public

In a political climate that is steadily shifting towards the right, technology and AI are becoming tools of intimidation and digital violence against women. As our social media feeds become saturated with ‘trad wife’ content and a resurgence of traditional gender roles, it is no coincidence that women are now finding themselves targeted simply for being in public spaces.

Mere days after the Grok AI scandal made headlines, another story broke about men covertly filming women at night and monetising the footage online. Similar incidents have occurred with men using Meta’s smart glasses to record women in public spaces, again posting the videos online without the women’s consent or even their knowledge that they were being filmed.

Three main themes link these cases. The first is the increasing role of technology and Artificial Intelligence (AI) in enabling this type of covert filming. The second is the disproportionate targeting of women, by men, in public spaces. And the third is the legal system’s inertia when these cases are brought to the police: it is not illegal to film in public spaces, and those filming can easily hide behind anonymous online profiles to avoid any accountability.

Technology is rapidly expanding as a form of misogynistic surveillance, empowering men to police women’s presence in public spaces. Social media then becomes a virtual space of public humiliation. The women targeted are suddenly confronted with thousands of hateful comments, leaving them, in their own words, “afraid to go out in public.” In some cases there was no interaction at all between the man filming and the women he was recording.

Online public humiliation is part of Grok AI’s sexualised image creation process too. X users tag @grok in the thread underneath a photo to give it instructions on how to alter the image. This thread, and Grok’s replies, are publicly visible – to other X users and to the original account. This creates an echo chamber in which Grok requests become increasingly sexual, extremist, and violent, and it has targeted high-profile women like Maya Jama, Alexandria Ocasio-Cortez, and Zendaya. It is important to draw attention to a racialised element of this too, with users also citing fascist and neo-Nazi slogans and requesting that Grok whitewash Black and brown celebrities.

The public backlash to the Grok case led Prime Minister Keir Starmer to condemn the algorithm, and regulatory bodies are now investigating whether X is legally responsible.

There is currently no law against covert filming in public spaces. Technology Secretary Liz Kendall has proposed criminalising nonconsensual sexual images created by AI. But in a world where everyone has a camera, smart glasses and hidden cameras make it even harder to identify malicious intent. And how do the police identify who is responsible? Florjan Reka, who runs an account posting footage of women on nights out in Manchester, is from Sweden. National borders don’t restrict this behaviour, signalling an urgent need for rapid international cooperation between lawmakers on this issue.

Right-wing politics is attempting to push women out of the public sphere by emphasising a return to traditionalism through nationalist and populist rhetoric. Technology is developing to expand this narrative, particularly as company owners like Elon Musk, Jeff Bezos, and Mark Zuckerberg are supportive of, and financially benefitting from, regimes like Trump’s presidency.

Governments around the world need to act fast; these scandals won’t go away while the men involved can continue to reap the benefits. As well as individuals making money from posting covertly filmed videos, the right wing benefits politically from this intimidation of women in public spaces.

If misogyny is to be seriously addressed in modern society, we need to recognise the role that technology plays in enabling it.

“nightlife” by mripp is licensed under CC BY 2.0.