
MaskPark and the Silence around China’s Gender-Based Violence Online
When the MaskPark incident broke in mid-2025, it jolted the Chinese internet (Hawkins 2025). Hidden behind the encrypted walls of Telegram—a platform officially blocked in China but accessible through virtual private networks (VPNs)—the story first came to light when a young woman accidentally discovered that her photos had been shared on the private MaskPark channel by someone she knew. As she scrolled down, she uncovered an enormous ecosystem trading in non-consensual sexual images: deepfakes, voyeuristic videos, and extortion material involving tens of thousands of women. Much like South Korea’s notorious Nth Room scandal (Souza 2020), MaskPark revealed how encryption, anonymity, and cryptocurrency payments have fused into an ecosystem of gendered exploitation.
The incident initially attracted widespread attention. Hashtags related to the issue on Weibo gained more than 270 million views within a few days, as users expressed shock at the scale of exploitation and the ease with which non-consensual images circulated online (Chen 2025). International media also picked up the story. Australia’s ABC News observed that women began mobilising informally, sharing safety tips, warning each other about exploitative Telegram channels, and urging stronger action against online sexual violence (Wang 2025). Influencers and activists posted messages warning that ‘silence will only make it worse’, trying to push the government to take meaningful action (Kaufman 2025b).
In the following days, a few informal collectives within China such as Free Nora (自由娜拉) and Human Sense (普通智人) tried to coordinate responses by collecting evidence, supporting victims, and raising public awareness. Yet, in the absence of any formal mechanism to report or remove illicit content, such initiatives quickly stalled. Unlike in South Korea, where the Nth Room scandal led to the creation of a national hotline and digital sex-crime reporting centre (Seoul Metropolitan Government 2022), China lacks a transparent, rights-based system to handle such cases. Volunteers working without legal protection or data-security protocols could make little progress.
Meanwhile, as the online debate intensified, the government moved swiftly to reframe and suppress the discussion. Posts linking MaskPark to systemic failures in gender protection were deleted; feminist accounts that called for coordinated action found themselves suddenly suspended (Kaufman 2025b). By late summer, the term ‘MaskPark’ had effectively vanished from China’s digital sphere. Search results in Chinese returned only a few reports or sceptical commentaries questioning whether the case had ever truly existed. What began as a collective outcry against technology-facilitated gender-based violence (TFGBV) was gradually reframed as isolated incidents, leaving individual women to navigate a legal system devoid of dedicated protection.
The contrast with South Korea’s Nth Room case could not be sharper. That scandal received international coverage, inspired documentaries and legal reforms, and became a global symbol of digital sexual violence and feminist resistance. MaskPark, by comparison, was quickly buried under censorship and denial. Despite its similar scale and brutality, it failed to generate sustained public debate, revealing how information control and political sensitivity continue to shape which forms of gendered violence are allowed to be seen.
TFGBV Beyond MaskPark
MaskPark is not China’s first encounter with TFGBV. In 2020, following the public attention surrounding South Korea’s Nth Room case, several Chinese pornography websites, including the notorious Yamiao Forum (芽苗论坛), were exposed for distributing child sexual abuse material (Feng 2020). Police swiftly shut down these sites, yet little was said about the protection of victims or the prosecution of perpetrators, revealing a familiar pattern: decisive technical action, but silence on justice and care.
More recent cases demonstrate how frequent, diverse, and technologically sophisticated TFGBV incidents have become across China. To cite just a few examples, in Beijing, the Bai artificial intelligence (AI) deepfake pornography case marked China’s first criminal trial involving AI-generated ‘undressed’ images. Bai had created and sold more than 6,000 synthetic images of women, 1,500 of which were classified as pornographic, before pleading guilty at the Haidian District Court in April 2024 (Wang 2024). In June 2025, a Chinese PhD student in the United Kingdom convicted of sexually assaulting dozens of women—many filmed with hidden cameras—was sentenced to 24 years in prison (Cook and Sandford 2025). The next month, a female student from Dalian Polytechnic University was doxed and vilified online after a foreign man leaked intimate videos of their encounter; supportive commentary on Chinese social media was quickly removed (Kaufman 2025a). Together, these cases highlight how entrenched and pervasive TFGBV has become in China.
This year, I joined one of the initiatives that aim to document cases of TFGBV in China and to monitor Chinese-speaking Telegram channels that circulate non-consensual sexual content. What I have seen confirms that TFGBV in China is both pervasive and persistent. The perpetrators act with remarkable confidence, convinced that they are invisible, anonymous, and beyond the reach of the law. Considering this, I can confidently say that MaskPark was not an isolated scandal—and it will not be the last.
As elsewhere, the forms of TFGBV that can be found on the Chinese internet extend far beyond voyeuristic recordings or so-called revenge porn (Bloom 2014). They include AI-generated deepfake pornography featuring relatives, classmates, or random bloggers on social media; doxing combined with sexualised threats; the online sale of hidden-camera footage from hotels, gyms, and even university dormitories; and upskirting/downblousing tutorials that teach users how to capture and monetise such material. There are even channels dedicated to instructing members about the technicalities of doxing, AI synthesis, and cryptocurrency payments. These are not manifestations of individual perversion but elements of a growing illicit economy—one that treats women’s bodies as infinitely replicable digital commodities. Each new form exposes the same structural gap: technology evolves rapidly, but the systems of law and governance meant to protect its users remain reactive and opaque, if not outright indifferent.
Even though we can count on assistance from lawyers, the support team with whom I volunteer often finds itself constrained by the same barriers that entrap the victims we try to support. Police reports must be filed in person, by the victims themselves—an impossible step for many who fear further exposure, retaliation, or shame. For volunteers, one of the greatest challenges lies in reaching out to those already hurt without retraumatising them. Lawyers who attempt to invoke privacy or defamation clauses quickly discover that there is no dedicated legal category for TFGBV against adults in China. Each case must be awkwardly fitted into outdated provisions on ‘obscenity’ (淫秽物品), the ‘right to one’s likeness’ (肖像权), or ‘public order offences’ (危害社会秩序). In some instances, victims who insist on pursuing justice even risk being accused of making false allegations—a threat that further discourages reporting and reinforces silence (Hu 2025).
MaskPark, then, was only the most visible rupture in a much wider landscape. Beneath it lies a continuum of unacknowledged violence that continues far beyond the moment when the crime is exposed.
When Barriers to Justice Become Another Form of Violence
For survivors of TFGBV in China, the pain rarely ends with exposure; it often begins again when they try to seek justice. What starts online as a violation of privacy and consent soon transforms into a second ordeal offline, shaped by disbelief, stigma, and bureaucratic neglect. In essence, this is double victimisation (Doerner and Lab 2014): women not only endure the original act of digital violence but also are forced to relive it through the very systems meant to protect them.
In interviews I conducted with Chinese women who experienced this trauma, this pattern recurred with striking consistency. Their first victimisation occurs when private images are stolen, fabricated, or circulated without consent; the second begins when they seek help. And the second wave of harm comes from several directions, including law enforcement, online platforms, and even their own families and communities.
Victims who try to report to law enforcement are often told that there is ‘not enough evidence’ or that the platforms involved are ‘outside the jurisdiction’. Others are advised to settle privately with perpetrators or to ‘move on’ for the sake of their reputation. The majority, however, never make it that far: many are unable to file a report at all or are simply ignored. One widely discussed case involved a male law student who allegedly used AI to turn photos of his friends, classmates, and other acquaintances into sexually explicit images; he received nothing more than a warning letter from his Hong Kong university (Ma and Yiu 2025).
Moreover, law enforcement officials often lack gender-sensitive training, resulting in victim-blaming or dismissal of complaints (see Sun et al. 2022; Lin and Yuan 2023). Instead of empathy, survivors are met with suspicion. They are asked: Why did you take such photos or share them with someone online? If you are the one who took these images, why didn’t you delete them? The burden of proof falls entirely on the victim, while perpetrators continue to hide behind anonymity and technological loopholes.
Online platforms further compound this neglect. Victims who report deepfake or non-consensual sexual content to social media platforms receive little meaningful response, whether from domestic or international apps. One survivor told us that RedNote required her to fill out several lengthy report forms, but then failed to reply.
Although major AI services such as ChatGPT and DeepSeek have explicit policies prohibiting the generation of AI deepfake pornography, we have found large online groups—some with thousands of members—openly sharing techniques to bypass these restrictions. Cryptocurrency transactions using Tether (USD₮) and other tokens make the trade of such material effectively untraceable, while the anonymity of encrypted platforms allows perpetrators to act with impunity.
Telegram bears much of the responsibility for this. Although the platform announced that it would ‘remove publicly available illegal content such as pornography or abuse shared in public channels or groups’ (Telegram n.d.), in practice, abusive channels that are shut down on these grounds after a report routinely reappear a few hours later under new names. Each platform’s neglect, in its own way, contributes to the spread of TFGBV and creates a system that further compounds the plight of survivors.
In the Chinese context, this is intertwined with deep-seated cultural stigma. In a society in which sexual morality is still policed through shame, victims of digital exploitation are often seen as complicit in their own abuse (Mckinlay and Lavis 2020). Public sympathy quickly gives way to moral judgement; online comments routinely question women’s virtue rather than condemn the perpetrators. Such attitudes silence survivors before they can speak. ‘When I went to expose the perpetrator online,’ one woman told me, ‘people started to defend him by calling me a slut; they even doxed me and many stalked me online.’ In worse cases, such harm comes from family members. As one survivor recounted:
I told my mum that someone took photos of my private parts in the toilet, but her first reaction was to ask me what clothes I was wearing and how could I have made this happen. I was shocked, but even more, I felt ashamed. I felt like it was me who had done something wrong, that I was the one who had to be punished.
For these women, seeking justice is not a path to recovery. Instead, it is a continuation of harm. The digital violence that began with an image becomes embedded in every interaction that follows, reproducing itself through disbelief, stigma, and institutional neglect.
Reflections as a Practitioner and Researcher
Working on TFGBV in China often feels like navigating uncharted territory. There is no organisation specifically dedicated to this issue, no established model to follow, and very little, if any, institutional support. Every step, from evidence collection to survivor outreach, is an act of exploration, carried out by a handful of volunteers trying to traverse legal, emotional, and technical minefields without a map.
What makes the situation even more difficult is the absence of institutional allies. Civil society organisations that might otherwise support survivors are constrained by registration barriers and political sensitivity. Volunteers work in isolation, without funding, trauma-informed training, or data protection capacity. Our own attempts to coordinate cross-platform reporting or preserve evidence have often been met with silence. In practice, TFGBV in China has become a blind spot by design: too sensitive for public debate, too technical for ordinary citizens, and too ‘private’ for the state to prioritise.
The challenges for us as researchers are equally daunting. Very few scholars in China systematically study TFGBV, and even fewer adopt a feminist or survivor-centred lens. Quantitative data are virtually non-existent; qualitative insights rely on fragmented casework, informal interviews, or news reports. There is no accessible dataset, no official statistics, and no ethical or technical infrastructure for safe survivor participation. Academic institutions, too, tend to treat this topic as peripheral: ‘individual cases’, ‘minor offence’, or ‘too sensitive’ are common refrains.
For those of us working between practice and research, these absences carry a heavy emotional weight. We are often the first—and sometimes the only—people to whom survivors reach out. We translate technical evidence into legal language, listen to stories of humiliation and fear, and try to document what little can be verified before it disappears from the internet. The work is slow, invisible, and precarious. Yet each conversation, each deleted image, and each survivor who chooses to speak reminds us why we must continue.
What I am describing is not merely personal frustration; these are symptoms of a broader structural failure. When the responsibility for addressing digital gender violence falls entirely on a few volunteers and informal networks, it exposes how fragile China’s ecosystem of protection truly is. The absence of institutional allies, the silence of platforms, and the reluctance of the state to engage have together created a vacuum in which technology advances but justice does not. The challenge, then, is not only to support individual survivors but also to find ways to confront the governance architecture that makes such violence inevitable and invisible.
Featured Image: Jose Navarro (CC), Source: Wikimedia Commons.
References