GLOSSARY

The glossary is not exhaustive; it only provides an overview of some practices that constitute the phenomenon of gender-based digital violence.

TECHNOLOGY-FACILITATED GENDER-BASED VIOLENCE

The term “technology-facilitated gender-based violence” or Cyber-VAWG (Cyber Violence Against Women and Girls) generally refers to a form of gender-based violence that manifests in the digital space through acts of abuse, intimidation, and discrimination, reproducing and amplifying the gender inequalities already present in offline society (EIGE, 2017; European Parliament, 2021). Existing literature suggests that digital violence disproportionately affects young women and girls (Cuenca-Piqueras et al., 2020; CYBERSAFE Project, 2020; Fansher & Randa, 2019), and can be perpetrated by a variety of actors (European Parliament, 2021), including family members, acquaintances, current or former partners, colleagues, classmates, and anonymous internet users.

In the digital space, women are constantly exposed to forms of humiliation simply due to their online presence (Bainotti & Semenzin, 2021; Eikren, Ingram-Waters, 2016; Goulds et al., 2020; Morahan-Martin, 2000). Moreover, Cyber-VAWG is characterized by a continuum between online and offline life: women who experience violence offline are often also victims of online violent behaviors (FRA, 2018). As with offline violence, Cyber-VAWG manifests in multiple forms, including hate speech, body-shaming, slut-shaming, doxing, cyberstalking, sextortion, gender trolling, technology-facilitated sexual violence, and the non-consensual dissemination of intimate images (revenge porn), thereby reproducing the structural inequality systems of society (Gius, 2021; Jane, 2016; 2017; Morahan-Martin, 2000).

Acts of Cyber-VAWG are characterized by high reproducibility, customization, and dissemination, traits typical of digital content (Jenkins, Ford, & Green, 2013). Victims of cyber abuse are often left without means of defense, particularly in the absence of appropriate legislative frameworks and protection systems. This has concrete consequences for the well-being of victims, who may experience depression, self-harm, and, in some cases, suicide (Citron & Franks, 2014; Vakhitova et al., 2021). There are also significant repercussions for gender equality, as such acts negatively impact women’s and girls’ participation in public discourse (ElSherief et al., 2017), with self-censorship often adopted as a preventive strategy (EIGE, 2017).

 

Bibliography

Bainotti, L., & Semenzin, S. (2021). Violenza di genere online: il ruolo delle piattaforme digitali e la misoginia nella manosphere. In Cannito, M., & Bainotti, L. (Eds.), Genere e tecnologia: nuove frontiere della violenza e della discriminazione. Milano: FrancoAngeli

Citron, D. K., & Franks, M. A. (2014). Criminalizing revenge porn. Wake Forest Law Review, 49, 345.

Cuenca-Piqueras, C., Fernández-Prados, J. S., & González-Moreno, M. J. (2020). Face-to-face versus online harassment of European women: Importance of date and place of birth. Sexuality & Culture, 24(1), 157-173.

CYBERSAFE project (2020). Cyber Violence against Women & Girls Report: Changing Attitudes among teenagers on Cyber-VAWG. 

EIGE (2017). Cyber violence against women and girls.

European Parliament (2021). Combating gender-based violence: Cyber violence. 

Eikren, E., & Ingram-Waters, M. (2016). Dismantling ‘You get what you deserve’: Towards a feminist sociology of revenge porn. Ada: A Journal of Gender, New Media, and Technology, 10, 1–18.

ElSherief, M., Belding, E., & Nguyen, D. (2017). #notokay: Understanding gender-based violence in social media. In Proceedings of the Eleventh International AAAI Conference on Web and Social Media.

Fansher, A. K., & Randa, R. (2019). Risky social media behaviors and the potential for victimization: A descriptive look at college students victimized by someone met online. Violence and Gender, 6, 115–123.

FRA (2018). Fundamental Rights Report 2018.

Gius, C. (2021). Addressing the blurred question of ‘responsibility’: Insights from online news comments on a case of nonconsensual pornography. Journal of Gender Studies, 31(2), 193-203.

Goulds et al. (2020). Free to be online? Girls’ and young women’s experiences of online harassment.

Jane, E. A. (2016). Misogyny online. A short (and brutish) history. London: Sage

Jane, E. A. (2017). ‘Dude stop the spread’: Antagonism, agonism, and #manspreading on social media. International Journal of Cultural Studies, 20(5), 459-475.

Jenkins, H., Ford, S., & Green, J. (2013). Spreadable media: Creating value and meaning in a networked culture. New York: NYU Press

Morahan-Martin, J. (2000). Women and the internet: Promise and perils. CyberPsychology & Behavior, 3(5), 683–691.

Vakhitova, Z. I., Alston-Knox, C. L., Reeves, E., & Mawby, R. I. (2021). Explaining victim impact from cyber abuse: An exploratory mixed methods analysis. Deviant Behavior. Advance online publication.

HATE SPEECH

The term “hate speech” generally refers to speech that expresses hatred and intolerance toward an individual or a group, which risks provoking violent reactions (Pino, 2008), particularly targeting historically vulnerable groups such as women and minorities, and contributing to their social exclusion (Waldron, 2012).

Although the concept is typically applied to speech in digital contexts, an initial definition can already be found in Recommendation No. 20 of the Committee of Ministers of the Council of Europe from 1997¹, which states:
The term hate speech should be understood as encompassing all forms of expression that spread, incite, promote, or justify racial hatred, xenophobia, antisemitism, and other forms of hatred based on intolerance, including intolerance expressed through aggressive nationalism and ethnocentrism, discrimination, and hostility against minorities, migrants, and peoples arising from migration flows.

With the rise of the internet and later social media, hate speech has acquired new specificities. UNESCO identifies four essential differences between hate speech that spreads through the web and “traditional” hate speech (Gagliardone et al., 2015):

1) Persistence over long periods and across different formats (platforms, messaging apps, photo and video content, etc.), circulating transversally across the web;

2) The possibility that an expression of hate removed from one platform can resurface or be retrieved on another;

3) The possibility of anonymity, which allows the authors of hate speech to feel protected and comfortable acting behind a screen;

4) The ability to transcend national borders, a factor that complicates the identification of legal mechanisms to counter it (Ziccardi, 2016).

 

Notes

1: https://rm.coe.int/1680505d5b

 

Bibliography

Gagliardone, I., Gal, D., Alves, T., & Martinez, G. (2015). Countering online hate speech. https://unesdoc.unesco.org/ark:/48223/pf0000233231

Pino, G. (2008). Discorso razzista e libertà di manifestazione del pensiero. Politica del Diritto, XXXIX(2), 287-305.

Waldron, J. (2012). The Harm in Hate Speech. Cambridge University Press.

Ziccardi, G. (2016). L'odio online. Violenza verbale e ossessioni in rete. Milano: Raffaello Cortina Editore.

 

BODY SHAMING

Body shaming refers to the ridicule, contempt, or devaluation of an individual based on physical characteristics such as weight, body shape, skin, facial features, and other aesthetic attributes considered “non-conforming” to socially and aesthetically accepted standards (Grabe, Ward, & Hyde, 2008). Women are particularly targeted by body shaming, as they are subject to greater scrutiny than men, with a predominant emphasis on thinness and youth as beauty criteria.

The roots of body shaming lie in a set of cultural and social norms that promote conformity to traditional beauty standards, such as slimness and youthful appearance (Cash & Smolak, 2011). The intense pressure to adhere to these aesthetic ideals creates an environment in which individuals who do not meet such standards are frequently subjected to criticism and contempt. Within this context, digital environments play a central role: media representations of bodies—both in advertising and on social networks—constantly reinforce certain aesthetic ideals, increasing the pressure on individuals who do not conform to those models. In particular, the use of social media has emerged as a crucial factor in intensifying body shaming, through the sharing of idealized images and the promotion of an aesthetic that excludes physical diversity (Fardouly et al., 2015).

In terms of its effects, body shaming constitutes a form of psychological violence that negatively impacts the self-esteem, mental health, and overall well-being of those affected. Victims of body shaming are often vulnerable to psychological issues such as depression, anxiety, body dysmorphia, and eating disorders (Tiggemann & Slater, 2013). Body shaming thus contributes to a culture of shame, which compels individuals to hide their bodies in order to avoid external judgment.

 

Bibliography

Cash, T. F., & Smolak, L. (2011). Body Image: A Handbook of Theory, Research, and Clinical Practice. The Guilford Press

Fardouly, J., Diedrichs, P. C., Vartanian, L. R., & Halliwell, E. (2015). Social comparisons on social media: The impact of Facebook on young women's body image concerns and mood. Body Image, 13, 38-45

Grabe, S., Ward, L. M., & Hyde, J. S. (2008). The role of the media in body image concerns among women: A meta-analysis of experimental and correlational studies. Psychological Bulletin, 134(3), 460–476

Tiggemann, M., & Slater, A. (2013). NetGirls: The Internet, Facebook, and body image concern in adolescent girls. International Journal of Eating Disorders, 46(6), 630-633.

NON-CONSENSUAL DISTRIBUTION OF INTIMATE MATERIAL

The Non-Consensual Distribution of Intimate Material (NCDIM) refers to the dissemination of visual content (images, videos) of an intimate and personal nature without the consent of the individuals depicted (Citron & Franks, 2014).

This form of abuse includes both the distribution of intimate images obtained without consent and the dissemination of images originally shared consensually, typically within the context of intimate relationships. When the distribution concerns images consensually shared between intimate partners, the phenomenon is often referred to as revenge porn, although this term is considered misleading. The expression revenge porn links the deliberate act of disseminating intimate material with the concepts of revenge and pornography. On the one hand, it implies a justification for the perpetrator (revenge, typically following the end of a relationship), and on the other hand, it contributes to the victimisation of the individual depicted. The use of the term pornography in reference to sexually explicit images misleadingly suggests that the subject intended their public dissemination, even though such content was not consciously produced for public exposure.

Non-consensual distribution is characterised by four key elements:

1) The absence of consent from the person depicted, who has not authorised the distribution of the material;

2) The content involves sexual acts, nudity, or intimate situations that were intended to remain private;

3) The act is aimed at humiliating, blackmailing, or taking revenge on the victim;

4) The non-consensual sharing of intimate material causes serious harm to the victim, including psychological damage (anxiety, depression, post-traumatic stress disorder, suicidal ideation), social damage (stigmatization, loss of reputation, isolation), professional damage (dismissal or difficulty finding employment), and legal challenges (difficulty in having content removed, with consequences for the right to be forgotten).

In Italy, the non-consensual distribution of intimate material has been recognised as a criminal offence under Article 612-ter of the Italian Penal Code, introduced by Law No. 69 of 19 July 2019, commonly known as the Codice Rosso (Red Code)¹.

 

Notes

1: https://www.gazzettaufficiale.it/eli/id/2019/07/25/19G00076/sg

 

Bibliography

Citron, D. K., & Franks, M. A. (2014). Criminalizing revenge porn. Wake Forest Law Review, 49, 345.

CYBERSTALKING

Cyberstalking is a form of harassment carried out through digital and technological means. It is characterised by repeated behaviours perpetrated by the same individual towards another person, with the intent to harass, intimidate, persecute, surveil, exert control, or establish unwanted communication or contact, through harmful actions that cause the victim to feel threatened, distressed, or unsafe in multiple situations. The phenomenon is closely linked to gender-based violence: it primarily affects women, who, on the basis of their gender or its intersection with other factors, are targeted by men driven by a sense of ownership or by power dynamics. A survey conducted by the European Union Agency for Fundamental Rights in 2014 found that 5% of women in Europe had experienced online stalking at least once in their lifetime. Moreover, cyberstalking is one of the coercive control strategies used in abusive intimate relationships: seven out of ten women who have experienced cyberstalking have also suffered at least one form of physical and/or sexual violence at the hands of an intimate partner (FRA, 2014).

This phenomenon may involve various forms of control, such as the covert installation of apps on the targeted person’s device to monitor and track her movements, activities, and social interactions. It can also include hacking or cracking into the victim’s device to gain unauthorised access to personal communications or data stored on the device or online. Additional practices include the remote control of webcams and the use of smart devices to listen in on conversations. Cyberstalking may also manifest through a pervasive and intrusive presence on social media: the woman’s accounts may be monitored, comments posted under her content, obsessive tagging applied—even through fake accounts—and involvement in the same online groups.

Cyberstalking can have severe consequences for the mental health and emotional well-being of the targeted individual, who may feel trapped and deprived of her freedom, as well as subject to ongoing anxiety and distress. The anonymity and immediate access afforded by the internet amplify the reach of this violence, making it harder for victims to escape the stalker’s control. The constant feeling of being watched online can lead to high levels of stress and psychological tension. Additionally, such violence may force the affected woman to alter her behaviours and habits—both in digital and physical environments—in an effort to avoid any contact with the cyberstalker; this could involve changing her phone number or closing social media profiles to preserve her safety. Cyberstalking therefore undermines women’s ability to enjoy their individual rights.

In Italy, cyberstalking is classified as a criminal offence and falls under the category of acts of persecution. It is governed by Article 612-bis¹ of the Italian Penal Code, introduced by Decree-Law No. 11 of 2009, converted into Law No. 38 of 2009. The Supreme Court has ruled that “harassment perpetrated in the virtual world of the Web holds the same criminal relevance as that in the real world.” However, current laws and protection mechanisms are not always adequate to address the speed and complexity of the digital world, often leaving victims in an even more vulnerable position.

 

 Notes

1: https://www.gazzettaufficiale.it/atto/serie_generale

 

Bibliography 

EIGE (2022). Cyber Violence against Women and Girls: Key Terms and Concepts 

FRA (2014). Violence against women: an EU-wide survey. Main results report 

DOXING

Doxing refers to the intentional and non-consensual publication and dissemination of an individual’s personal information online, such as name, phone number, email address, and home address. While such data may already exist on the internet, it is often available in forms that restrict access or are difficult to trace. The term doxing—also spelled doxxing or d0xing—originates from the expression dropping documents or dropping dox, and has its roots in 1990s hacker culture, when it was used to reveal the identities of previously anonymous individuals. Today, doxing has evolved into a tool of digital violence capable of turning a person’s private life into a public target, violating their privacy and exposing them to potential harassment. This practice disproportionately affects women, who are often subjected to targeted attacks aimed at intimidating, silencing, or harming them. Doxing frequently occurs within the context of intimate partner violence.

Three main types of doxing can be identified:

  • Deanonymizing doxing: the publication of information that reveals the identity of a previously anonymous individual.
  • Targeting doxing: the dissemination of personal data that exposes specific and private aspects of a person’s life, thereby depriving them of privacy. 
  • Delegitimizing doxing: the sharing of information intended to damage a person’s credibility or reputation.

Doxing constitutes a clear violation of the European Union’s General Data Protection Regulation (GDPR)¹, which safeguards the privacy and personal data of its citizens. According to the GDPR, the unauthorised disclosure of personal—especially sensitive—information is an offence that may lead to legal and administrative sanctions. Article 5 of the GDPR outlines key principles, such as purpose limitation and the protection of personal data, both of which are breached in cases of doxing.

Beyond the legal framework, it is crucial to address doxing as a form of violence whose repercussions can be far-reaching. Individuals subjected to doxing may experience ongoing vulnerability and psychological distress due to the fear of being located or threatened by people in possession of their private information without consent. This can also lead to a restriction of women’s and girls’ freedom of expression, as fear for one’s privacy or safety may result in self-censorship or withdrawal from public discourse. Furthermore, the loss of control over one’s data, image, and reputation can diminish self-esteem and generate discomfort in everyday social interactions.

According to a report by the Pew Research Center, approximately 21% of internet users have experienced some form of doxing. There is a scarcity of data that allow for a gendered analysis of the phenomenon. However, a study by Amnesty International involving 4,000 women aged between 18 and 55 across eight countries (Denmark, Italy, New Zealand, Poland, the United Kingdom, Spain, Sweden, and the United States) found that 26% of the respondents had been victims of doxing.

 

Notes

1: General Data Protection Regulation (GDPR)

 

Bibliography

Douglas, D. M. (2016). Doxing: A conceptual analysis. Ethics and Information Technology, 18(3), 199-210.

Amnesty International Italia. (2017). https://www.amnesty.it/ricerca-amnesty-international-rivela-limpatto-allarmante-delle-molestie-online-le-donne/

SEXTORTION

The term “sextortion” refers to the threat of disseminating sexually explicit images of an individual through technological means in order to coerce them into complying with specific demands—such as sharing additional intimate images, performing unwanted acts, or paying a ransom. This form of violence can occur in a variety of contexts, including intimate partner abuse, cyberbullying, the outing of LGBTQIA+ individuals or sex workers, online dating, sexual trafficking, online sexual exploitation of minors, hacking, and organised crime. In many cases, perpetrators falsify their identity, establish a relationship with the targeted individual, and obtain explicit images, which are subsequently used as leverage for blackmail.

Women are disproportionately affected by this form of violence. Sextortion also encompasses a dimension of gender-based violence—not only because it may take place in the context of abusive intimate relationships, but also due to the intense social stigma and shame attached to women’s sexuality and gender expression. The threat of public exposure of intimate content often carries particularly severe consequences for women, further compounding their vulnerability in both digital and offline spaces.

 

Bibliography

EIGE (2022). Cyber Violence against Women and Girls: Key Terms and Concepts 

GENDER TROLLING

Gender trolling is a form of online harassment or abuse that specifically targets the victim's gender, with the aim of insulting, intimidating, humiliating, or psychologically damaging the individual. This phenomenon predominantly affects women and individuals who do not conform to traditional gender norms, and it can include sexist insults, threats of sexual violence, body shaming, and other forms of psychological violence.

According to Mantilla (2013), gender trolling differs from more generic forms of trolling due to its intensity, persistence, and the use of explicit threats, which often include references to sexual violence, sexist insults, and practices such as doxxing and the non-consensual dissemination of the victim's personal information. Emma A. Jane (2017) emphasizes that gender trolling is not merely provocative behavior but a deliberate strategy aimed at silencing women and individuals who do not conform to traditional gender norms, particularly those active in public domains such as journalism, politics, and activism. The phenomenon is amplified by the dynamics of anonymity and virality on digital platforms: Citron (2014) includes it within a broader category of online hate crimes, highlighting how online threats have a tangible impact on the lives of victims, leading them to modify their behavior, limit their public presence, and, in many cases, abandon certain careers or activities out of fear of further attacks.

Gender trolling manifests through a set of specific characteristics, which often overlap and reinforce each other:

1) Sexist insults and threats of violence: Victims are targeted with insults that emphasize their gender and sexuality, using expressions that aim to delegitimize or undermine their competence. Frequent threats of rape or death are also used as intimidation tactics.

2) Coordination and persistence: Unlike generic trolling, gender trolling is often organized and involves groups of people coordinating to attack a victim systematically. This form of harassment can last for extended periods, with devastating effects on the victim's psychological health.

3) Doxxing and non-consensual dissemination of intimate material: One of the most severe tactics is the publication of private information with the intent to expose the victim to tangible dangers in daily life. This can also include the non-consensual sharing of intimate material, exacerbating the psychological, social, and reputational harm.

4) Effects of social censorship: Gender trolling not only has an individual impact but also contributes to reinforcing gender inequality by excluding victims from public discourse and limiting their freedom of expression. This phenomenon is particularly evident in hate campaigns directed at public figures.

 

Bibliography

Citron, D. K. (2014). Hate Crimes in Cyberspace. Harvard University Press.

Jane, E. A. (2017). Misogyny Online: A Short (and Brutish) History. SAGE.

Mantilla, K. (2013). Gendertrolling: Misogyny adapts to new media. Feminist Studies, 39(2), 563-570.

DEEPFAKE

The term "deepfake" refers to a technology based on artificial intelligence and machine learning that allows for the creation of videos, images, or audio that emulate, with great precision, the physical and behavioral characteristics of a person (Chesney & Citron, 2019; Maras & Alexandrou, 2018). This technique relies on deep learning algorithms that are capable of learning from a person’s visual and auditory features to produce highly realistic content, leaving few traces of manipulation (Chawla, 2019). The generated content appears both visually and audibly convincing, making it difficult to distinguish between authentic and falsified material.
The deepfake phenomenon has raised global concerns, particularly regarding its misuse in the context of misinformation, digital fraud, and, alarmingly, gender-based violence (European Parliament, 2021). According to the European Parliament, the technology used to generate sexually explicit deepfakes is predominantly developed to manipulate images of female bodies, a phenomenon exacerbated by the greater accuracy of detection systems in identifying male faces compared to female ones, thus limiting the effectiveness of countermeasures. The spread of non-consensual, sexually explicit, falsified content depicting women and girls serves as a tool of online violence, operating at the intersection of defamation, extortion, and intimidation, harming the reputation, dignity, and privacy of both adult and minor victims. Research company Sensity AI1  estimates that between 90% and 95% of all deepfakes involve non-consensual pornography. The dissemination of such falsified content can significantly impact the physical and psychological health of the depicted individuals, partly due to its pervasiveness, transmediality, and the difficulty of removing it from the internet (for further details, refer to the "Non-consensual dissemination of intimate material" section).
What makes this type of violence widespread and pervasive is the accessibility, in open-source mode, to software for creating realistic, high-quality deepfakes, enabling even users with limited technical skills and no artistic experience to modify audio-visual material with great precision, replace faces, alter expressions, and synthesize voices (Westerlund, 2019).
From a legal standpoint, the rapid spread of deepfakes has raised significant legal challenges, particularly concerning privacy protection and the safeguarding of women's rights. At the European level, the General Data Protection Regulation (GDPR)3 provides guidelines for the protection of personal data, but the emergence of deepfakes has highlighted the urgency of more targeted regulations to counteract the non-consensual dissemination of manipulated content. In Italy, with the introduction of the "Codice Rosso" (Law No. 69 of July 19, 2019)3, Article 612-ter was added to the Penal Code, criminalizing the unauthorized dissemination of sexually explicit images or videos, although it does not yet specifically address the risks related to deepfakes.

 

Notes

1: Giorgio Patrini, Mapping the Deepfake Landscape, Sensity (blog), October 7, 2019

2: General Data Protection Regulation (GDPR)

3: Article 612-ter, Italian Penal Code

 

Bibliography

Chawla, R. (2019). Deepfakes: How a pervert shook the world. International Journal of Advance Research and Development, 4(6), 4–8.

Chesney, R., & Citron, D. K. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. Maryland Law Review, 77, 1-40.

European Parliament (2021). Tackling deepfakes in European policy

Maras, M. H., & Alexandrou, A. (2019). Determining authenticity of video evidence in the age of artificial intelligence and in the wake of Deepfake videos. International Journal of Evidence & Proof, 23(3), 255–262.

Westerlund, M. (2019). The emergence of deepfake technology: A review. Technology Innovation Management Review, 9(11).

UPSKIRTING

The term "upskirting" refers to a form of gender-based violence facilitated by technology, which involves the non-consensual acquisition of images or videos of a person's intimate body parts, generally female, in contexts where the victim would not expect to be exposed (McGlynn & Rackley, 2017; Henry et al., 2020). This practice is perpetrated using devices such as smartphones or hidden cameras, and aims at the online dissemination or sharing within private groups, contributing to the commodification and objectification of the female body.
Recent studies highlight that upskirting has serious psychological consequences, including anxiety and depression; additionally, it leads to a loss of trust in institutions and a reduction in participation in public and digital life (Powell & Henry, 2017; Powell, Henry & Flynn, 2019; Henry & Flynn, 2020). Furthermore, the phenomenon is often minimized by the media and society, perpetuating a culture of tolerance toward sexual harassment (McGlynn et al., 2019).
At the European level, Directive 2011/93/EU on combating the sexual abuse and exploitation of children mentions the need to criminalize unlawful recording behaviors, but does not explicitly address the upskirting phenomenon. In Italy, there is no specific regulation addressing upskirting, but the phenomenon can be prosecuted under Article 612-ter of the Penal Code introduced by Law No. 69/2019 (Codice Rosso)1, which punishes the unlawful dissemination of intimate images. However, the absence of an explicit provision leaves room for interpretation and hinders the effectiveness of legal protection for victims (Femicide Commission, 2020).

 

Notes

1: https://www.gazzettaufficiale.it/eli/id/2019/07/25/19G00076/sg

 

Bibliography

Commissione Femminicidio (2020). Misure per rispondere alle problematiche delle donne vittime di violenza, dei centri antiviolenza, delle case rifugio e degli sportelli antiviolenza e antitratta nella situazione di emergenza epidemiologica da COVID-19.

Flynn, A., Powell, A., & Henry, N. (2021). Image-Based Sexual Abuse: An International Overview. Journal of Criminology, 54(2), 134–152

Henry, N., Flynn, A. (2020). Image-Based Sexual Abuse: A Feminist Criminological Approach. Routledge.

Henry, N., Flynn, A., & Powell, A. (2020). Technology-Facilitated Sexual Violence: A Framework for Understanding Perpetration and Victimization. New Media & Society, 22(7), 1344–1365.

McGlynn, C., & Rackley, E. (2017). Image-Based Sexual Abuse. Oxford Journal of Legal Studies, 37(3), 534–561.

McGlynn, C., Rackley, E., & Houghton, R. (2019). Beyond ‘Revenge Porn’: The Continuum of Image-Based Sexual Abuse. Feminist Legal Studies, 27(1), 25–46.

Powell, A., Henry, N. (2017). Sexual Violence in a Digital Age. Palgrave Macmillan.

Powell, A., Henry, N., Flynn, A. (2019). The Role of Digital Platforms in Facilitating Image-Based Sexual Abuse. Journal of Criminology, 52(3), 276–292.

CATCALLING

Catcalling is recognized as a form of gender-based violence and sexual harassment that targets women and girls in public spaces (Fileborn & O’Neill, 2021), and is therefore often referred to as “street harassment.” Although there is still no universally agreed-upon definition of the phenomenon (Logan, 2015), it can be described as a set of verbal and non-verbal behaviors, including evaluative and objectifying comments, whistling, noises, persistent staring, sexualized gestures, and, in some cases, even following the victim. These actions, typically perpetrated by strangers, aim to emphasize the sexualized dimension of the female body (Walton & Pedersen, 2021; Chaudoir & Quinn, 2010).

Despite the limited number of studies on the topic, existing literature highlights that street harassment is a widespread and systemic phenomenon, with global implications that cut across age groups and socio-cultural contexts. Victims may experience a range of emotional reactions, including fear, discomfort, and shame. The difficulty in defining the phenomenon also stems from its subjective and situated nature: the perception of harassment is highly dependent on context and the individual experience of the victim. Moreover, cultural normalization, the tendency to trivialize the phenomenon, and the reduced control over interactions in public as compared to private spaces all contribute to its persistence (Vera-Gray, 2016).

The limited academic attention to catcalling contributes to the marginalization of women’s experiences, often leaving them to develop individual strategies to avoid or mitigate the discomfort generated by such episodes (Farmer & Smock Jordan, 2017).

 

Bibliography

Chaudoir, S. R., & Quinn, D. M. (2010). Bystander Sexism in the Intergroup Context: The Impact of Cat-calls on Women’s Reactions Towards Men. Sex Roles, 62, 623-634.

Farmer, O., & Smock Jordan, S. (2017). Experiences of Women Coping With Catcalling Experiences. Journal of Feminist Family Therapy, 29(4), 205-225.

Fileborn, B., & O'Neill, T. (2021). From “Ghettoization” to a Field of Its Own: A Comprehensive Review of Street Harassment Research. Trauma, Violence, & Abuse, 22, 1-14.

Vera-Gray, F. (2016). Men's stranger intrusions: Rethinking street harassment. Women's Studies International Forum, 58, 1-20.

Logan, L.S. (2015). Street Harassment: Current and Promising Avenues for Researchers and Activists. Sociology Compass, 9, 196-211

Walton, K. A., & Pedersen, C. L. (2021). Motivations Behind Catcalling: Exploring Men’s Engagement in Street Harassment Behaviour. Psychology & Sexuality, 1–15.