Gendered Privacy Evaluation Framework Methodology

Version: June 2024

Why is this framework needed? 

The digital landscape continues to grow exponentially, transforming the way individuals interact, communicate, and access information. With this rapid expansion comes an alarming increase in technology-facilitated gender-based violence (TFGBV), disproportionately affecting women and girls worldwide. This pervasive issue underscores the critical need to prioritize gendered privacy considerations in the design and implementation of digital tools, applications, and online platforms. Manifestations of TFGBV stem from different contexts, but they share a commonality: the infringement of the targeted individual's privacy rights. A central element of TFGBV is unauthorized access: the survivor has not given permission for others to access them, their accounts, their email, or other personal spaces. Ensuring the right to privacy is therefore essential for safeguarding women and gender-diverse individuals from forms of TFGBV such as non-consensual image sharing, doxxing, harassment, and stalking.

Research indicates a gender disparity in digital privacy concerns: women express greater concern about their online privacy than men. Despite this heightened concern, women often have a lower level of awareness of potential technology, data, and design threats. This lack of awareness is exacerbated by the male-dominated nature of the technology industry, and the disparity perpetuates a cycle of disengagement in which women obtain lower levels of protection despite being disproportionately affected by TFGBV.

Online safety tools and interventions play a pivotal role in empowering users to navigate digital spaces safely and in mitigating the risks of online harassment and abuse. By providing features such as filtering, blocking, and reporting mechanisms, these tools offer practical solutions for individuals, particularly women and marginalized groups, who are disproportionately affected by online threats.

However, without a standardized framework for evaluation, it becomes difficult to assess the impact and efficacy of these interventions in addressing the diverse needs of users, particularly women and others who are disproportionately affected by online safety issues. Existing privacy frameworks often fail to account for the privacy concerns of people who experience heightened risk. This evaluation framework aims to offer guidance that can help technologists, researchers, and designers enhance their capacity in addressing the concerns and meeting the needs of some of technology's most vulnerable users.

For these reasons, developing a gendered privacy evaluation framework for online safety tools is both important and timely.

What types of tools is this framework applicable to? 

The tool scope for the evaluation framework encompasses a diverse range of online safety tools and applications designed to address online safety concerns broadly, or which may address one or more key online safety concerns (such as online harassment or non-consensual intimate image sharing) or specifically target vulnerable populations (such as women or gender minorities). This includes: 

Geographical Scope:

User Base Consideration:

What is a gender approach to privacy?

A gender approach to data privacy involves considering the ways in which individuals' gender identities and experiences can impact their privacy rights and vulnerabilities in the digital realm. Here are some key aspects of a gender approach to data privacy that we aim to address with this gendered privacy evaluation framework:

  1. Understanding gendered risks: Recognizing that certain data privacy risks may disproportionately affect individuals based on their gender identity or expression. For example, individuals may face unique risks related to the exposure of sensitive information if their gender identity is not respected or disclosed without their consent.
  2. Intersectionality: Acknowledging that gender intersects with other aspects of identity, such as race, ethnicity, sexual orientation, disability, and socioeconomic status, to shape individuals' experiences of privacy and data protection. Intersectional analysis helps uncover how multiple forms of discrimination can compound privacy risks for marginalized groups.
  3. Inclusive policy and regulation: Developing data privacy policies and regulations that address the specific needs and concerns of diverse gender identities. This may include measures to prevent discrimination, protect against technology facilitated gender-based violence or stalking online, and ensure that individuals have control over how their gender-related information is collected, stored, and shared.
  4. Data security: Implementing robust data security measures to safeguard user information from unauthorized access, misuse, or exploitation. This includes encryption, access controls, and regular audits to identify and address vulnerabilities in systems that store or process sensitive data.
  5. Support and education: Supporting individuals, especially those from marginalized gender groups, with knowledge and resources to protect their privacy rights online. This may involve providing guidance on privacy settings, safe browsing practices, and how to respond to privacy violations or harassment.
  6. Research and awareness: Conducting human centered design research to better understand the intersection of gender and data privacy, including the unique challenges faced by different gender identities and the impacts of data-driven technologies on gender equality and social justice. Increasing awareness of these issues among policymakers, technology developers, and the general public is also essential for driving positive change.

International human rights frameworks

From an international human rights framework perspective, gendered privacy is justified by several key principles and instruments.

  1. Universal Declaration of Human Rights (UDHR): Article 12 of the UDHR recognizes the right to privacy, stating that "no one shall be subjected to arbitrary interference with his privacy, family, home or correspondence." Gendered privacy aligns with this principle by acknowledging that privacy violations often have gender-specific implications, such as the disproportionate impact of technology-facilitated gender-based violence on women and marginalized groups.
  2. International Covenant on Civil and Political Rights (ICCPR): Article 17 of the ICCPR protects individuals against unlawful or arbitrary interference with their privacy, family, home, or correspondence. Gendered privacy extends this protection by addressing the unique privacy concerns faced by individuals based on their gender, including online harassment, surveillance, and discrimination.
  3. United Nations Resolutions and Declarations: Resolutions of the United Nations General Assembly and Human Rights Council recognize the importance of privacy rights in the digital age and highlight the specific vulnerabilities of women and marginalized groups to privacy violations. These resolutions call for preventive measures and remedies to address gendered privacy concerns.
  4. Yogyakarta Principles: These principles provide guidance on the application of international human rights law in relation to sexual orientation, gender identity, gender expression, and sex characteristics. Principle 6 specifically addresses privacy rights, emphasizing the need to protect individuals' privacy regardless of their gender identity or sexual orientation.

Inspiration 

We reviewed existing frameworks and standards and used them as resources to inform this framework and to address existing gaps in the space.

Theory of change 

The theory of change underlying the gendered privacy evaluation framework is based on the belief that by developing a systematic approach to addressing gender-specific privacy concerns in digital spaces, we can effectively mitigate the risks and harms experienced by women and gender-diverse individuals. This framework aims to bring about positive outcomes through several key steps:

  1. Recognition of gendered privacy concerns: The first step involves acknowledging and understanding the unique privacy risks faced by women and gender minorities online. By recognizing these gender-specific vulnerabilities, stakeholders can prioritize the development of interventions tailored to address these concerns.
  2. Incorporation of gender perspectives: The framework emphasizes the importance of incorporating gender perspectives into the design, implementation, and evaluation of online safety tools and applications. By considering the diverse experiences and needs of women and marginalized groups, developers can create more effective and inclusive solutions.
  3. Enhanced protection and empowerment: Through the implementation of gender-sensitive privacy measures, individuals, particularly women and gender-diverse individuals, are better equipped to protect themselves from online threats and violations. This empowerment fosters a sense of agency and control over one's online experiences, contributing to greater digital autonomy and resilience.
  4. Advocacy for gender-inclusive practices: By setting clear standards and criteria for evaluation, the framework aims to encourage tech companies and policymakers to prioritize women's rights in their product design and development processes. This advocacy for gender-inclusive practices not only improves the quality of online safety tools but also signals a commitment to promoting gender equality and social responsibility within the technology sector.

Who is the framework for and how can they use it? 

  1. Tech companies and developers: Tech companies and developers can utilize this framework to integrate gender-sensitive privacy measures into their online safety tools and applications. By considering the unique privacy concerns of women and gender-diverse individuals, they can better protect their users from online threats and harassment. They can treat the framework as a checklist and conduct internal audits of how their technology is designed.
  2. Policymakers and regulators: Policymakers and regulators can leverage this framework and results of the evaluation framework to develop gender-inclusive policies and regulations that govern digital privacy and online safety. By incorporating gender-sensitive approaches into their legislative and regulatory frameworks, they can create a more enabling environment for the protection of privacy rights.
  3. Civil society organizations (CSOs) and advocacy groups: CSOs and advocacy groups can use this framework to assess tech companies and use the results to raise awareness about gendered privacy concerns, advocate for policy changes, and hold tech companies and policymakers accountable. 
  4. End users and consumer groups: End users and consumer groups, including women and gender-diverse individuals, can benefit from this framework by better understanding their rights, and as a result be equipped to demand gender-inclusive online safety tools and applications, and raise awareness about gendered privacy concerns. By understanding their rights and responsibilities in digital spaces, they can navigate online environments more safely and confidently.
  5. International organizations: International organizations, such as United Nations Population Fund, can support the further development and implementation of this framework through advocacy, capacity-building, and resource mobilization.

Guide for conducting the research

Preparation 

  1. Define the expected impact and outcomes of your evaluation project.
  2. Write a short project summary to keep the scope and goals clear.
  3. Identify what could go wrong and make plans to mitigate those risks.
  4. Select the tools and services you will evaluate.
  5. Plan your research approach based on the resources available to you.
  6. Decide whether you need additional tools or methods to support your research and analysis.
  7. Set up a research document, ideally a spreadsheet, where you can record your findings.
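For step 7, a simple spreadsheet or CSV file is often enough. The sketch below shows one possible layout; the column names and the example row are illustrative assumptions, not part of the framework itself.

```python
import csv

# Illustrative layout for the research document described above.
# One row per (tool, criterion) pair, with the evidence that supports the score.
COLUMNS = ["tool", "criterion", "score", "evidence", "evidence_source", "notes"]

with open("findings.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    # Example row: a hypothetical tool scored against one criterion.
    writer.writerow([
        "ExampleTool",
        "Clearly stated policy commitment to privacy",
        75,
        "Privacy policy references human rights",
        "public website",
        "reviewed June 2024",
    ])
```

Keeping one row per tool-criterion pair makes it straightforward to filter by tool, average scores per criterion group, and trace each score back to its evidence.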

Research steps 

Depending on your available resources and relationship to the tools you are evaluating, your approach to the research process may vary.

Scenario A: If you are a civil society actor aiming to evaluate a set of tools to advocate for better regulation around gendered privacy, we recommend planning for three research phases. First, conduct an evaluation based on publicly available information. Then, share preliminary findings with the tools being evaluated for feedback, if no risks were identified in your preparatory steps. This step can help uncover evidence that may not have surfaced otherwise. Finally, integrate the feedback, finalize your findings, calculate the scores, analyze the results, and disseminate them according to your plans.

Scenario B: If you are an open-source tech development team seeking to self-evaluate your tool designed to assist women facing online harassment, which is still in development, consider using the framework as a checklist to achieve the best score before launching your product.

Scenario C: If you are an international organization aiming to support small tech startups and promote good practices around gendered privacy in tech tool development, consider adopting the Scenario A approach. However, during the second phase, researchers could collaborate with the tool teams, potentially allowing the final phase to be skipped.

What constitutes evidence? 

Overall, we aim to encourage tool teams to publish as much information as possible related to the standards outlined in this framework, as it contributes to the gendered privacy principles we wish to support with the framework. However, we understand that there might be risk scenarios where publishing some of the documentation might cause more harm than good. 

Evidence in this context refers to verifiable information or data that supports the findings and conclusions of the evaluation. This includes various forms of documentation, such as reports, surveys, interviews, case studies, user feedback, and any other relevant sources of information providing insight into the performance and impact of the evaluated tools. The evidence should be credible, reliable, and robust enough to support the assessment criteria and conclusions drawn from the evaluation process.

The type of evidence also depends on the scenario you fall under. In Scenario A, during the preparatory steps, you can decide if you are only considering publicly available information. In that case, when seeking company feedback, you need to clarify that you cannot consider documents and statements from the tool teams unless they are published. In Scenarios B and C, you can rely on internal documents, as you are either the tool team itself or working closely with them.

Scoring

0 (Inadequate): The platform does not meet the benchmark. This score indicates a complete lack of consideration for the specified criterion.
25 (Developing): The platform shows an early attempt at meeting the criterion, but progress is minimal and falls well short of implementation.
50 (Partial): The platform meets the criterion in part, but not in full. There is clear progress towards meeting it, though implementation remains incomplete.
75 (Substantial): The platform meets the criterion to a significant extent, largely implementing it with only minor shortcomings or deficiencies.
100 (Comprehensive): The platform meets the criterion unequivocally, fully implementing it without any shortcomings or deficiencies.
0 (Undeterminable): Researchers were not able to determine, using publicly available information and personal use of the platform, whether the platform meets the criterion.
N/A: The criterion does not apply to this platform. Excluded from scores and averages.
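The rubric above implies a simple aggregation rule: average the applicable scores and exclude N/A entries entirely. A minimal sketch, assuming scores are recorded as numbers and N/A as `None` (the function name and data representation are illustrative):

```python
def criterion_average(scores):
    """Average a group of criterion scores per the rubric above.

    Scores are 0, 25, 50, 75, or 100; N/A criteria are recorded as None
    and excluded from the average. "Undeterminable" results are entered
    as 0, as the rubric specifies.
    """
    applicable = [s for s in scores if s is not None]
    if not applicable:
        return None  # every criterion in the group was N/A
    return sum(applicable) / len(applicable)

# Example: one criterion fully met, one partially met, one not applicable.
print(criterion_average([100, 50, None]))  # → 75.0
```

Note that because "Undeterminable" scores 0, a group average cannot distinguish a platform that fails a criterion from one whose practices simply could not be verified; recording the raw per-criterion labels alongside the average preserves that distinction.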

Criteria

1. Tools should clearly commit to privacy, as a human right

Criterion: Does the tool establish a clearly stated and explicit policy commitment towards human rights, which encompasses privacy?
Why: This metric aims to find proof that the company has clearly stated policy commitments to both freedom of expression and information, as well as privacy.
How to measure: The tool does not need to mention human rights and privacy explicitly, but it must clearly state that human rights are considered part of its work. Look for relevant mentions in annual reports, privacy policies, ESG reports, etc.
How to find this information: Look in the company's annual reports, human rights briefs, or privacy policy for references to human rights or privacy commitments.

Criterion: Does the tool establish systems to uphold its pledges and build capacity regarding gendered privacy internally?
Why: This metric aims to ascertain if and how the tool supports its employees in understanding online gender-based violence and privacy.
How to measure: The tool should provide information about employee training on privacy issues related to technology-facilitated gender-based violence.
How to find this information: Look for evidence in the company's internal documentation regarding employee training and its commitment to privacy standards.

Criterion: Does the tool team assess gendered privacy risks associated with the tool before development begins?
Why: This ensures that risks related to gendered privacy are understood before development begins.
How to measure: Check for evidence of risk assessments or documentation detailing how gendered privacy risks are identified.
How to find this information: Search for reports, documentation, or research briefs regarding the tool's risk assessment processes.

2. Support structures for privacy and gender inclusivity

Criterion: Is the tool's grievance mechanism usable, transparent, and consistent, allowing users to lodge complaints if they believe their human rights have been negatively impacted by the company's policies or actions?
Why: Digital tools addressing women's safety are required to transparently outline a mechanism for addressing grievances related to gendered privacy and other human rights concerns.
How to measure: The tool should provide clear and accessible grievance mechanisms.
How to find this information: Review the tool's grievance policy or user interface to confirm the accessibility and transparency of grievance mechanisms.

Criterion: Does the tool provide 24/7 end-user support, including support for urgent needs and requests such as editing or deleting personal data?
Why: This metric ensures the tool provides sufficient support for users who may need to urgently edit or delete personal data.
How to measure: Assess whether the tool provides 24/7 user support and offers quick assistance with personal data management.
How to find this information: Look for the tool's support policy, or test the support system to confirm its availability and responsiveness.

Criterion: Does the tool clearly explain its reporting procedures for incidents of harassment, doxxing, stalking, and non-consensual sharing of intimate images?
Why: Clear reporting mechanisms are essential to address harassment and protect users from abuse online.
How to measure: The tool should explain how users can report incidents and make this information easily accessible.
How to find this information: Look for the tool's documentation on how to report incidents of harassment or abuse.

3. User education on gendered privacy

Criterion: Does the tool produce informative resources aimed at educating users on safeguarding against gendered privacy-related threats pertinent to the products or services it offers?
Why: Due to the extensive data digital tools possess about their users, they often become targets for malicious individuals. Educational resources assist users in safeguarding themselves against gender-based violence risks.
How to measure: The tool should produce resources such as tips, FAQs, and tutorials on privacy and account security.
How to find this information: Check the tool's website or user interface for clear, understandable educational content related to gendered privacy.

4. Data handling practices

Criterion: Does the tool ensure robust encryption for data both at rest and in transit, protecting sensitive information from unauthorized access or interception?
Why: Strong encryption helps protect sensitive user information from unauthorized access.
How to measure: The tool should implement encryption protocols such as AES or TLS to protect data both at rest and in transit.
How to find this information: Review the tool's security documentation to verify the encryption standards used for data protection.
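As one concrete, verifiable facet of "encryption in transit", a tool team can enforce a minimum TLS version in its client code. A minimal sketch using only Python's standard-library `ssl` module (the choice of Python and of TLS 1.2 as the floor are illustrative assumptions):

```python
import ssl

# Configure a client-side TLS context that refuses outdated protocol versions.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject anything older than TLS 1.2

# create_default_context() also enables certificate validation by default,
# which protects users against interception via forged certificates.
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True
```

A researcher evaluating a closed-source tool usually cannot inspect such code directly, which is why the criterion points to security documentation; for open-source tools (Scenario B), settings like these can be checked in the codebase itself.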

5. Gender-Inclusive and User-Centered Design

Criterion: Does the tool incorporate participatory methods and human-centered approaches to understand and address gender-based violence (GBV) and harmful practices (HP) in digital spaces, resulting in actionable outputs such as risk assessments and user profiles?
Why: Participatory and human-centered research helps in understanding digital risks and producing outputs like risk assessments.
How to measure: Check whether the development process includes community input, such as consultations or user feedback.
How to find this information: Look for research reports, methodology documents, or deliverables from the tool's development process.

6. Stakeholder engagement

Criterion: Does the tool team engage women's rights organizations and community stakeholders to gather feedback on product features and initiatives?
Why: Engagement with stakeholders ensures privacy concerns are addressed through collaboration with experts in the field.
How to measure: The tool should engage relevant stakeholders, particularly women's rights organizations, in its development process.
How to find this information: Review reports or statements from the tool's development team about stakeholder involvement.

Glossary 

  1. Policy commitment: A publicly available statement that represents official company policy which has been approved at the highest levels of the company. Source: Ranking Digital Rights’ Glossary
  2. U.N. Guiding Principles on Business and Human Rights' Operational Principle 16: it states that companies should adopt formal policies publicly affirming their commitments to international human rights principles and standards. Source: UNGPs on Business and Human Rights 
  3. Gendered privacy: Gendered privacy involves the protection of individuals' privacy rights with specific consideration for their gender identity and experiences. It addresses the unique privacy concerns and risks faced by individuals due to their gender, ensuring that privacy protections are inclusive and sensitive to gender-related factors. Source: 
  4. Technology-facilitated gender-based violence: it refers to the misuse of technology and digital spaces to perpetrate violence against individuals based on their gender. This form of violence encompasses various behaviors such as sextortion, cyberbullying, online harassment, and doxxing, among others. It invades both online and physical spaces, impacting the health, safety, and well-being of women and girls, as well as their communities. Source: UNFPA
  5. Gender-based violence (GBV): it refers to harmful acts directed at an individual based on their gender. It encompasses a wide range of physical, sexual, and psychological abuse, including domestic violence, sexual harassment and assault, trafficking, forced marriages, and female genital mutilation. GBV disproportionately affects women and girls but can also impact men, boys, and non-binary individuals. The violence is rooted in power imbalances and systemic gender inequality, and it can occur in various settings, including homes, workplaces, and online environments. GBV has severe physical, emotional, and social consequences, undermining the safety, dignity, and human rights of those affected.
  6. Harmful Practices: Actions, behaviors, or policies that result in negative consequences for individuals or groups, often perpetuating inequality, discrimination, or violence. In the context of gender and digital privacy, harmful practices may include online harassment, misuse of personal data, discriminatory algorithms, or any digital tool that exacerbates gender-based violence or privacy violations. These practices can undermine human rights, safety, and well-being, particularly for marginalized or vulnerable populations
  7. ICT assessment: Evaluations of Information and Communication Technology systems to determine their effectiveness, risks, and potential impacts, especially in the context of gender-based violence and privacy concerns.
  8. Gendered privacy governance: The policies and practices that address privacy concerns specifically related to gender issues.
  9. ESG reports: Environmental, Social, and Governance reports that detail a company’s performance in these areas.
  10. Digital Services Act (DSA): EU legislation aimed at creating a safer digital space by regulating online platforms.
  11. Bulk text and assumed consent: Terms referring to overwhelming or unclear user agreements that assume user consent without explicit approval.
  12. Data minimization: The practice of limiting data collection to only what is necessary.
  13. Personally Identifiable Information (PII): Data that can identify an individual, such as names, locations, IDs, or IP addresses.
  14. Universal Declaration of Human Rights (UDHR): An international document stating fundamental human rights to be universally protected.
  15. General Data Protection Regulation (GDPR): EU legislation on data protection and privacy.
  16. Encryption: The process of converting information or data into a code to prevent unauthorized access.
  17. Participatory methods: Approaches that actively involve stakeholders in the research or development process.
  18. Human-centered design: Designing products or systems with a focus on the needs, preferences, and limitations of the end-users.
  19. Mixed methods: Research approaches that combine qualitative and quantitative techniques.
  20. Grievance mechanisms: Procedures for addressing complaints and resolving issues raised by users.
  21. Stakeholder engagement: The process of involving individuals, groups, or organizations that may affect or be affected by a project or decision.
  22. Cybersecurity: The practice of protecting systems, networks, and programs from digital attacks.
  23. Human rights impact assessments: Evaluations conducted to understand the impact of activities or policies on human rights.
  24. User education on gendered privacy: Providing resources and information to help users safeguard against privacy threats related to gender.
  25. Digital intervention: The use of digital tools and technologies to address specific issues, such as gender-based violence.

Authorship 

Leave us a note at team@womensrightsonline.net