Pictured: Astrid Holzinger (Austrian Centre for Peace) & Nathan Coyle (AIT) presenting at “Centring Women, Peace and Security in the Humanitarian-Development-Peace Nexus” in Vienna
BY John Bölker / ON 10 MAY 2026
The disparities facing women in the digital world are well documented, and they are especially troubling when viewed through a peace and security lens. Globally, women make up less than 30 per cent of the tech workforce, and their representation drops sharply in senior technical and leadership roles. In artificial intelligence, women hold only around 22 per cent of roles, reflecting a persistent participation gap highlighted by the World Economic Forum.
This imbalance is not simply a matter of fairness. Recent research shows that societies with higher levels of gender equality experience significantly lower levels of conflict and greater stability, because inclusive governance strengthens social resilience. The digital sphere mirrors these dynamics. A global study by the Economist Intelligence Unit found that more than 70 per cent of women have experienced online harassment, with the highest rates among young women. UN Women identifies technology-facilitated violence as a growing barrier to women’s participation in political and peace processes, limiting their ability to organise, mobilise, and lead.
At the same time, algorithmic systems often encode or amplify bias. The “Gender Shades” study found facial recognition error rates above 34 per cent for darker-skinned women compared with under 1 per cent for lighter-skinned men. Audits of automated content moderation have also shown that women’s speech is disproportionately flagged or removed. These failures have real consequences for peacebuilding: when digital systems misclassify, silence, or expose women to harm, they restrict access to public life, reduce civic engagement, and weaken the inclusive processes essential for sustainable peace.
These imbalances directly shape who benefits from digital innovation and who is put at risk by it. If PeaceTech is to have any credibility, gender equality must be a non-negotiable red line from the start. A PeaceTech ecosystem built on biased data, exclusionary design, or digital environments unsafe for women risks reinforcing the very inequalities peacebuilding seeks to dismantle.
It was against this backdrop that participants came together at the Global Conference “Centring Women, Peace and Security in the Humanitarian-Development-Peace Nexus”, organised by the Global Network of Women Peacebuilders (GNWP). I had the opportunity to support my colleagues Astrid Holzinger of the Austrian Centre for Peace and Nathan Coyle of AIT in preparing a session on how digital tools can support women’s and youth empowerment in peacebuilding.
Participants from Armenia, Azerbaijan, Georgia, Moldova, Ukraine, Palestine, Uganda, South Sudan and other conflict-affected regions reflected on what digital transformation actually looks and feels like in fragile contexts. Their contributions painted a complex picture, not of technology alone, but of power, vulnerability, and possibility.
Astrid shared several practical examples of how PeaceTech is already supporting women’s participation and leadership in peacebuilding. She spoke about WANEP’s Early Warning and Response System, in which women peace monitors use SMS alerts to report local tensions directly to national and regional hubs, ensuring that grassroots insights inform rapid decision-making.
Astrid also highlighted the African Women’s Situation Rooms, which blend hotline reporting from women and youth observers with the rapid deployment of women mediators to help de-escalate election-related tensions. She further pointed to the PeaceFem App, which offers case studies and strategies that support women’s inclusion in peace processes. These examples grounded the discussion in practical realities, showing that when technology is designed with and for women peacebuilders, it can strengthen participation, protection, and early action in meaningful and accessible ways.
Across contexts, participants described how digital environments are increasingly shaping the peacebuilding landscape, not always for the better. Many spoke about online harassment, stalking, gendered disinformation campaigns, and the targeted misuse of personal or community data. These forms of digital violence affect both men and women, but participants stressed that women and young peacebuilders often bear the brunt of them, which directly shapes whether they feel safe speaking up, organising, or taking public roles.
Others described growing mistrust in digital platforms, especially where data practices are opaque or where digital surveillance has become a tool of political control. AI-enabled profiling, behavioural monitoring, and automated systems operating without transparency were discussed not as hypothetical risks, but as everyday realities influencing how people move, communicate, and participate in civic life. In some places, entire populations are managed by digital systems they did not choose, cannot question, and do not trust.
Misinformation surfaced as one of the most significant threats. Participants shared examples of manipulated narratives designed to provoke fear, distort public perception, or influence electoral choices, often pushed by political actors or groups seeking to shape conflict dynamics. In regions with unreliable connectivity or low digital literacy, these narratives can circulate quickly and become accepted truth long before they are challenged.
Youth perspectives added another vital dimension. Young people are some of the most active users of digital platforms in conflict-affected regions, often leading online mobilisation and peace messaging. Yet they also face heightened exposure to misinformation, harassment, and social pressure. Several participants noted a worrying gap: while digital communication tools are widespread, tools specifically designed to protect peacebuilders from intimidation, targeting, or digital manipulation are almost nonexistent.
The conversation also underscored how digital divides across gender, age, geography, education, and connectivity limit who can participate meaningfully in the digital sphere. In communities with limited internet access or low digital literacy, people are far more vulnerable to online manipulation and exclusion.
Participants repeatedly emphasised that education is the real starting point for PeaceTech: understanding how information spreads, how AI-generated content works, how to recognise manipulation, and how to stay safe online. Without these foundations, even well-intentioned technologies risk worsening inequality.
Questions of data sovereignty also resonated strongly. Participants highlighted that in fragile, occupied, or highly securitised environments, communities often have no meaningful control over their data or its use. While there is growing interest in community-owned digital models, including emerging decentralised approaches, participants noted that such systems remain largely aspirational in the settings that would benefit most from them.
What tied the entire conversation together was a shared insistence that PeaceTech must be fundamentally human-centred. Nobody in the room was asking for more technology for its own sake. They were calling for safer digital environments, accountable systems, and tools that respond to the realities they navigate daily.
Participants were clear that innovation means little without protection, that digital progress must be matched by digital rights, and that inclusion cannot be an afterthought.
For Austria’s growing engagement in PeaceTech, these insights are invaluable. They reinforce that meaningful PeaceTech begins with people, their safety, their agency, and their contexts. They show why tools must be designed for environments where connectivity is limited and trust is fragile. They highlight the importance of addressing gendered digital harms directly rather than peripherally. And they underscore the need to move beyond abstract concepts toward practical tools that genuinely support peacebuilders in the field.
Most importantly, the session reminded us that PeaceTech is not about technology itself. It is about power, protection, dignity, and participation.
The conversations in the room made it clear that if PeaceTech is to be credible, it must begin with digital literacy, accessible tools, and real safeguards for those who face the greatest risks online. It must be grounded in the lived realities of women and young peacebuilders, designed for low-connectivity environments, and built on transparent, accountable data practices.
These are not theoretical ideals, but practical lessons from practitioners working on the frontlines of peace. If we build PeaceTech with these principles at the centre, and if gender equality remains our red line, digital tools can genuinely strengthen peace rather than undermine it, giving people not only a voice but also a safer, more resilient space in which to use it.
If you would like to learn more about the organisation driving this work, the Global Network of Women Peacebuilders (GNWP) is a coalition of more than 100 women's rights organisations across over 50 countries affected by conflict and humanitarian crises. Their members are women- and youth-led groups who confront some of the most difficult challenges in their communities while pushing for lasting, inclusive peace.
GNWP’s mission is simple and powerful: equality for women and peace for all. Their programmes strengthen local leadership, support women peacebuilders, and advance the global Women, Peace and Security agenda. You can explore their work and get involved at https://gnwp.org/.
Another workshop on this important topic will be held as part of the Austrian Forum for Peace 2026 (29 June – 2 July) under the title "Technology for Peace: Empowering Women and Youth at the Grassroots." The workshop will explore how PeaceTech can strengthen the leadership of women and youth in peacebuilding, while examining how technological innovation can support inclusive, locally grounded peace processes and connect community needs with policy and technology development.
Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research.
Economist Intelligence Unit. (2021). Measuring the Prevalence of Online Violence Against Women.
UN Women & UNDP. (2023). Gender Equality as a Critical Driver of Peace.
UN Women. (2020). Online and ICT-Facilitated Violence Against Women and Girls: A Brief.
World Economic Forum. (2023). Global Gender Gap Report.
ACM CSCW. (2020–2023). Conference papers on gender bias in automated content moderation.