The United Kingdom has called for an urgent United Nations investigation into what it describes as the weaponisation of sexual violence by Hamas during the 7 October attacks. Foreign Secretary David Lammy condemned the acts as a brutal violation of international law and human dignity, vowing to press for accountability at the highest levels. This demand comes as survivor testimonies and forensic evidence accumulate, painting a harrowing picture of systematic rape and mutilation used as tools of terror.
The UK’s stance echoes growing international outrage but also raises complex questions about evidence gathering in conflict zones and the politicisation of such inquiries. For technologists like me, this is a stark reminder that our tools, whether satellite imagery or encrypted communications, can both expose and obscure the truth. The digital battlefield now includes the documentation of atrocities, and we must ensure that blockchain verification and AI-driven analytics serve justice, not propaganda.
The UN’s response will test the global community’s commitment to protecting civilians, especially women, in asymmetric warfare. But as we catalogue these horrors, we must also confront the uncomfortable reality: technology can amplify trauma as much as it can heal. The algorithm that identifies patterns of abuse is the same one that could be gamed by bad actors.
The UK’s call is a necessary step, but the path to accountability is littered with technical and ethical landmines.