Blog by Tatiana-Maria Cernicova-Dragomir


1. Digital borders and why they matter, through a liberal feminist lens

In today’s world, when a refugee reaches a border, the first gatekeeper they encounter may not be a uniformed guard but an algorithm. Their iris is scanned in a biometric kiosk, a risk-scoring programme predicts their credibility, and a blockchain wallet stores what may become their only proof of legal existence. These digital checkpoints promise speed and efficiency. However, they are also profoundly political spaces, where power, privilege and patriarchal biases are coded into software. A liberal feminist analysis, one that foregrounds equal rights, institutional reform and women’s agency (Okin, 1989), helps one see both the potential and the hidden dangers of this technology-driven turn in migration management and governance.

Within this framework, the analysis draws on core liberal feminist commitments to equality and agency. Equality is understood here as the expectation that women and gender-diverse persons should have the same access to rights, services and opportunities as men, without being held back by structural disadvantages coded into legal or technological infrastructures. Agency refers to the capacity of individuals to make meaningful, informed choices about how they interact with institutions and technologies that govern their lives.

To translate these commitments into a practical method for assessing digital borders, the blog relies on three interconnected dimensions. The first is accessibility, which concerns who can actually use a given digital system, whether they have the necessary devices, connectivity and literacy, and whether the surrounding social context enables or inhibits their participation. The second is transparency and accountability, focusing on who writes the code, who owns the data, whose interests are embedded in algorithmic benchmarks, and who is able to question or challenge technological decisions. The third is institutional design, which asks whether the legal and policy frameworks surrounding these technologies incorporate enforceable gender-equality guarantees or instead reproduce gender-blind assumptions. Together, these dimensions form the evaluative scaffold that shapes the discussion throughout the remainder of the blog.

A “digital border” is shorthand for an expanding ecosystem of digital tools that extend migration control beyond the physical frontier. These tools include digital identity platforms (ID2020, the EU’s E-ID); AI-enhanced surveillance, in the form of facial recognition cameras at checkpoints (such as the CBP Biometric Entry-Exit Program) and predictive analysis tools that flag “high risk” asylum claims (such as the Netherlands’ IND Risk Profiling System); automated decision-making systems that pre-screen visa or asylum applications (ETIAS, ATS); and mobile and web applications that deliver legal information and advice, cash assistance or remote education inside refugee camps (Refugee.Info, KITUO Cha Sheria, JusticeBot, RedRoseKolibri, EduApp4Syria). The list is far from exhaustive, but it points to the broad reach of digital tools in virtually every area concerning the identification, assistance and management of migrants.

Because these systems operate in the cloud, they mediate refugees’ lives before, during, and long after any literal border crossing. Like the three Sisters of Fate in Greek mythology – Clotho, Lachesis and Atropos – these digital tools increasingly shape the arc of a migrant’s journey through time and space. Digital ID platforms spin the thread of a refugee’s legal identity, often from fragments of past records or biometric scans, much like Clotho, who begins each life by spinning its thread. AI-driven risk assessments and automated decisions function like Lachesis, measuring and allocating opportunities, such as the right to remain, to work or to access aid in real time, based on pre-determined metrics. Finally, systems that flag claimants for removal echo the third sister of Fate, Atropos, who severs the thread of life without appeal, closing off futures through opaque algorithms that often cannot be challenged. Together, these technologies do not merely process data. They pattern migrants’ lives across time, determining whose stories will continue and whose will end. The mythological framing is not just poetic. It highlights that digital border infrastructures are not passive tools, but active agents in shaping human trajectories.

To bring the discussion back into an academic setting, scholars have begun to speak of a functionally integrated digital border architecture (Broeders, 2007), where a border is no longer defined by geographic lines, but rather by networks of data-driven technologies that accompany, sort, and often constrain migrants.

Through a liberal feminist lens, one can ask an apparently simple question: do new institutions enhance women’s equal freedom? Applying this question to migration-related technology is productive in three ways, because it allows us to address accessibility, transparency and accountability, and institutional design. In terms of accessibility, the follow-up question becomes: who has the devices, connectivity and literacy skills to use digital IDs or safety applications? Regarding transparency and accountability, we can probe who writes the code, owns the data and sets the benchmarks that decide outcomes with such an impact on people’s lives. Institutional design raises the question of whether refugee law and data-protection laws (standalone or combined) include enforceable gender-equality guarantees.

This blog recognises the need for these technologies and does not reject them outright. One can acknowledge the potential of technology, so long as the rules are written, or rewritten, in a way that secures equal freedom. First, the blog highlights how digital identity carries both benefits and risks for displaced women. Second, it moves beyond digital identity to expose the myth of neutrality in AI surveillance and algorithmic decision-making, which reflect a male-centred paradigm. Turning to solutions that promote gender equality in technology for migrant women, the blog identifies three avenues: empowering women’s voices; legal and institutional reform; and coding for success. These reforms can transform digital borders from a barrier into a facilitator of protection for migrant women.

This blog proceeds in five steps. Section 1 introduces the concept of digital borders and the liberal feminist principles used to evaluate them. Section 2 examines digital identity as both an enabler of recognition and a potential source of exclusion. Section 3 analyses biases embedded in AI surveillance and algorithmic decision-making. Section 4 highlights technologies that strengthen women’s agency from the ground up. Section 5 calls for legal and institutional reforms consistent with liberal feminist goals, before a concluding reflection on how technology can be steered towards equal freedom.

2. Digital Identity as a path to both recognition and exclusion

Digital identity schemes such as the UN-backed ID2020 alliance offer refugees a portable credential that can unlock banking, schooling or a SIM card. For displaced women (who often lack birth certificates because of gender-biased civil registration practices) this can be transformative. However, there are pitfalls concerning device and connectivity gaps, consent and coercion, and function creep (Koops, 2021). Studies show that women in refugee camps are less likely than men to have a mobile phone and mobile internet access. Moreover, a biometric ID obtained under duress, at a border crossing or food distribution point for instance, fails any meaningful liberal standard of choice. Data collected by a humanitarian agency may later be shared with state security agencies (expanding its use beyond the original purpose), exposing people affected by gender-based violence to renewed harm. For women, purpose-limitation clauses, opt-out pathways and gender impact audits before rolling out digital IDs become even more important.

3. AI Surveillance and Algorithmic Decision-Making

Automated asylum screening systems are presented as neutral and consistent, but evidence suggests otherwise. The UK’s (now discontinued) “visa-streaming” algorithm was found to carry nationality-based biases. In the Canadian context, proxies such as travel history, experience of domestic violence and membership of the LGBTQ+ community risk tainting automated decision-making for refugees belonging to certain groups. From a liberal feminist angle, three actions are essential to ensure safety and equality for migrant women: explainability by design, giving refugees clear reasons for a negative algorithmic finding in a language they understand; representative training data, so that women’s trajectories are included in the datasets; and independent review bodies with gender expertise, to audit systems for disparate impact. Virginia Eubanks (2018) warns that opaque scoring tools punish the poor simply for existing: they are flagged as “high risk” on the basis of broad data patterns, penalised for conditions beyond their control, and poverty itself becomes a predictive marker of instability, potential fraud and undeservingness. The same logic punishes migrant women who diverge from the archetype of a “rational economic migrant”: they may not fit the narrow indicators of credibility or economic utility, shaped around a male-centric model. Despite these challenges, there are techniques that can deliver greater gender equality.

4. Empowering technologies and voices from the ground

Various initiatives have shown that technology is not just a top-down instrument of control but can be leveraged as an emancipatory tool for refugee women. Several such tools have been designed and promoted (e.g. BANQU, the British Red Cross initiative for digital empowerment of refugee women). These examples pass the liberal feminist test of agency by putting information and resources directly in the hands of migrant women. Yet even here, data-protection safeguards remain thin, and support rarely covers non-binary or LGBTQI+ persons.

In practice, women have reported that digital skills gave them a sense of direction. However, such skills are not a catch-all solution, since “unstable, poorly localized, or inaccessible information continues to limit participants’ ability to access services and engaging with host-country systems” (Berg, 2025). These testimonies remind us that digital inclusion demands infrastructure, literacy and safe social norms, not merely software and hardware.

5. Revisiting legal and institutional frameworks: liberal feminist goals in action

For historical reasons, existing refugee law instruments are mostly silent on technology. Data protection frameworks exist, but they do not cover the entire array of issues raised by the use of technology in relation to migrant women. Data-protection asymmetry becomes most visible when refugees enjoy weaker privacy rights than citizens under national security exemptions. Due process deficits may be exacerbated by automated decision-making in areas such as risk assessment and visa issuing. Moreover, the gender-blind drafting of norms and regulations results in few (if any) instruments that address the intersectional burdens affecting women and LGBTQ+ persons in digital screening and assessment processes.

These frameworks should be revisited through amendments that provide for algorithmic accountability, equal-connectivity initiatives and gender impact assessments.

To ensure equality and protect rights in the context of new technologies, a liberal feminist approach focuses on five key goals: digital literacy, equal agency, fair processes, privacy and safety, and intersectional data.

Achieving these goals calls for concrete action: co-designing technology and funding gender-balanced tech hubs for migrant women; mandating explainability by design and a right (and obligation) to human review for all AI asylum tools; limiting biometric use and adopting differential-privacy norms for humanitarian datasets; and collecting sex-disaggregated, non-binary-inclusive metrics to monitor the impact of technology.

These reforms require collaboration across all sectors. Governments, NGOs, UN agencies, tech developers, and refugee-led organisations must all be part of the effort, as this is not just a matter of technical design, but one of fairness, justice, accountability, and the right to be seen and heard in a system that is slowly turning people into little cogs in a machine.

Conclusion: Coding equality

In the whirlwind of scholarly debates, technological innovation will not pause and wait. For migrant women, even beyond borders, socio-technical ecosystems shape resettlement, public life, home and community (Dahya et al., 2023). This makes it imperative to steer technology comprehensively and in real time. A liberal feminist lens rejects a stark division between optimism and doom in the face of technology and the changes it brings. Instead, it addresses the core of the issue: how to design code, law and institutions to enlarge freedom equally for all, both at the border and beyond. Returning to the earlier metaphor, if Clotho, Lachesis and Atropos once shaped human destinies, today digital systems play a comparable role in spinning, measuring, and sometimes cutting the threads of migrant lives. A liberal feminist approach insists that these technologies be redesigned so that they extend, rather than limit, women’s equal freedom.

Practically, this implies more than embedding privacy by design. It means robust procedural guarantees, such as ensuring that algorithmic audits become as routine as safety checks before boarding a plane. It means ensuring sound and balanced data. And beyond what law alone can impose, it means funding digital literacy programmes with the same urgency as providing for basic needs. Achieving these goals is both a social and a legal imperative, grounded in the equal rights tradition championed by liberal feminism.

When a person reaches a digital border, their biometric scan should open the door, not generate suspicion scores. Beyond that point, IT systems should support, rather than hinder, normal life. Since the technology is already here, it is time for a software update that builds in justice, not just hard law. The three Sisters of Fate in Greek mythology – Clotho, Lachesis and Atropos – can then shape the arc of a migrant’s journey through time and space to enhance, rather than deny, protection through digital means.

Tatiana-Maria Cernicova-Dragomir is a Refugee Law Initiative Visiting Fellow, PhD candidate West University of Timisoara (Romania) and Udine University (Italy)



The views expressed in this article belong to the author/s and do not necessarily reflect those of the Refugee Law Initiative. We welcome comments and contributions to this blog – please comment below and see here for contribution guidelines.