Massive amounts of personal and biometric data are being gathered from hundreds of thousands of Rohingya refugees in Bangladesh. This should set off multiple alarm bells.
As bystanders to likely crimes against humanity against the Rohingya, the humanitarian community has a particular responsibility to ensure their rights are not violated further, through data and technology. Now is the time to push for safeguards, before it’s too late.
Gathering data on marginalised groups can be a risky business, and the Rohingya are no strangers to having information about them used to further diminish their human rights. What is being proposed in Bangladesh raises broad concerns about the responsible and ethical use of data and is potentially dangerous.
The Bangladeshi government registration process includes scanning in “biometric” data – at this stage, fingerprints, with the UN providing “technical assistance”. At the same time, the UN’s refugee agency, UNHCR, announced this week that it is carrying out a separate counting exercise, including taking photographs.
Refugees may reasonably believe their access to aid and protection depends on one or both registrations, so the power asymmetry is stark between those designing and carrying out the data collection and those on the receiving end of it.
The responsible data considerations are numerous and complex.
What data should be collected, by whom? Who has access to it? In case of machine or human error, what processes are in place to review and make changes? What could be the unintended consequences of these growing databases? How could the data be abused?
All of these questions, and more, need to be thought through and the conclusions intentionally planned into any kind of data collection about the Rohingya, before more harm is done.
Registering the Rohingya
According to local media, Bangladeshi firm Tiger IT has provided the government with a software system to register Rohingya. It will record the individual’s fingerprints, alongside name, gender, age, photograph, parents’ names, birthplace, nationality, country, and religion – all of which will be linked to an ID card. The card does not use the term “Rohingya”, and some are refusing to be registered because of this omission.
The experience in Bangladesh echoes that in Myanmar. The 2014 census in Myanmar listed 135 ethnic groups but deliberately omitted any option for “Rohingya”. This led many to refuse the national ID cards that followed, which used the loaded substitute term “Bengali”. Rohingya worried that this was just another attempt to erase them as a community. In a mirror move, a Bangladeshi census in 2016 labelled Rohingya as “Myanmar nationals” – a status Myanmar itself does not recognise.
This month, UNHCR announced it is engaged in a separate counting exercise: once refugees’ information, including photographs, has been gathered digitally with an unnamed “app”, a laminated yellow card with a unique number is assigned by the Bangladeshi government’s Refugee Relief and Repatriation Commission. It’s unclear how, if at all, this exercise links up with the other registration, led by the Department of Immigration and Passports.
Producing separate datasets (and possibly providing more than one official identification card) is not an efficient use of resources and might well lead to complications for the refugees on the receiving end of the questions and the cards. More worrying still is how the data itself could be misused.
Firstly, the data can be used to drive repatriation (voluntary or otherwise). Bangladeshi Industry Minister Amir Hossain Amu has openly stated that the country has “no plan to give any refugee status to Rohingya”, adding: “the reason behind the biometric process is to keep record of Rohingya. We want them to go back to their own place.”
Secondly, it can digitally enable discrimination. Rohingya have to follow a “code of conduct” that forces them to stay inside the camps and limits their interaction with locals. If the database of Rohingya people were to be leaked, hacked, or shared (for example, with the Myanmar government), it could make it easier to deny Rohingya access to basic services, target them, or discriminate against them. For example, Bangladeshi mobile phone operators have been banned from selling SIM cards to Rohingya refugees; biometric data could in theory be shared with those operators to enforce the ban.
Thirdly, errors and omissions become harder to resolve. Unlike passwords, fingerprints can’t be changed. Once collected, it may be virtually impossible to delete or correct them. Biometric devices are not 100 percent accurate – and it’s unclear what recourse exists if mistakes are made.
So how could data collection on the Rohingya help their cause?
There need to be accountability processes in place in case of error, and responsible data practices must clearly be followed. According to data minimisation principles, only data that is necessary should be collected, and access to stored personal data should be strictly limited – many decisions around aid can be made with aggregate numbers, rather than personal data on individuals.
Choices around what data is gathered should be made with a long-term strategy in mind. The collection, storage, and use of that data should then be carefully planned in line with this strategy.
The Rohingya have long been persecuted, even before the horrors they face today in Myanmar.
We have a responsibility to ensure their rights are respected and protected – and that the data purportedly gathered as part of the humanitarian response is used for that purpose and not for further persecution.