Refugees can pay for groceries with the blink of an eye in Jordan, interactive maps track thousands of people on the move in Syria, a plastic ration card can hold a whole Rohingya family’s details. More digital innovations for humanitarian work are on the horizon, too. But at what risk?
Plenty, according to a panel on handling sensitive data in humanitarian contexts – the first time the topic had been discussed at the Humanitarian Congress Berlin. As aid delivery and end-user registration are increasingly digitised, sensitive data on millions of the world’s most vulnerable people could be hacked, sold, or shared with abusive governments. The humanitarian sector must start asking itself how, as one panellist put it, to “do no digital harm”.
IRIN senior editor Ben Parker moderated the four-member panel that brought together representatives from the humanitarian and private sectors. Nora Dettmer, a member of the event’s steering committee, noted that “though data collection and analysis offer exciting opportunities… it also comes with serious risk.”
Yet digitalisation and data don’t just spell risk, as the panellists pointed out; they also mean that recipients of humanitarian aid have more tools to find information, make decisions, and stay in touch with loved ones. “It's no longer only food, shelter, or water when they reach a humanitarian organization,” ICRC deputy data protection officer and panellist Maria-Elena Ciccolini said. “They will ask for connectivity.”
Karl Steinacker said that the UN refugee agency (UNHCR), where he heads the Global Service Centre, is heavily invested in the use of biometrics (for example, fingerprints and iris scans) for identifying refugees. Steinacker said his agency tries to limit risk with strategies like minimising the amount of data recorded. With less information, held for shorter periods of time, any data breach would cause less damage.
It’s right to consider new technologies, noted Zara Rahman, a researcher at the consultancy The Engine Room, but innovations such as blockchain and biometrics should not be brought in just for their novelty value or promises of future efficiency. Humanitarian workers are used to making fast decisions, but “being slow, and thoughtful, and intentional is a key part of doing it right,” she said.
Paul Currion, COO of Disberse, suggested that a digital “apocalypse” in society has already begun in terms of privacy and personal control over data. He cautioned that the humanitarian sector may only wake up to the risks if there is a catastrophic data breach, putting vulnerable people at even more risk.
Those risks may sometimes lurk in plain sight, as a few people at the discussion – including a panellist – found out. Parker had set up a fake WiFi network, a tempting alternative to the often slow internet access at the event. About 12 people had logged on to it by the time the panel started – a basic network security misstep. Why? “I made it and it's not secure and it's not what the organisers gave you,” Parker chided.
Highlights of the conversation, edited for clarity and length, are excerpted below.
Video of the full conversation is online here.
Maria-Elena Ciccolini, Deputy Data Protection Officer for Europe and Central Asia, International Committee of the Red Cross
Paul Currion, Chief Operating Officer, Disberse
Zara Rahman, Research and Engagement Team Lead, The Engine Room
Karl Steinacker, Head, Global Service Centre, Copenhagen, UNHCR
On privacy and consent to use aid recipients’ data
Maria-Elena Ciccolini: “You can't just analyze or reprocess the data for a purpose that is not reasonably expected by the data subject… You do not have a blanket authorization just because people consented to provide you the data for a specific purpose.”
Zara Rahman: “Would we be okay with this kind of disclosure or this kind of data collection if it were for ourselves?”
Karl Steinacker: “It is true that we have not designed our systems [with] ‘privacy by design’ or ‘privacy by default’… we have to do a retrofit. The High Commissioner does indeed want refugees to have agency over their data. But in order to have agency over your data, in order to manage your data, that data has to be credible.”
Ciccolini: “Consent is the preferred legal basis to collect and process data, but we realize it cannot be used in many instances, and we would prefer, for example, public interest or vital interest, which gives us the same duty of informing the people from whom we collect data.”
On power imbalances in the humanitarian sector
Paul Currion: “By giving them [affected people] connectivity, you give them access to information, potentially you give them more power, and this is something that the humanitarian community has obviously struggled with greatly over the years.”
Rahman: “The humanitarian space is probably home to what must be the biggest power asymmetry between the people who are gathering the data versus the people from whom the data is being gathered… I think the way in which we see the power asymmetry playing out is in ownership of the data.”
Currion: “This is not about data; this is about power. And we should always remember that. Don't get distracted by the technology. Don't get distracted by the hype.”
Ciccolini: “We deal with people that sometimes have a very low level of education or data literacy. How can we pass all these messages about new technology or even more basic messages? And from a data protection point of view, it is, how can we say that consent is informed and valid?”
On the risks of collecting and storing data
Currion: “We need to test. We need to experiment. Now, it's not good to experiment on vulnerable communities. It's good to experiment on ourselves as organisations and humanitarians first.”
Rahman: “Biometric data is one that I worry specifically about, just because it's immutable, it's permanent. It can't be changed by the individual. It really takes away a lot of the kind of potential agency that people could have.”
Steinacker: “If you deal with hundreds of thousands, with millions of people walking in, how much time [do] you spend on the assessment before you start the action — the humanitarian action, that is?”
Rahman: “I would love for everyone to see data as more of a toxic asset, rather than a thing that they should be collecting more and more of.”
On commercial interests and weighing the benefits of big data
Currion: “Historically, the humanitarian sector is insanely bad at connecting information that it gathers to decisions that it makes. We're not good at evidence-based decision-making. … Should we be collecting this data, should we be doing this analysis? I'd say maybe, but I'd like to see that actually there was some evidence that we were using that to make better decisions.”
Steinacker: UNHCR would be “irrelevant” and “risk the welfare and the protection of refugees if we would not try to go this way and do it right. So the question is not whether or not we do it. The question is whether we do it right.”
Ciccolini: “It's not just about risk for our beneficiaries. It also has an impact on our reputation. So I don't think that just a very nice deal or a very good price would make us let go on our principles.”
Steinacker: “I think it's not only technology which will define the future. It's exactly that relationship between the private sector, the commercial sector and the aid industry – which has been so far, to a certain extent at least, non-commercialised – which will define how this thing is going forward.”
Graphics: Event cartoonist Claudia Meier of GPPi