The ninth-century polymath Muhammad ibn Musa al-Khwarizmi began his most famous book: “When I considered what people generally want in calculating, I found that it is always a number.” A millennium later, people want a lot of calculating, and luckily al-Khwarizmi lent his name to the simple mathematical concept that helps us to meet that need: the algorithm.
An algorithm is simply a set of step-by-step instructions for carrying out a task. For example, my recipe for Plumpy'Nut Surprise is the algorithm I follow when I have Jan Egeland over for dinner. (Jan, if you’re reading this, I hope you’re feeling better.) Yet algorithms are more powerful than they first appear. Google's internet search? Based on algorithms. Amazon's book recommendations? Based on algorithms. Facebook's news feed? Go on, guess.
It's not just our online lives that are heavily influenced by algorithms: they have huge and increasing reach into the stock market, law enforcement, immigration controls, even our sex lives. It may be sacrilegious to suggest this, but Beyoncé is wrong: it's not girls who run the world, it’s algorithms. This worries many people, and it should worry you, because these algorithms are not just powerful – they’re also opaque.
There's increasing evidence that relying on algorithms to make decisions is problematic due to their inherent bias. Nobody knows how any of the algorithms listed above actually work, except for a handful of people in the organisations that develop them. Sometimes not even they know; machine learning is increasingly leading to those algorithms evolving in ways that are not easy to predict. Why is this happening, and what does it mean for the aid industry?
In 2011 investor Marc Andreessen wrote an article entitled “Why Software Is Eating The World”, in which he made a widely accepted case that “software companies are poised to take over large swathes of the economy”. The aid industry isn’t immune; aid organisations follow trends in sectors such as health and education, which are already being eaten by software, but the biggest and earliest impact is likely to be in cash transfers.
Even when physical cash is handed out, such transfers are made possible by information technology; and in future most distributions will not be physical cash but via mobile phone or bank card. This is part of a wider revolution in fintech, or financial technology, and the data that cash transfers produce will be of great interest, since data is the new oil. Syrian refugees are already becoming a market segment for financial services providers.
So cash distributions are essentially digital distributions, and being eaten by software really means being eaten by software companies. If humanitarian organisations begin to be replaced by software organisations, then these challenges won't be addressed by aid professionals, but by computer scientists. This is the heart of the problem; in the words of Pedro Domingos in his recent book The Master Algorithm: “in computer science, a problem isn’t really solved until it’s solved efficiently”.
In this scenario, algorithmic humanitarianism will be based on software deciding who receives what assistance and evaluating whether they’re using that assistance appropriately – all in the interests of efficiency. Despite what the donors might want you to believe, however, “efficiency” is not a core humanitarian principle.
The algorithmic future also goes hand in hand with a tidal wave of automation – self-driving cars, for example, rely on algorithms – which is predicted to destroy jobs on an unprecedented scale. In a futile attempt to make our robot overlords less terrifying, some suggest that automation is least likely to affect jobs that are based on human relations, such as care work with old or sick people. It would be easy to think that humanitarian work falls into the same category, but as aid has responded to a changing environment, it has become less about relations, and more about transactions.
Ideally the humanitarian economy is relational, based on solidarity. This is why MSF tries to hang around so long after everyone else has left; not just to provide medical care, but also to bear witness and speak out. Solidarity can only be built on a foundation of human relations, but automation threatens to undermine that foundation by accelerating the transition of the humanitarian economy from a relational model to a transactional model.
A transactional model isn't built on solidarity: it's built on contracts. Solidarity means trust; contracts indicate a lack of trust. The transactional model therefore has to be built on technical standards and key performance indicators and logical frameworks – all of which are desirable, but none of which are sufficient to satisfy the humanitarian imperative, which risks being swept away.
Algorithms plus automation will dematerialise humanitarian assistance. For example, aid organisations are starting to grapple with the implication that some humanitarian programming – such as food distributions – will collapse as cash distribution becomes more widespread. This isn’t necessarily a bad thing, given the problems that food distribution can cause, but what takes its place is potentially a heady cocktail of platform capitalism, surveillance capitalism, and disaster capitalism.
Karl Marx saw this coming, with the Communist Manifesto giving us a clear view of the end-state of bourgeois capitalism: “All that is solid melts into air, all that is holy is profaned, and man is at last compelled to face with sober senses his real conditions of life, and his relations with his kind.” The kids love Marx not just because of his hipster beard but also because of his apocalyptic tone: our organisations will melt into air, our principles will be profaned, and we are being compelled to face the true nature of our relations with the people we claim to be helping.
Algorithmic humanitarianism need not be apocalyptic for the humanitarian sector – but only if we invest in ensuring that our algorithms reflect our values. That means that rather than being overtaken by software companies, we may need to become software companies – otherwise our lack of computer literacy means that the coding will be left to the hyperactive imagination of the hackathon.
The end result could be much worse than just overhyped, underperforming, and outright bullshit mobile apps; it could be a hollow humanitarianism, its essential humanity discounted by machines of loving grace.