Just before winning an award at the Mobile World Congress for its innovative use of big data, the World Food Programme (WFP) announced a controversial partnership with Palantir Technologies, Inc. The $45 million pro bono agreement is intended to harness the power of big data to help the WFP deliver food aid more efficiently to some 92 million people worldwide.
Unsurprisingly, the agreement has raised a number of concerns among proponents of responsible data practices, including JustPeace Labs (JPL), not least the lack of transparency about what exactly the deal entails and Palantir's ethically dubious reputation. In response, JPL spearheaded a recent effort to raise these concerns with the WFP, activists and the media by bringing together like-minded civil society organizations in the Responsible Data community to draft an open letter to the WFP.
The letter marked a pivotal moment for the Responsible Data community: it is the first time this group has come together to speak truth to power, with members sharing a wealth of expertise and differing, sometimes conflicting, perspectives in an honest and respectful discussion. It also exposed three pressing problems (not addressed by the letter) that should inform advocacy efforts in this area.
Unregulated Public/Private Partnerships
The WFP/Palantir partnership highlights the risks associated with large-scale public/private technology partnerships. The problem, in other words, is not just limited to Palantir, or to the WFP, or to the humanitarian assistance sector. It is a larger-scale problem, one that demands concerted action.
Many public agencies, like the United Nations, lack the resources and expertise to make use of the massive amounts of data they have collected in pursuit of their mandates. This data can do tremendous good, if used responsibly and ethically. Private companies, on the other hand, often have the expertise, technological tools and infrastructure needed to mine and analyze such data sets at scale. However, they currently lack any robust regulation, whether in the form of laws or industry standards.
Partnerships that combine unregulated industries with UN agencies, which are immune from national, regional and international laws, carry a high risk of little or no accountability and of unethical business practices. UN agencies have internal guidelines on cooperation with the business sector (including for big data projects), which call for due diligence, transparency and accountability, the very things highlighted as lacking in the WFP/Palantir agreement. However, the only such guidelines governing accountability for corporations are the (voluntary) UN Guiding Principles on Business and Human Rights.
The Guiding Principles, while a valuable resource and overall guide for ethical business practices, do not adequately address many of the pressing concerns raised by new technologies, such as ethics, conflict sensitivity, changing power structures and new forms of oppression. Moreover, compliance is based on self-reporting, and the principles are not legally enforceable. While the business and human rights framework has considerable strengths and potential, to date it has not fully captured the risks associated with partnerships like the one between Palantir and the WFP.
This leaves a serious gap in regulation and enforcement.
Conflict Sensitivity in Humanitarian Contexts
The WFP operates in acute and complex humanitarian emergencies. By its own admission, a significant majority of the people it serves are or have been affected by violent conflict. In these fast-paced and volatile contexts, the risks associated with irresponsible data practices are even more acute. Issues like bias, the mosaic effect and privacy violations can not only harm individuals in these situations, but can also spark or foment community-wide violence and war.
Most current standards for ethical or responsible data do not discuss conflict sensitivity, let alone require conflict sensitivity analysis. It is critical that any evaluation of the potential negative unintended consequences of technology projects and partnerships, especially those of the scale and context of the WFP/Palantir project, include an analysis of conflict connectors and dividers and of the potential to spark community-wide violence.
A Siloed Civil Society
The process of coming together to take collective action and express our common concerns about the agreement between the WFP and Palantir was humbling and inspiring. It was also enlightening. We civil society stakeholders must address problems holistically and propose solutions that embrace systemic change, rather than reactive, ad hoc ameliorative measures.
The diversity of expertise in the Responsible Data community is extraordinary. However, in this community and in others we engage with, there is often a semantic disconnect that makes it hard to agree on a theory of change. Is this an AI issue? A privacy issue? A humanitarian issue? Machine learning? Data? Ethics? Human rights? Private software industry? UN? Many of us come from relatively siloed and specific practice areas, and finding a common way forward to tackle interconnected challenges is difficult.
The WFP/Palantir collaboration showed us that working together in this way presents a tremendous opportunity to amplify our voices and augment our relative bargaining power. It also demonstrated how much more powerful civil society is when we overcome our organizational differences and work together on responsible technology issues in a cohesive and strategic way. We’re excited to continue pushing forward in this new space.