Emerging technologies have changed the nature of socializing and communicating. Connecting billions of users worldwide, social media platforms and communications apps have tremendous potential to widely publicize and amplify messages, both positive and unifying on the one hand and incendiary and divisive on the other. This phenomenon has contributed to radically unexpected and controversial election outcomes and, in high-risk settings, to eruptions of violence and armed conflict.
And we’re now starting to see the fallout, with almost-daily headlines announcing new corporate scandals, employee discontent and plummeting stock prices. As civil society and the tech industry struggle with how best to ensure that the tremendous power and reach of these platforms are used in ways that respect basic human rights and do not foment hate and violence, there has been a rush inside many companies to create and hire for new staff positions geared towards addressing these challenges.
Facebook, having been the subject of the most significant public scrutiny and the loudest media critiques, has spent the last few months creating, announcing and recruiting for new positions that seek to improve the management of its core products, Facebook and WhatsApp. The company has recently announced the following positions (among others): Policy Associate Manager, Borderline Content & At-Risk Countries; Public Policy Associate Manager, Content (Elections); Policy Campaigns and Programs Manager; Public Policy Manager, External Oversight Board Liaison; Public Engagement Director, Global Affairs; and Content Policy Manager to Product Policy Director, Human Rights. This follows a previous round of hiring of privacy and policy experts last fall.
And Facebook’s not alone. Last fall Twitter recruited for a Director of Human Rights position, to be added to its legal department. Nokia recently hired a Head of Human Rights, housed in its marketing department. Others are also adding policy or social impact professionals to their rosters.
The first spate of hiring reflected efforts to take privacy obligations more seriously. Just this week, Kevin Bankston announced he would be leaving his role as Director of the Open Technology Institute to direct Privacy Policy at Facebook. This newest announcement comes on the heels of Facebook’s recent hiring of some of its harshest privacy critics in an effort to “bring in new perspectives” and “build better approaches to privacy in the future.” These efforts have been rewarded with public acknowledgements from academia, media and civil society alike. Such developments certainly present an opportunity to advance meaningful social change.
However, it is difficult to view the rush to hire alone as a significant win for the causes the companies are attempting to serve. The speed with which companies have shifted from denying there is a problem at all to seeking in-house expertise to address the negative consequences of their technology belies the complexity of the problems associated with operating in high-risk settings.
It also remains to be seen whether these impressive new hires will be given free rein to truly disrupt the status quo and usher in a new era of ethical business practices that respect users’ privacy and the human rights of all.
Academia and civil society—often independent counterweights to private industry—have yet to reach consensus on many important questions. What are the best ameliorative measures to keep social media and communications platforms and apps from contributing to armed conflict in the future? How do we apportion responsibility and causality for the consequences of harmful online content in this new connected era? How is social media impacting conflict in different contexts? If independent experts are still debating these questions, how can companies “solve” these large and looming problems (often thousands of miles away in conflict-sensitive countries) by simply rushing to create and fill new positions within the Silicon Valley echo chamber?
Given all the questions still unanswered, companies cannot rest on their laurels after creating in-house rights expertise. They must continue to support open and transparent dialogue with responsible tech organizations to learn from their research and expertise. The creation of Corporate Social Responsibility and Human Rights positions in the textile, food and extractives industries in the 1990s and early 2000s followed years (perhaps in some cases decades) of open consultative processes with public sector stakeholders. An independent external body composed of stakeholders from an array of industries and fields is needed here. Only through such collaboration can we hope to solve the now seemingly intractable problem of social media and communications apps being used to propagate dangerous speech and ignite violence.