Ethical technology has never been more critical, or more hotly debated, than it is today. Technology, and specifically information and communication technology (ICT), is under scrutiny for its influence on issues as wide-ranging as privacy, free speech, hate crimes, terrorism, and mental health.
Tech’s potential to make the world a better place is vast. But deployed unethically, it puts our communities, our countries, and even our mental health at risk.
Dividing communities. Facebook, Twitter, and other social media platforms are increasingly used to divide communities and foment conflict. In some contexts, this can have deadly consequences. For example, the UN has concluded that hate speech and rumors posted on Facebook contributed to brutal crimes against the Rohingya, playing a critical role in what some claim may be a genocide. The dangerous proliferation of hate speech on social media has led Germany to pass strict hate-speech removal legislation, and EU regulators have threatened to follow suit if companies do not do more to remove hateful posts.
Reinforcing inequality. Privacy International warns that targeted advertising can unintentionally reinforce existing inequalities. For example, a system that infers that a user is a man of a certain ethnicity or religion will automatically target ads to that account. Those ads often reflect existing prejudices and can exclude certain groups of people from information about jobs, housing, or other benefits.
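To see how that exclusion happens mechanically, here is a toy sketch of audience targeting. The attribute names, the ad criteria, and the users are all hypothetical; this is not based on any real ad platform's API, only on the general logic of filtering an audience by inferred traits.

```python
# Toy illustration of targeted-ad exclusion (not any real ad platform's API).
# An advertiser targets a job ad only at inferred profiles matching certain
# attributes, so everyone else simply never sees the opportunity.

users = [
    {"id": 1, "inferred_gender": "man",   "inferred_age": 28},
    {"id": 2, "inferred_gender": "woman", "inferred_age": 31},
    {"id": 3, "inferred_gender": "man",   "inferred_age": 54},
]

job_ad_targeting = {"inferred_gender": "man", "max_age": 40}  # hypothetical criteria

def eligible_audience(users, targeting):
    """Return only the users whose inferred attributes match the targeting rule."""
    return [
        u for u in users
        if u["inferred_gender"] == targeting["inferred_gender"]
        and u["inferred_age"] <= targeting["max_age"]
    ]

print([u["id"] for u in eligible_audience(users, job_ad_targeting)])  # [1]
# Users 2 and 3 never see the ad at all, which is how targeting quietly
# excludes groups from information about jobs, housing, or other benefits.
```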
Discriminating algorithms. Biased algorithms are another major concern, especially with the rapid proliferation of machine learning and artificial intelligence applications. For example, algorithms that help judges sentence convicted criminals have been shown to be prejudicial. A ProPublica study found that bias in software judges use to assess the likelihood that a defendant will commit another crime produced disproportionately high risk scores for Black defendants compared to white defendants convicted of similar crimes. Because these assessments are delegated to artificial intelligence systems, there is less human accountability for this type of discrimination.
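To make the concern concrete, the kind of disparity ProPublica documented can be checked with a simple audit: compare the false positive rate, meaning the share of people who did not reoffend but were still flagged as high risk, across groups. The sketch below is a minimal, illustrative version with made-up records and hypothetical group labels; it is not the COMPAS data or ProPublica's actual methodology.

```python
# Minimal sketch of a fairness audit: compare false positive rates across groups.
# The records and group labels below are hypothetical, not real COMPAS data.

from collections import defaultdict

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("group_a", True,  False),
    ("group_a", True,  False),
    ("group_a", False, False),
    ("group_b", True,  False),
    ("group_b", False, False),
    ("group_b", False, True),
]

def false_positive_rates(records):
    """False positive rate per group: flagged high risk among those who did not reoffend."""
    flagged = defaultdict(int)    # high-risk predictions for non-reoffenders
    negatives = defaultdict(int)  # total non-reoffenders
    for group, predicted_high_risk, reoffended in records:
        if not reoffended:
            negatives[group] += 1
            if predicted_high_risk:
                flagged[group] += 1
    return {g: flagged[g] / negatives[g] for g in negatives}

print(false_positive_rates(records))
# A large gap between groups (roughly 0.67 vs 0.5 in this toy data) is the kind
# of disparate impact that should trigger human review and accountability.
```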
Free speech. Amid concerns over online hate speech, “fake news,” and terrorism, governments are becoming more aggressive about forcing companies to restrict online content. In response, private companies have turned to automated systems to police and manage content. In effect, this leads to broad censorship that restricts the free speech of rights activists and others while failing to protect vulnerable groups from online harassment. In Myanmar, Facebook has removed posts by human rights activists while allowing hate speech against the Rohingya to remain online. In other jurisdictions, governments target, and often imprison, activists who post critical information online, under the guise of stopping the “spread of fake news.” In addition, the content-regulation rules and algorithms developed by private companies are opaque; far more accountability and transparency are needed to ensure their decisions do not violate human rights.
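To see why blunt automated moderation tends to over-censor, consider a toy keyword filter. Everything here, the blocklist, the posts, and the function, is invented for illustration; no real platform's system is this simple, but the failure mode is the same: context-blind rules cannot tell documentation of abuse from incitement.

```python
# Toy illustration (not any platform's real system): a naive keyword filter
# that blocks any post containing a flagged term, regardless of context.

BLOCKED_TERMS = {"attack", "violence"}  # hypothetical blocklist

def is_blocked(post: str) -> bool:
    """Block a post if any of its words appears on the blocklist."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return bool(words & BLOCKED_TERMS)

posts = [
    "We document every attack on our village so the world knows.",  # activist report
    "They deserve violence.",                                       # actual incitement
]

for post in posts:
    print(is_blocked(post), "-", post)
# Both posts are blocked. The filter cannot distinguish documentation of abuse
# from incitement, which is how activists' posts get removed while coded or
# misspelled hate speech slips through.
```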
Privacy. Drones and satellites increasingly capture photos and data about our homes, farms, and natural environment. Fitness-tracking apps like Strava record our movements. “Smart cities” track our movements, moods, and associations, all in the name of safety and security.
These problems are even more acute in conflict-sensitive and complex settings, such as places with a history of war, repression, human rights abuses, or extreme poverty.
So What Do We Do?
We build a community and a set of standards to strengthen ethics, transparency, and accountability.
At JustPeace Labs, we promote ethical and secure approaches to deploying technology in complex settings. As part of our mission, we’re developing a set of tools for corporations, civil society, and donors who use or deploy technology in these contexts.
Deployed ethically, tech can transform the world for the better. But without regulation, its potential for harm is one of the biggest concerns of our time.
Together, we can use tech for good, when it matters most. Contact us to see how you can get involved.