Technology is playing an increasingly important role in our lives, from facial recognition systems in policing, to algorithms for job recruitment, to virtual assistants in our homes.
As technology becomes ever more interwoven with our daily lives, technology companies must ensure that their work is for the human good.
This awareness is reaching the upper ranks in Silicon Valley. When Chris Cox recently departed Facebook, he wrote:
“Social media’s history is not yet written, and its effects are not neutral. It is tied up in the richness and complexity of social life. As its builders we must endeavour to understand its impact — all the good, and all the bad — and take up the daily work of bending it towards the positive, and towards the good. This is our greatest responsibility.”
Despite Cox’s claims that Mark Zuckerberg and Facebook have been contemplating the social impacts of technology for more than a decade, we need only take a cursory glance at the mounting investigations into how our personal data has been sold and used to know that whatever work Facebook and other big technology companies have been doing to bend technology toward the positive has fallen critically short.
It is an important step that Facebook is calling for an understanding of the social impacts of its technologies, but these can’t just be empty words. We need serious, sustained research on how technology is shaping society, in the United States and around the world.
Science, technology and society studies
Technology is not neutral, as Cox correctly identifies. Technology and society are deeply connected and influence each other in profound and lasting ways.
Yet, we know very little about the on-the-ground social impacts of new technologies. A December 2018 report from Elsevier found that despite the urgent imperative to develop ethical artificial intelligence technologies, little research is being done on the topic. The report cites this critical gap in knowledge as “one of the most pressing questions” in the field.
These findings confirm what we as social scientists have known for a long time: we simply are not studying the ethical implications and social impacts of new technologies.
The good news is that there is an entire scholarly field that has been thinking through these issues for nearly 50 years: science, technology, and society studies — abbreviated to STS. Scholars of STS come from fields including anthropology, history, and political science. STS investigates a wide range of issues including social media, big data, technological surveillance, medical technologies, and climate politics. Academic STS offers a nuanced understanding of these complex topics.
The bad news is that STS insights are largely locked in the ivory tower.
This needs to change. We need rigorous research on the social impacts of technology. We need this research to be available to a wide variety of stakeholders so it can be used to create better technology and to understand how the technologies we already have are shaping our lives.
Bending technology toward the positive
If we are truly committed to bending technology toward the positive, we need to do the following three things:
- STS scholars must publish, present, and teach in public forums.
As Cathy O’Neil has said in no uncertain terms, “academics have been asleep at the wheel.”
We must have a voice in one of the most important conversations of our time: how new technologies are shaping our world. Scholars of STS have a crucial role to play in educating the public, lawmakers, journalists, and technology leaders about our research on the nuanced and complex relationship between technology and society.
- Technology thought leaders and makers must become literate in the social impacts of technologies.
Sadly, Silicon Valley regularly demonstrates an alarming lack of comprehension about the social effects of its technologies. This was on clear display when Mark Zuckerberg promised Congress purely technological fixes for social problems — don’t worry, AI will fix hate speech and fake news. Likewise, when Cox declares the need to understand the impacts of social media but Facebook fails to offer any concrete plan to monitor its social effects, we see a dangerous blind spot.
- We need independent, expert evaluation and monitoring of the social impacts of technology.
Independent, because we cannot rely on technology companies to evaluate the effects of their own technologies. Expert, because we need highly trained social scientists to do this work.
Although “ethnography” has become a catchphrase in technology and design circles, it takes years of education and training to carry out the kind of robust, systematic empirical research we need. Social scientists must be dedicated to the crucial endeavour of understanding the impacts of technology on society.
As the president of MIT has said, “Humanity faces urgent challenges — challenges whose solutions depend on marrying advanced technical and scientific capabilities with a deep understanding of the world’s political, cultural, and economic complexities.” Technology is shaping society, and it is our responsibility to make sure we are shaping it for the human good.
Alexa Hagerty and Igor Rubinov, Co-Founders of Dovetail Labs. — techradar.com