Intervention! How Ethical Speculation Could Prevent Future Tech Trouble
"I think that many of the ethical problems in tech are unintentional; it's not that people don't care about ethics, it's that they don't necessarily see the potential harms. And I think this is something we can get better at."
Casey Fiesler
Assistant Professor
Information Science
Almost every news cycle delivers headlines alerting readers to yet another tech transgression. In each case, by the time an issue is reported, damage has already been done.
Since the founding of early Silicon Valley companies like Apple, Atari and Oracle in the 1970s, the tech industry has expanded to shape almost every aspect of people's lives: from the ways we connect and communicate to how police investigate crimes to the tools we use to learn and work.
With this widespread influence, it's reasonable to expect that tech companies would carefully consider the consequences of products before launching them. Instead, the motto made famous by Facebook founder Mark Zuckerberg, "move fast and break things," sums up how the industry often prioritizes speed and innovation over caution and care.
"One known consequence of 'move fast and break things' is what's known in the tech industry as 'technical debt': the implied cost of future bug fixes when you rush to release something now with the intention of handling problems later once you know what they are," says Casey Fiesler, assistant professor in CMCI's Department of Information Science. "We've been talking about 'ethical debt': the cost of assuming you can deal with ethical harms once tech is out in the real world and you see what happens. The problem is, by that point, it's too late because the harm is already done."
For the next chapter in her career, Fiesler, who studies technology ethics, internet law and policy, and online communities, will launch a five-year research project on ethical speculation in technology design. Her work will be supported by a $549,513 CAREER grant from the National Science Foundation, one of the most prestigious awards given to faculty in the early phases of their careers.
The project will entail four phases, all of which Fiesler has tied to courses at CU on programming, information ethics and other areas of computing.
The first two phases will focus on identifying patterns and pitfalls in the industry and speculating on future harms, with a particular focus on marginalized communities. Throughout the final two phases, she'll teach students to engage in group-based speculative design and work with them to develop and evaluate a speculative ethics toolkit aimed at helping designers anticipate potential technology problems before their products are released.
Key outcomes will include taxonomies of past tech ethics controversies and speculative harms; a speculative ethics toolkit with guidance on potential pitfalls for new technology; and an understanding of the differences in harm perception for and by marginalized groups.
"I think that many of the ethical problems in tech are unintentional; it's not that people don't care about ethics, it's that they don't necessarily see the potential harms. And I think this is something we can get better at," Fiesler says.
We caught up with Fiesler to learn more about her project, "Scaffolding Ethical Speculation in Technology Design," how her background in law and science fiction supports her research, and the most common questions she fields on her popular TikTok channel.
In a nutshell, what is the focus of this five-year research project?
When you hear about some new tech ethics controversy in the news (misinformation! privacy violations! harassment!), you might think: shouldn't they have known that would happen? One way of thinking about ethics in tech is that it requires speculating about future harms. This project is about helping people do that. My students and I will be doing work to find patterns in the controversies of the past, to center traditionally marginalized voices in imagining future harms, and ultimately, to develop strategies to help technologists do this kind of ethical speculation, both in the classroom and in the tech industry.
You've pointed out that ethics is often left out of computer science education altogether. When it is included, what issues can arise, and how do you hope to address them?
I'm sometimes shocked by how many students (at many different universities) tell me they barely hear about ethics at all as part of their computer science curriculum. Even when it is strongly included, the most common model I see is standalone ethics classes, as opposed to teaching it in context all along the way. Standalone classes are great too (I teach one!), but I think that this model alone can give the impression that ethics is a specialization or an add-on rather than something everyone thinks about as part of their technical practice. I've been really excited to integrate ethics content into intro programming classes in both Information Science and Computer Science here at CU, and am looking forward to working with a number of project-based computing classes as part of this new work!
Do you see a hunger among students for lessons on technology ethics?
Absolutely! Not only among our students but also with the large number of students I interact with online, I am seeing a huge increase in interest in tech ethics. One of the most common questions I get on TikTok is "how do I learn more about this?" or "why isn't this in my classes more?" My hope is that this growing awareness contributes to a cultural shift that directly impacts tech companies: when they hire these young people in the future, or potentially when they lose good talent because ethical reputation influences decisions about where to work.
Speaking of consequences, is there an impetus, financial or otherwise, for tech companies to begin baking ethical considerations into their process from the beginning, rather than viewing ethics as an afterthought?
I would like to think that many tech companies want to do the right thing, or at least don't want their tech to do harm. But even when there's not an altruistic motive, there's always the threat of bad PR! This is one of the reasons I've centered controversy in this research: while I don't think that media coverage can account for all types of harms (which is why I think it's important to also work directly with groups traditionally underrepresented in media and in the tech industry), it can be a window into how we can frame ethical issues in ways tech industry folks might care about.
Your background includes studying law and writing science fiction. How did these interests and skills lead you toward a focus on tech ethics, and how will they play into this project?
In law school, they teach you to "think like a lawyer," which is in part about issue spotting. Can you look at a complicated fact pattern and identify the potential legal issues? Meanwhile, writing science fiction is about imagining the world as it might be in the future, and often, the things that might go wrong.
When I teach ethics, I try to cultivate both of these skills in my students so they can see new technology and spot potential issues, even those that might arise in the future. Discussing Black Mirror in class is engaging, and if you can get excited about imagining the ethical implications of some technology someone might build in 50 years, why can't you imagine the ethical implications of the technology you might build tomorrow? A large part of my motivation for this project was imagining how I could cultivate these skills even more effectively.
Is there anything about what's going on in the tech world today, or just the world in general, that makes you think now's the time when this could really catch on?
"Ethics" as a term can be shorthand for a lot of related issues, including responsibility and justice. I've talked a lot about harm, but thinking about making tech better goes beyond that. I think there's a hunger for social justice right now, and that tech has an important role to play: not just in making sure it isn't contributing to injustice, but also in how it might contribute to making things better.
What makes you furious about the world of tech ethics, and what gives you hope?
I've been thinking a lot lately about how often technological harm seems to be rooted, in part, in a lack of empathy: an inability to understand or even perceive harms to someone who isn't like you. And not only are marginalized groups (e.g., BIPOC, LGBTQ+ people, and people with disabilities) disproportionately impacted by technological harms, but they are also underrepresented in computing.
This is one reason I think that diversity in the tech industry implicates ethical design as well. Changing who is in the room is only part of the solution, but it's a good step. So it's incredibly frustrating to continue to see a lack of diversity at all stages of the tech pipeline, and to continue to hear excuses for it. But at the same time, I also see movement in the right direction. And what gives me hope are the increasingly loud voices calling for change when it comes to equity, justice, ethics, and just making tech better.
Learn more about Fiesler's work in her recent article for Wired (co-authored with Information Science PhD student Natalie Garrett), her interview with Forbes, and in CMCI Now and CU Boulder Today. You can also check out her TikTok channel to learn even more about her team's research!