Technically Wrong by Sara Wachter-Boettcher
Technically Wrong by Sara Wachter-Boettcher discusses the unintentional biases and assumptions baked into tech products, arguing that their main driver is the homogeneity of the teams building them. The book draws on recent anecdotes from Silicon Valley companies like Uber, Google, Facebook, and Twitter.
What’s discussed in this book is not surprising or new to me, but I found it a very engaging read. Frankly, this book made me feel upset and infuriated all over again.
This book reminded me of a post I wrote two years ago, What are our responsibilities as software developers beyond writing code?. Early in my career, I felt confused about my place in tech and wary of the amount of influence tech carries over all of our lives. I've wondered about the lack of discourse around these topics, both within companies and within computer science education. Technically Wrong should be required reading for anyone entering the tech field.
This one quote from the book especially stood out to me:
“The assumption that technical skills are most difficult to learn — and that if people study something else, it’s because they couldn’t hack programming. As a result, the system prizes technical abilities — systemically devalues the people who bring the very skills to the table that could strengthen products, both ethically and commercially.”
I thought about the recent events at Twitter, where the disbanded teams included human rights, accessibility, AI ethics, and content curation. These are teams that aim to strengthen the product ethically, protect people, and make the product more inclusive. It's a disturbing reality that there are people in positions of great power who completely disregard and devalue this work. (Related: Elon Musk just axed key Twitter teams like human rights, accessibility, AI ethics and curation)
The majority of Technically Wrong discusses anecdotes of where things have gone wrong. However, it does present one company the author believes is doing things right: Slack. Slack's CEO encourages designers to think about what a person might have experienced in their life before sitting down at their desk, a practice that may have helped produce a more respectful messaging product; the company's hiring practices have also resulted in a more diverse staff. I would have liked more examples of positive practices, and other recommendations for propagating positive change. I was left wondering what each of us should be doing individually. Maybe that is a thought exercise left for us to do.
Examples presented in the book
- Google Photos tagging incorrectly labeled Black people as gorillas. In the book, Wachter-Boettcher asks why this wasn't caught in the development phase. She points to the lack of Black employees involved in the development process. (Related: Google Photos Mistakenly Labels Black People as Gorillas)
- A resume-screening AI app that finds candidates similar to a company's top-performing employees. Such a system may start to identify people with certain traits (more "male"-sounding names, Ivy League backgrounds, and so on) as more suitable for a role, further perpetuating homogeneity. (Related: U.S. warns of discrimination in using artificial intelligence to screen job candidates)
- Uber is treated as a tech company, rather than a taxi service, allowing it to get around regulations that it would otherwise need to follow. Tech companies want to be seen and treated as special.
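To make the resume-screening example above concrete, here is a minimal, hypothetical sketch (the company, traits, and scoring are my own invention, not from the book) of how "find candidates like our top performers" can encode bias: when the similarity score rewards proxy traits like name style or school alongside actual skills, candidates who merely resemble past hires outrank equally skilled candidates who don't.

```python
# Past top performers at an imaginary company -- homogeneous on proxy traits.
TOP_PERFORMERS = [
    {"name_style": "male", "school": "ivy", "skills": {"python", "sql"}},
    {"name_style": "male", "school": "ivy", "skills": {"java", "sql"}},
    {"name_style": "male", "school": "ivy", "skills": {"python", "go"}},
]

def similarity(candidate, employee):
    """Count shared traits, naively treating proxy traits and skills alike."""
    shared = 0
    if candidate["name_style"] == employee["name_style"]:
        shared += 1
    if candidate["school"] == employee["school"]:
        shared += 1
    shared += len(candidate["skills"] & employee["skills"])
    return shared

def score(candidate):
    """Average similarity to past top performers -- higher means 'better fit'."""
    return sum(similarity(candidate, e) for e in TOP_PERFORMERS) / len(TOP_PERFORMERS)

# Two candidates with identical skills; only the proxy traits differ.
a = {"name_style": "male", "school": "ivy", "skills": {"python", "sql"}}
b = {"name_style": "female", "school": "state", "skills": {"python", "sql"}}

print(score(a) > score(b))  # True: proxy traits, not skills, decide the ranking
```

The bias here isn't in any one line of code; it's in the training signal. Because the past workforce is homogeneous, "similar to our best people" silently becomes "similar on demographics," which is exactly the dynamic the book warns about.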