
Technically Wrong

Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

Audiobook
1 of 1 copy available
Buying groceries, tracking our health, finding a date: whatever we want to do, odds are that we can now do it online. But few of us ask how all these digital products are designed, or why. It's time we change that. Many of the services we rely on are full of oversights, biases, and downright ethical nightmares. Chatbots that harass women. Signup forms that fail anyone who's not straight. Social media sites that send peppy messages about dead relatives. Algorithms that put more black people behind bars. Technically Wrong takes an unflinching look at the values, processes, and assumptions that lead to these problems and more. Wachter-Boettcher demystifies the tech industry, leaving those of us on the other side of the screen better prepared to make informed choices about the services we use, and to demand more from the companies behind them.
Reviews

    • AudioFile Magazine
      Like a friendly litigator making her case to a jury, narrator Andrea Emmes advances the author's thesis that the tech industry's white male mindset is often tone-deaf to sexual orientation, race, gender, age, disability, and personal tragedy. It's a credible argument and should be mandatory listening for those caught in the rush-hour commute to Silicon Valley. In an affable, clear style, Emmes explains why some app designers are unapologetic about racist stereotypes (Snapchat's Asian caricature morphing filter, for example); why some websites bother collecting information they don't need (gender and title, for example); and why some websites use passive aggressive opt-out messages ("I don't want your newsletter because I'd rather stay uninformed."). You could ask Siri these questions, but first you might want to ask her why all digital assistants are female. R.W.S. © AudioFile 2018, Portland, Maine
    • Publisher's Weekly

      October 23, 2017
      Web consultant Wachter-Boettcher (Content Everywhere) clearly demonstrates the ways digital products are deeply connected to the intentional and unintentional biases of their designers in this approachable primer on digital technology. Wachter-Boettcher calls attention to the abdication of responsibility by the engineers who created the algorithms that result in major and often cruel design flaws in social media, such as the automated Year in Review feature on Facebook that pushed pictures of users’ dead children into their news feeds, or the Google Photos tagging feature that was not trained on dark-skinned people and thus marked them as gorillas. The book also highlights more insidious and disturbing uses of certain technologies: discriminatory targeting and surveillance of users, culturally insensitive and obligatory forms requiring personal data, and potentially dangerous verification processes such as Facebook’s real-name policy. Many of the real-life examples of major design flaws in digital products, such as the spread of abuse and hate speech on Twitter, will be familiar to most readers, but the author adds technical detail, pointing out how Twitter developed features like retweets and hashtags while failing to improve features to prevent or stop abuse. Wachter-Boettcher urges readers to hold engineers and venture capitalists accountable for the harm they cause by failing to incorporate diverse voices in the design process for creating the everyday tools of the 21st century.

Formats

  • OverDrive Listen audiobook

Languages

  • English
