One reason I value my newspaper subscription is that it reminds me not to take things for granted. Especially when it comes to technology.
In a recent column, the Washington Post’s Geoffrey Fowler recounted a conversation with alternative medicine advocate Deepak Chopra. Chopra has been criticized for promoting medical treatments based on pseudoscience, and his views on technology seem to be just as misguided.
“Technology is neutral, number one. Number two, it’s unstoppable,” Chopra told Fowler as they walked through the tech industry’s trade show, CES, earlier this month.
No, and no.
Technology does not just fall from the sky. People create it. Which means that our human frailties get baked right in. Type a question into a search engine, and the results can be just as racist as the responses you might get from people. Use a mathematical model to make lending decisions, and the output can be just as discriminatory as if you ask a loan officer.
People create technology, which means people can also address problems that result from (or are exacerbated by) technology. The question is who is responsible for doing so. Chopra lays that burden squarely on everyday users, blaming them for succumbing to technology’s power.
“I think technology has created a lot of stress for a lot of people, but that’s not the fault of technology,” Chopra told Fowler. “It’s the fault of the people who use technology.”
Sure, we could all be more intentional in our technology use. But absolving technology of any role in stress is disingenuous. It ignores the fact that the people who create the digital technologies we use every day design them to be persuasive. To Chopra, this isn’t the problem, but the solution; the app Chopra developed and the company he advises also employ persuasive design principles to hook people into using their products.
Chopra thinks technology will support well-being by collecting data and using it to optimize our environmental conditions, by, for example, changing the lighting in your house.
“So we just have to accept more surveillance as the price of this way of living?” Fowler asks.

“For the advancement of your well-being, what’s wrong with that?” Chopra responds.
Quite a lot, actually. First, this kind of blind faith in technology prevents us from talking about the limits of what technology can and should do. Second, it ignores the fact that using data-driven systems to address social problems too often ends up penalizing poor and working-class people. And third, it perpetuates the belief that human experience is simply raw material for companies to datify and exploit for financial gain.
Digital technology is, as Chopra says, “here to stay.” But this flawed understanding of technology needs to go.
For further reading, see the books cited above:
- Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble
- Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil
- Artificial Unintelligence: How Computers Misunderstand the World by Meredith Broussard
- Automating Inequality: How High Tech Tools Profile, Police, and Punish the Poor by Virginia Eubanks
- The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power by Shoshana Zuboff