With efforts to use meditation instead of detention making national news, we find out how one local elementary school is implementing mindfulness practices with teachers and students. We also talk to an author about why she believes mathematical models are tearing at society’s fabric.
Featured in this Show
- Weapons of Math Destruction: How Mathematical Models Can Reinforce Discrimination
From school admissions to jail time, insurance costs to job applications, mathematical algorithms increasingly shape decisions that affect our lives. While the goal is often to reduce the risk of human bias, a data expert says these algorithms can instead reinforce discriminatory policies.
- What Happens When You Ask Elementary School Kids To Meditate
Can you imagine asking five- to seven-year-old kids to sit quietly in the middle of a busy day and meditate? We talk to two guests who do just that. The principal of a local elementary school shares how she has implemented mindfulness meditation school-wide to curb the anxiety of students and staff. We also talk to the founder of a Baltimore-based organization that uses meditation rather than traditional punitive action when students misbehave.
- Author Breaks Down How Math Is Reinforcing Discrimination
Sure, algorithms are pretty innocent, convenient even, when suggesting “customers who bought this item also bought” recommendation lists, but a new book breaks down the more nefarious side of mathematical prediction models.
Cathy O’Neil, a data scientist and creator of the MathBabe blog, said she saw firsthand how today’s reliance on mathematical models for decision-making isn’t creating a more equitable society – it’s actually reinforcing discrimination.
In her new book, “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,” O’Neil writes that we need to pay more attention to the way mathematical algorithms are used to determine who gets a job, who goes to jail and who gets a bank-approved loan.
When it’s all added up, O’Neil said, these algorithms — human-constructed models based on historical data to predict future behavior — end up stacking the deck against poor people and American minorities.
“The number one thing I wish to do with my book is to slam home the fact that algorithms are not inherently objective,” O’Neil said. “They are biased, the data they’re given is often biased itself. But also they have embedded values of the person building the model.”
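To make that concrete, here is a minimal, hypothetical sketch of the feedback loop O'Neil describes: a model fit to historically biased decisions reproduces that bias, even though the rule it learns never mentions the protected attribute. The scenario, data, names and thresholds below are invented for illustration; they are not from O'Neil's book.

```python
import random

random.seed(0)

# Hypothetical historical hiring data: each applicant has a skill score
# (0 to 1) and a neighborhood. The *past* decisions were biased: at the
# same skill level, neighborhood "A" applicants were approved far more often.
def past_decision(skill, neighborhood):
    threshold = 0.4 if neighborhood == "A" else 0.7  # the historical bias
    return skill > threshold

history = []
for _ in range(10_000):
    skill = random.random()
    neighborhood = random.choice(["A", "B"])
    history.append((skill, neighborhood, past_decision(skill, neighborhood)))

# "Train" the simplest possible model: the per-neighborhood approval rate.
# The model never asks why the old decisions differed; it just learns them,
# and its output now carries the stamp of mathematical objectivity.
rates = {
    hood: sum(ok for s, h, ok in history if h == hood)
    / sum(1 for s, h, ok in history if h == hood)
    for hood in ("A", "B")
}
print(rates)  # roughly {'A': 0.6, 'B': 0.3}: the old bias, learned as a "fact"
```

Nothing in the learned model looks discriminatory on its face; the bias arrives entirely through the historical labels it was trained to imitate, which is exactly the point O'Neil makes about "objective" algorithms.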
O’Neil grew up loving numbers. She began her career in academia, then moved into the private sector, crunching numbers for a hedge fund. The experience opened her eyes to the way math is systemically misused throughout many American institutions.
“I was truly disillusioned by working in finance,” she said. “I had this naive idealistic concept leaving academics in 2007 that I could bring sort of the beauty and honesty and clarity of mathematics to the real world in finance. But when I got there, I realized that there were a bunch of mathematical lies.”
In the financial industry, the biggest lie of them all, said O’Neil, was at the heart of the economic collapse: the inflated ratings that made risky mortgage-backed securities look like safe investments. By and large, mathematicians were sacrificing scientific principles in the name of higher profits, said O’Neil.
“In fact, they knew that the mortgage-backed securities were extremely toxic, and they were selling those ratings, but they were calling it mathematical, and it was kind of a weaponized form of mathematics that I was really ashamed of,” she said.
It led O’Neil to leave her hedge fund and join the Occupy Wall Street movement, which gave her time to explore other ways math was being misused, research that became her book. She found a variety of problems inside the criminal justice system, where criminal defendants in many jurisdictions are assigned a recidivism score. Defendants with higher scores are substantially more likely to receive more severe sentences.
“The kind of data that goes into the scores is highly biased against poor people and against people of color,” she said. “So there’s every reason to think that people are assigned these high risk scores by dint of their demographics rather than their actual behavior.”
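A small, invented simulation can show how that proxy effect works: if a score only sees recorded arrests, and arrests track how heavily a neighborhood is policed rather than how people actually behave, two groups with identical behavior end up with very different risk scores. The neighborhood labels, patrol rates and scoring formula below are assumptions for illustration, not O'Neil's data or any real risk instrument.

```python
import random

random.seed(1)

# Hypothetical chance that a given offense is actually recorded as an
# arrest, by neighborhood -- a stand-in for policing intensity.
PATROL_INTENSITY = {"heavily_policed": 0.9, "lightly_policed": 0.2}

def simulate_defendant(neighborhood):
    offenses = random.randint(0, 5)  # actual behavior, never seen by the model
    # Each offense only enters the record if police were there to see it.
    arrests = sum(random.random() < PATROL_INTENSITY[neighborhood]
                  for _ in range(offenses))
    return offenses, arrests

def risk_score(arrests):
    # The model can only score what is in the record.
    return min(10, 2 * arrests)

for hood in PATROL_INTENSITY:
    sample = [simulate_defendant(hood) for _ in range(5_000)]
    avg_behavior = sum(o for o, a in sample) / len(sample)
    avg_score = sum(risk_score(a) for o, a in sample) / len(sample)
    print(f"{hood}: actual offenses ~{avg_behavior:.2f}, "
          f"risk score ~{avg_score:.2f}")
```

Run it and both groups commit the same amount of simulated crime, about 2.5 offenses on average, yet the heavily policed group's average score comes out several times higher. That is the demographic effect O'Neil describes: the score measures exposure to policing, not behavior.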
Episode Credits
- Rob Ferrett Host
- Veronica Rueckert Host
- Rob Ferrett Producer
- Aarushi Agni Producer
- Cathy O'Neil Guest
- Ali Smith Guest
- Sarah Galanter-Guziewski Guest
Wisconsin Public Radio, © Copyright 2024, Board of Regents of the University of Wisconsin System and Wisconsin Educational Communications Board.