Industrial Hemp Bill, Sexism In Apps, Best Fall Reads

Air Date:
Heard On Central Time

A bipartisan bill that would allow for the cultivation of industrial hemp in Wisconsin gets a hearing today. We find out what was said and discuss the details of the bill. There’s been a lot of discussion about sexism in Silicon Valley workplaces, but our guest says that same troubling culture can be found in the apps we use every day. She discusses how apps are programmed in ways that leave out women and people of color. Also, one of our favorite literary experts stops by to talk about the best books to crack open this fall.

Featured in this Show

  • State Lawmakers Back Bill To Legalize Growing Hemp

    Wisconsin lawmakers held a hearing today for a bill that would allow Wisconsin farmers to grow industrial hemp. The bill is getting support from GOP lawmakers and farm groups, and is being touted as an opportunity to grow another profitable crop in Wisconsin. A reporter shares the details.

  • How Biased Culture Gets Built Into Tech Products

    For many, technology is part of everyday life. That may involve everything from tracking meals on a fitness app to finding a date to adjusting the thermostat at home while you’re at work. According to the Pew Research Center, as of late 2015, 73 percent of Americans went online on a daily basis and one-fifth reported going online “almost constantly.”

    However, not all tech products are helpful to all. As our guest, the author of “Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech,” started paying attention to how these products are designed, she saw that they’re often “full of blind spots, biases, and ethical blunders.” Take, for example, a 2016 study showing that artificial intelligence built into smartphones from Apple, Google, Microsoft, and Samsung isn’t programmed to help during a crisis involving domestic violence or mental health. Another example is a 2016 face-morphing filter on Snapchat that outraged thousands of users, who deemed it racist because of how it depicted Asian features. Those oversights, according to our guest, can increase unfairness and leave people out.

    Join us as we explore how biased culture gets built into the tech products we use every day and how it affects us. We also look at what our guest says needs to change in the technology industry in order for it to improve.

    Have you had an experience with a tech product that negatively affected you because of a blind spot or bias? What happened? You can be a part of the conversation by emailing ideas@wpr.org, posting on the Ideas Network Facebook page, or tweeting at @centraltimewpr.

  • New Fall Books

    As the weather gets cooler, we gear up for fall reading season. We’ll talk about what’s new and the latest literary awards.

  • Reading Recommendations In Time For Award Season

    In the literature world, fall is the season of awards. It officially kicked off earlier this month, with the Nobel Prize in Literature awarded to British author Kazuo Ishiguro.

    Ishiguro is best known for his novels “The Remains of the Day” and “Never Let Me Go,” both of which were later adapted into films.

    “He never writes the same book twice, which is one of the things about great literary writers. On the outside,” says Daniel Goldin, owner of Milwaukee’s Boswell Book Company. “On the inside, his themes repeat a lot. And definitely things about memory and how it is distorted by time and our inability to face the past.”

    Ishiguro’s most recent book is “The Buried Giant,” a fantasy novel set in post-Arthurian Britain.

    It’s a fitting season for good books. Goldin recommended a few he’s had his eye on:

    On Vietnam

    Ken Burns’ 18-hour PBS documentary, “The Vietnam War,” gained acclaim as it began airing in September. The documentary is spurring renewed interest in books related to the conflict, Goldin said.

    Recent picks include Mark Bowden’s “Hue 1968,” a work of nonfiction, and Viet Thanh Nguyen’s “The Sympathizer,” which won the 2016 Pulitzer Prize for Fiction.

    “There’s also, of course, reaching back to a lot of other wonderful novels and story collections like Karl Marlantes’s ‘Matterhorn,’ Robert Olen Butler’s ‘A Good Scent from a Strange Mountain,’ and Tim O’Brien’s ‘The Things They Carried,’ which is probably the king of Vietnam war fiction,” Goldin said.

    A New Spin On Classics

    Some of Goldin’s new favorites this season have incorporated the classics.

    “One book that came out in August that a whole bunch of us really loved is called ‘The World Broke In Two,’” by Bill Goldstein, Goldin said.

    The book tells the story of four famous authors — Virginia Woolf, T. S. Eliot, D. H. Lawrence and E. M. Forster — in one year: 1922. Three of them were creating their most famous work. One of them had a book banned.

    “It’s a group biography, it really makes all these writers accessible,” Goldin said. “What I love about these kinds of books, is even if you haven’t read the book … you want to read one of these books. It shows how the creative process is not always easy. We think of Virginia Woolf as being this genius of language, and she was struggling, just like all these writers were.”

    In the same vein, Goldin loved “Books for Living” by Will Schwalbe. Each chapter focuses on a different book that influenced Schwalbe’s life.

    “Once again, you read the book, and you can’t not pick up a book that he recommended,” Goldin said. “I ended up finally reading ‘Rebecca’ … the mother of all psychological suspense. He put it in such interesting perspective.”

    And Wisconsin author Michael Perry’s new book, “Montaigne in Barn Boots,” inspired Goldin to go out and buy a book of Montaigne’s essays.

    Shortlisted

    Among the next awards on the docket are the National Book Awards. The shortlist is out, and this year’s nonfiction picks are incredibly issue-driven. Winners are set to be announced Tuesday, Nov. 14.

    Among them, Goldin recommends “Never Caught: The Relentless Pursuit of the Runaway Slave” by Erica Armstrong Dunbar, and David Grann’s “Killers of the Flower Moon.”

  • Algorithms Reveal Bias In Tech Industry, Author Argues

    We live in a world of algorithms.

    Predictive computer programming determines everything from your Facebook news feed to your Google search results to even the length of a criminal sentence.

    It’s easy to believe that because the programming is run by computers, it’s completely objective and the technology is unbiased. But that’s not entirely true, says Sara Wachter-Boettcher.

    “The reality is, it’s all made by people, and people have biases, all of us do,” she said. “So those things can make their way right into tech products.”

    Wachter-Boettcher writes about this concept in a new book, “Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech.”

    She explains it like this: brains are bad at seeing details they’re not expecting.

    That’s famously shown in a 1999 selective attention test. A group of people are playing basketball. The viewer is directed to count how many times they pass the ball.

    Mid-video, a person in a gorilla suit walks into frame, pounding their chest, and leaves. Although that image may seem outrageous, researchers found that about half of the people who watched the video didn’t see the gorilla.

    “In the tech industry, if you have a really narrow group of people that’s designing products, and they only have the experience that they have, and they’re not being trained to look beyond that, they’re simply going to miss out on a lot,” Wachter-Boettcher explains. “So having more diverse people in the room will actually make a difference.”

    She says people in the tech industry often unknowingly design products based on their limited experiences, without realizing the results are harmful or off the mark.

    For instance, Snapchat has come under fire for racist photo filters that some say amount to “yellowface.”

    Last year, a ProPublica investigation found that an algorithm called COMPAS, used to calculate criminal risk scores, was biased against African-Americans.

    Claims of the program’s bias went all the way to the Wisconsin Supreme Court, which ruled last year that the state could continue to use the program.

    In 2014, Facebook promoted a feature called “Year in Review,” which took each user’s most popular content from the year and repackaged it into a new post for sharing.

    The idea didn’t entirely work as planned. Some people were shown events from their year that they didn’t want to re-live. One man was shocked to see the feature display a photo of his daughter, who had died of brain cancer that year.

    It had been his most popular post, but he hadn’t asked to re-live it, Wachter-Boettcher said.

    “Facebook put that back in front of him … and surrounded it by their own graphics of people dancing, and balloons and streamers, and said, ‘Hey Eric, here’s what your year looked like,’” she said. “He was just heartbroken by it. That was an example of a tech company sort of taking his content, and putting it out of context and inserting it in this place that he’d never agreed to.”

    Facebook later apologized for the feature.

    Despite the problematic bias of some algorithms, Wachter-Boettcher said, that’s no reason to get rid of them altogether. Algorithms are still one of the best ways to deal with large amounts of data.

    However, she said developers should know that algorithms come with risk: an algorithm’s behavior changes depending on the data it is fed (a brief sketch at the end of this segment illustrates the point).

    Developers in Silicon Valley need to be trained to think about the blind spots, Wachter-Boettcher said.
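
    To make that idea concrete, here is a minimal, hypothetical Python sketch. It is not drawn from Wachter-Boettcher’s book; the data, group labels, and function names are invented for illustration. It shows “bias in, bias out”: a toy scoring rule fit to skewed historical decisions simply reproduces the skew it was given.

        # Hypothetical toy example: "bias in, bias out."
        # The historical records below are fabricated for illustration.
        historical_decisions = [
            # (group, years_of_experience, was_hired)
            ("A", 2, True), ("A", 1, True), ("A", 3, True), ("A", 0, False),
            ("B", 2, False), ("B", 4, False), ("B", 3, True), ("B", 1, False),
        ]

        def hire_rate(records, group):
            """Fraction of past applicants from `group` who were hired."""
            outcomes = [hired for g, _, hired in records if g == group]
            return sum(outcomes) / len(outcomes)

        # The simplest possible "model": score new applicants by their group's
        # historical hire rate. Nothing here is malicious, yet the output
        # mirrors whatever imbalance the past decisions contain.
        def score(applicant_group):
            return hire_rate(historical_decisions, applicant_group)

        for group in ("A", "B"):
            print(f"group {group}: predicted score {score(group):.2f}")
        # group A: predicted score 0.75
        # group B: predicted score 0.25

    Feed the same code more balanced records and the scores even out; the logic never changed, only the data did.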

Episode Credits

  • Rob Ferrett Host
  • Veronica Rueckert Host
  • Haleema Shah Producer
  • Breann Schossow Producer
  • Veronica Rueckert Producer
  • Todd Richmond Guest
  • Daniel Goldin Guest
  • Sara Wachter-Boettcher Guest
