Thursday Complexity Post
 
Plexus Institute is a professional networked community that addresses real-world challenges through the understanding, advancement and diffusion of ideas and practices rooted in the principles of complexity. 

 

What's in the Algorithms That Pervade Our Lives?
 
Vital aspects of our lives, such as the schools we attend, the jobs we get, where we live, whether we qualify for loans and how much we'll pay for insurance, increasingly rely on invisible and impenetrable algorithms. So do government policies, ranging from the timing of trash collection to policing patterns to your fate should you run afoul of the law.
 
Don't assume that decisions relying on abstract mathematical models are necessarily any wiser than decisions made by flawed humans. It's not just that our prejudices are built into our models. It's also that models built with the purest of intentions have unintended consequences. In her richly informative and disturbing book Weapons of Math Destruction, data scientist and mathematician Cathy O'Neil argues that the "dark side of big data" is increasing economic inequality and creating a "toxic cocktail for democracy."
 
She describes how in late 2015 the political data firm Cambridge Analytica paid British academics to gather Facebook profiles of U.S. voters, with demographic detail and records of each voter's "likes." The researchers then developed a psychographic analysis of 40 million voters, sorting them into different personality types. She reports that groups working with the Ted Cruz presidential campaign used these algorithms to create TV ads targeted to different types of voters and placed them with the programming each type would be most likely to watch. The social and political significance of this kind of mathematical modeling is being revealed repeatedly in ongoing news stories.
 
Financial troubles can raise your auto insurance rates
 
On a more personal level, O'Neil explains that insurance companies use the vast amount of data available about us to make mathematical calculations of various risks. Fair enough. But O'Neil, who is also a skilled storyteller, explains how auto insurers creating their own rating systems decided to use money management, measured by credit scores, as a proxy for responsible driving. As a result, a New York driver whose credit rating slipped from excellent to good faced a $255 hike in auto insurance. In Florida, adults with clean driving records and poor credit scores paid an average of $1,552 more than drivers with excellent credit and a drunk driving conviction.
 
Why would auto insurers focus on credit scores? Automated systems can plow through them efficiently and at enormous scale. O'Neil says it's also profitable. If an insurer can get an extra $1,552 a year from a driver with a clean record, why not? The victims are likely to be poorer, less educated, less likely to know they are paying more, and less equipped to challenge the charge if they do.
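To make the proxy problem concrete, here is a minimal, entirely hypothetical Python sketch. It is not O'Neil's analysis or any insurer's real formula; the base premium, weights, and scores are invented for illustration. It simply shows how a pricing rule that weights a credit-score proxy more heavily than actual driving history can end up charging a clean-record driver more than a convicted drunk driver.

```python
# Hypothetical illustration only: not any insurer's real pricing model.
BASE_PREMIUM = 1200  # invented annual base premium, in dollars

def annual_premium(credit_score, dui_convictions):
    """Toy pricing rule in which a credit-score surcharge dwarfs the DUI surcharge."""
    credit_surcharge = max(0, 750 - credit_score) * 12   # heavy weight on the proxy
    dui_surcharge = dui_convictions * 600                 # light weight on actual driving behavior
    return BASE_PREMIUM + credit_surcharge + dui_surcharge

# A clean-record driver with poor credit pays more than a
# convicted drunk driver with excellent credit.
print(annual_premium(credit_score=580, dui_convictions=0))  # 3240
print(annual_premium(credit_score=780, dui_convictions=1))  # 1800
```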
 
Wildly irregular work schedules that change with little notice in retail and food businesses are another innovation from big data, and they bring havoc to the lives of low-paid workers trying to juggle school, child care, transportation and other responsibilities. But businesses profit by running with minimal staffing, using tools for sophisticated analysis of peak customer traffic and staffing needs.
 
Some algorithms for crime and justice have unintended results
 
Police departments nationwide use hot-spot crime prediction algorithms based on the geographic locations where crimes are most likely to occur. Those models have their virtues, O'Neil writes. The problem is that if murder and arson are the focus, there will be a lot of down time. So the focus is often broadened to less serious crimes, such as vagrancy, aggressive panhandling, minor drug transactions and public drinking, offenses that are endemic to poor neighborhoods and would probably go unreported without a police presence. But these more common nuisance crimes result in more arrests, creating a "pernicious feedback loop." The arrests generate new data, which justifies more policing and generates more arrests. Again, those impacted tend to be minorities. "In our largely segregated cities," O'Neil writes, "geography is a highly effective proxy for race."
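The feedback loop is easy to see in a toy simulation. The sketch below uses only invented numbers and is not drawn from any real policing model: two neighborhoods have identical underlying offense rates, arrests are recorded only where patrols are present, and patrols are then reallocated according to the very arrest data they generate.

```python
import random

random.seed(0)

TRUE_OFFENSE_RATE = 0.05          # identical underlying rate in both neighborhoods
patrols = {"A": 10, "B": 10}      # 20 patrol units, split evenly at the start
arrests = {"A": 0, "B": 0}        # cumulative recorded arrests

for week in range(20):
    for hood in patrols:
        # offenses happen everywhere, but only patrolled ones become recorded arrests
        encounters = patrols[hood] * 100
        arrests[hood] += sum(random.random() < TRUE_OFFENSE_RATE
                             for _ in range(encounters))
    # "predictive" reallocation: send patrols where the data shows more arrests
    total = arrests["A"] + arrests["B"]
    if total:
        patrols["A"] = round(20 * arrests["A"] / total)
        patrols["B"] = 20 - patrols["A"]

# Whichever neighborhood happens to record more arrests early on attracts more
# patrols, which in turn generate more recorded arrests there.
print(patrols, arrests)
```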
 
A recidivism risk score assigned to 7,000 criminal defendants by a private company's algorithm was found to be wrong about 40 percent of the time, with black defendants falsely rated as future crime risks twice as often as white defendants. The study, by ProPublica, examined how algorithms were used in sentencing. O'Neil also tells how low-income and immigrant students are steered toward for-profit colleges that more sophisticated students would avoid, and how desperate debtors are directed toward payday loans. People seeking low-wage jobs may never know they've been rejected because of an answer they gave on a personality test.
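ProPublica's core comparison was of error rates across groups: among defendants who did not go on to reoffend, how often was each group nonetheless labeled a future crime risk? A minimal sketch of that kind of calculation, using invented records rather than ProPublica's data, might look like this.

```python
from collections import defaultdict

# (group, labeled_high_risk, reoffended) -- invented records for illustration only
records = [
    ("black", True, False), ("black", False, False), ("black", True, True),
    ("black", True, False), ("white", False, False), ("white", True, True),
    ("white", False, False), ("white", True, False),
]

false_positives = defaultdict(int)   # labeled high risk but did not reoffend
non_reoffenders = defaultdict(int)   # all who did not reoffend, by group

for group, high_risk, reoffended in records:
    if not reoffended:
        non_reoffenders[group] += 1
        if high_risk:
            false_positives[group] += 1

for group in non_reoffenders:
    rate = false_positives[group] / non_reoffenders[group]
    print(f"{group}: false positive rate {rate:.0%}")
```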
 
Many of these algorithms are proprietary, so those who use them and those who may suffer from them don't know what information was used to create them. O'Neil suggests a Hippocratic Oath for data scientists. James Vacca, a New York City Councilman from the Bronx, recently introduced a bill that would require the city to publicly release the information contained in algorithms used for government decision making. It could turn out to be a pioneering piece of legislation.
 
 
 
 Another exciting event: 
 Leading Organizations to Health - Begins in November

Organizational change initiatives succeed or fail based on the quality of relationships. Relational problems have been cited as the single biggest obstacle to quality improvement projects. Yet most leaders of change projects are not well prepared for the relational dimension of their work; their training has typically focused on technical analytic tools and linear, control-oriented mindsets. As a result, performance suffers.
To address this gap we created Leading Organizations to Health, a 7-month institute on leading organizational change for senior and mid-level healthcare leaders. LOH offers you the chance to:
  • learn new complexity-inspired ways to think about organizations and how they change,
  • enhance your process-awareness and facilitation skills, and
  • deepen the authentic presence and personal meaning that ground your work and give you the courage to step forward as a leader.
 
LOH consists of 4 long weekends (Thursday evening through Sunday noon) at a Rocky Mountain ranch and monthly small group peer-coaching video-conferences in between sessions. The next cohort begins in November 2017. For more information, please visit www.lohweb.com or contact the faculty members: Tony Suchman ([email protected]) and Diane Rawlins ([email protected]). We hope you'll join us!

The American Academy on Communications in Healthcare and Plexus Institute are co-sponsors.

Keep Connected with Plexus Institute

Follow us on Twitter | View our profile on LinkedIn | Like us on Facebook
 
  Phone: 202-643-0633