The use of predictive analytics, data mining, and automated decision-making in managing welfare in America fails the impoverished and creates an oppressive surveillance system that deters the vulnerable from seeking assistance[1], argues Virginia Eubanks in her 2018 book Automating Inequality. A professor of political science at the University at Albany, SUNY[2], Eubanks adds to a growing chorus of concern[3][4] challenging the idea that computer-based algorithms and systems are unbiased and accurate.

Part of the problem is that the mathematical formulas and programming in these systems are opaque and do not have complete information about each person; the programs substitute other variables, or “proxies,” for the missing information[5][6]. In predictive analytics, for example, a program might declare a person unlikely to repay a loan because of where they live, their level of formal education, or even their language patterns[7]. The resulting decisions are not fully informed and may rest on inaccuracies or assumptions[8]. These systems are called opaque because the variables and proxies – the information that informs the programs – are not known to the average user. How an automated system reaches its decisions is virtually unknowable to the public: every system and program uses different criteria, information, and substitute proxies.
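How a proxy-driven score can go wrong is easy to sketch. The toy Python example below is purely illustrative and is not drawn from the book or from any real lending system; the feature names and weights are invented. It shows a model that never sees actual repayment behaviour and instead scores applicants on stand-in variables such as neighbourhood income and formal education, so two people with identical repayment histories can be treated very differently.

```python
# Hypothetical sketch of a proxy-based credit score (invented features and weights).
# The model has no access to actual repayment history; it only sees proxies.

PROXY_WEIGHTS = {
    "neighbourhood_median_income": 0.5,   # stands in for personal income
    "years_of_formal_education": 0.3,     # stands in for financial literacy
    "formal_language_score": 0.2,         # stands in for "reliability"
}

def creditworthiness_score(applicant: dict) -> float:
    """Return a 0-1 score built entirely from proxy variables (each scaled 0-1)."""
    score = 0.0
    for feature, weight in PROXY_WEIGHTS.items():
        # Missing information silently defaults to 0, dragging the score down.
        score += weight * applicant.get(feature, 0.0)
    return score

# Two applicants who repay their debts equally well, but who live in different
# neighbourhoods and have different schooling, receive very different scores.
applicant_a = {"neighbourhood_median_income": 0.9,
               "years_of_formal_education": 0.8,
               "formal_language_score": 0.9}
applicant_b = {"neighbourhood_median_income": 0.2,
               "years_of_formal_education": 0.4,
               "formal_language_score": 0.3}

print(creditworthiness_score(applicant_a))  # ≈0.87: likely approved
print(creditworthiness_score(applicant_b))  # ≈0.28: likely denied
```

None of these numbers describes repayment behaviour; the opacity the book describes comes from the fact that affected users rarely know which proxies and weights are in play.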

The other part of the problem is the faith people place in automated decision-making: the assumption that the program or system is accurate and fair is not always true[9]. The belief that the computer must be right can cause people to second-guess their own experience, knowledge, and judgement[10]. As Eubanks explains, hybrid systems that give workers greater decision-making authority over the programs can reduce these instances[11], and such systems are already in use.

Instead of helping people in need, these unintentional flaws may worsen the problem[12]. These systems compound the misery of poverty, create systemic racism and bias, and perpetuate the established notion that poverty is less a matter of social commitment than of personal fault[13]. In much the same way as the industrial-era Poorhouses of the 19th and 20th centuries acted as a moralized deterrent to seeking poverty relief[14] by separating families[15] and exploiting the poor as a labour force[16], the use of automated systems in modern poverty management continues to deter, exclude, and dehumanize the vulnerable. These modernized systems discourage and divert users from public benefits and erode civil liberties such as the right to mobility. The moralized suffering of poverty – the idea that poverty is a personal fault or character flaw – becomes grounds for racist and classist hierarchical structures that would not be tolerated by the wealthier classes[17].

Using ethnographies and interviews with both the administrators of automated welfare systems and the users who live under them, Eubanks shows how these modern systems create what she calls “the Digital Poorhouse.”

Chapter Summary

From Poorhouse to Database

The Poorhouse was a communal institution of last resort for the poverty-stricken. The infamous Rensselaer County House of Industry – the first Poorhouse built in Troy, New York – was an institutionalized nightmare, where the mentally ill slept on urine-soaked hay, lacked access to sanitation, and were confined in 4.5 x 7-foot cells for upwards of six months at a time[18].

The economic depression of 1819 fuelled a growing upper- and middle-class fear of the public’s increasing reliance on social assistance, or “pauperism”[19]. This concern led to Josiah Quincy III’s idea of separating the needy into two primary groups: those deserving of charity, and those who could work for a living, the “able poor.” In Quincy’s mind, the worthy were the elderly and infirm, infants, and those with a “corporeal disability”; the rest could work to some degree or another. Those deemed able who nonetheless failed to work were heaped with scorn[20].

The advent of the scientific charity movement created the perspective that families in need of public benefits were curious cases in need of solving[21], further increasing public distrust and suspicion of welfare users.

Collecting information and creating databases about poor people is not a new development. During the eugenics movement in America, social scientists began interviewing, photographing, fingerprinting, and evaluating the poor across the country. They inventoried children, mapped out family trees, and measured people’s heads. They investigated and identified people’s sexual histories and preferences, as well as their personal habits and behaviours. The notes from these studies were moralized, describing poor people as “imbecilic,” “feeble-minded,” “harlots,” and “dependent”[22].

Modern welfare in America still reflects some of the values and practices of the Poorhouse, the scientific charity movement, and the eugenics movement. Service workers are still referred to as “caseworkers” and clientele are “cases”; the poor continue to have databases of information about their personal lives generated, compiled, evaluated, and shared with others[23]. Like the institutional deterrent of the Poorhouse, the use of predictive analytics, data mining, and automated decision-making diverts and discourages the poor and vulnerable from seeking and getting the help they need[24].

Automating Eligibility in the Heartlands

Turning her attention to the state of Indiana, the author surveys the problematic evolution of automated welfare services, from system-wide crashes to court cases and protests. The tragic failures of the system are also featured, such as the case of Omega Young, whose medical and public benefits were cancelled because she failed to attend a meeting; at the time, she was hospitalized and receiving treatment for terminal cancer[25].

The reason why thousands in Indiana were denied assistance or purged from the welfare rolls, Eubanks argues, is the classist and racist assumptions about the poor that persist today: the suspicion that people needing help are lazy and dishonest, and that they should be discouraged from seeking it[26].

Although the system has improved by adopting a hybrid model rather than a more fully automated option[27], poverty continued to increase even as public benefit use declined[28].

High-Tech Homelessness in the City of Angels

Los Angeles, California, has had a long-standing problem with homelessness[29]. A coordinated entry system – matching the homeless with available shelter – was intended to make housing people easier by cutting red tape and promoting stabilized housing[30]. However, applicants’ lack of stable rent history and low or nonexistent credit kept landlords from housing those in need[31]. Unfortunately, the homelessness problem continues[32].

Data collected about users of the program – including personal history, photographs, Social Security numbers, and known hang-outs[33] – is then shared with 168 different organizations, ranging from charities to government agencies to police services.

An additional concern is the lax or nonexistent protection of this data. Before 1996, access to state welfare records such as these required due legal process: police and other services needed warrants and approval to see the files. After the welfare reforms of 1996, the homeless lost the due process afforded to other Americans, in that police could access a person’s welfare history on request. One example is Operation Talon, in which American authorities matched data on food stamp recipients against a database of outstanding warrants; recipients with any warrant were called in under the pretense of a meeting and were instead arrested[34].

This erosion of civil rights is only perpetrated against the poor and vulnerable, Eubanks carefully notes. Other sensitive data, such as mortgage histories or student loans, still require warrants for the authorities to access them[35]. The more affluent classes would not tolerate these intrusions and infringements of rights.

The Allegheny Algorithm

The Office of Children, Youth and Families in Allegheny County, Pennsylvania – the county that includes Pittsburgh – used predictive analytics to estimate the likelihood and severity of alleged child maltreatment. Included in the algorithm were two proxy variables: the frequency of reports of mistreatment, and whether the child was with their biological parents or in the care of another[36]. Although touted for its “fair to good” accuracy, the predictions had a 70% error rate[37].

That inaccuracy created false positives, resulting in state surveillance and intervention when they were not required and potentially exacerbating already difficult circumstances[38]. An even greater danger arose when the program under-rated the severity of maltreatment, possibly overlooking children in real need[39].
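The trade-off between these two kinds of error can be sketched with a small, purely illustrative example. The Python snippet below is not the actual Allegheny screening tool; the scoring function and threshold are invented, and only the two proxy variables named above (report frequency and whether the child lives with biological parents) are borrowed from the chapter. It shows how a single cut-off over a proxy-based score produces both false positives and false negatives.

```python
# Toy illustration (not the real Allegheny model): a proxy-based risk score
# with a hard threshold, showing both error types discussed in the chapter.

def maltreatment_risk(report_count: int, with_biological_parents: bool) -> float:
    """Return a 0-1 risk score built from two proxy variables (invented weights)."""
    score = min(report_count, 10) / 10 * 0.7           # more reports -> higher score
    score += 0.0 if with_biological_parents else 0.3   # out-of-home care -> higher score
    return score

THRESHOLD = 0.5  # cases scoring at or above this value are flagged for investigation

# (report_count, with_biological_parents, actual_maltreatment)
cases = [
    (6, False, False),  # heavily reported, no actual maltreatment -> false positive
    (1, True,  True),   # rarely reported, genuine maltreatment    -> false negative
]

for reports, with_parents, actual in cases:
    flagged = maltreatment_risk(reports, with_parents) >= THRESHOLD
    outcome = ("false positive" if flagged and not actual
               else "false negative" if not flagged and actual
               else "correct")
    print(f"reports={reports}, with_parents={with_parents}: flagged={flagged} ({outcome})")
```

Because the score is built from how often a family is reported rather than from what actually happened, heavily scrutinized families are over-flagged while under-reported harm slips below the threshold – the pattern Eubanks describes.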

Like LA’s homeless, families in need in Pennsylvania face enhanced data investigation compared to wealthier families. Social workers mine social media sites and public benefit databases for a more detailed evaluation of the family[40]. What is problematic for the author is that while public benefit-usage records are used to inform the caseworker, private or out-of-pocket service records are not included[41]. The issue is not that Eubanks thinks all service records – counselling, therapy, and so on – should be made available. Rather, because people in need access supportive services through public programs, those private details end up in the worker’s evaluation, while people who can afford private services do not have that history included in the caseworker’s assessment. This both potentially endangers children and privileges the appearance of the more affluent family; the family in need, by previously seeking help, appears worse because there is a paper trail of supportive services. As Eubanks argues, the system thereby demonstrates both the lack of privacy afforded the poor and the biased idea that richer families are more deserving of privacy[42]. In essence, this creates an algorithmic bias against the poor.

The inclusion of data from previously accessed public benefit services can itself be a deterrent to seeking further aid. The growing – and lasting – history of use and the erosion of privacy may cause people to refrain from getting help[43][44].

The Digital Poorhouse (and How to Dismantle It)

For Eubanks, the primary roadblock to creating a more equal and equitable society is that our ethical treatment of and regard for the poor have not evolved nearly as quickly as technology has[45]. In the context of the welfare safety net, this means that society cares less about the suffering of the impoverished and more about the perceived threat they pose to the middle and upper classes[46].

Eubanks offers two recommendations for dismantling the Digital Poorhouse and improving outcomes for people needing social assistance: a set of “Principles of Non-Harm for Big Data,” and the implementation of a guaranteed basic income. Intended for the digital age, the principles amount to an oath of respect for and consent of end-users, the acknowledgement and removal of biases and barriers to seeking aid, and the use of the welfare state as a mechanism to help people rather than surveil them[47].

A guaranteed basic income is precisely what the name suggests: a modest income provided by the government without conditions, as a replacement for the current welfare system[48]. Advocates of the plan claim that a “strings-free” income could reduce financial stress in low-wage households, encourage educational pursuits, and eradicate the stigma associated with welfare[49][50].

Critical Reception

Writing about all the issues involved in poverty in contemporary America – including race and racism – would make for a very thick book indeed. Although Eubanks does specifically write about the disproportionate representation of and impact on minorities[51], some have stated that the role of race and its connection to discrimination were not explored critically enough[52]. The book has also been criticized for failing to explore the relationship between social services data and police action[52].

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor successfully bridges the gap between academic research and mainstream reading thanks to Eubanks’s accessible writing[53][54]. Described as “riveting”[55], the book has raised questions and concerns about the invasive and expansive web of surveillance that service users find themselves under, and, more directly, the fear that what is applied to the vulnerable may just as easily be applied to the rest of the populace[56].

References

  1. Eubanks, Virginia (2019). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (First Picador ed.). New York: Picador, St. Martin's Press. pp. 183, 193. ISBN 1-250-21578-1. OCLC 1050280177.
  2. Featherstone, Liza (2018-05-04). "How Big Data Is 'Automating Inequality'". The New York Times. ISSN 0362-4331. Retrieved 2020-04-08.
  3. Noble, Safiya Umoja (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press. ISBN 978-1-4798-3724-3. OCLC 987591529.
  4. O'Neil, Cathy (2017). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Penguin. ISBN 978-0-14-198541-1. OCLC 991124136.
  5. Eubanks (2019), pp. 144–146.
  6. O'Neil (2017), pp. 17–18.
  7. O'Neil (2017), pp. 18, 110.
  8. Eubanks (2019), pp. 138, 144.
  9. Eubanks (2019), pp. 141, 179.
  10. Dare, Tim (April 2017). "Section 2 - Ethical analysis: Predictive risk models at call screening for Allegheny County" (PDF). Allegheny County Analytics.
  11. Eubanks (2019), p. 75.
  12. Eubanks (2019), p. 169.
  13. Eubanks (2019), p. 199.
  14. Katz, Michael B. (1996). In the Shadow of the Poorhouse: A Social History of Welfare in America (10th anniversary, rev. and updated ed.). New York: BasicBooks. p. 5. ISBN 978-0-465-02452-0. OCLC 727647996.
  15. Eubanks (2019), p. 18.
  16. Eubanks (2019), p. 15.
  17. Eubanks (2019), p. 183.
  18. Eubanks (2019), p. 15.
  19. Eubanks (2019), p. 17.
  20. "Joint Comm. on Pauper Laws. Report, 1821". www.llmc.com. 1821. Retrieved 2020-04-10.
  21. Eubanks (2019), pp. 21–22.
  22. Eubanks (2019), p. 23.
  23. Eubanks (2019), pp. 93–94.
  24. Eubanks (2019), p. 178.
  25. Eubanks (2019), pp. 77–78.
  26. Eubanks (2019), p. 81.
  27. Eubanks (2019), p. 75.
  28. Eubanks (2019), p. 82.
  29. Holland, Gale (2019-06-05). "Why L.A. County's homelessness crisis has been decades in the making". Los Angeles Times. Retrieved 2020-03-12.
  30. Eubanks (2019), pp. 84, 92.
  31. Eubanks (2019), p. 101.
  32. Holcombe, Madeline (2020-03-14). "Homeless Californians join in a lawsuit to mandate Los Angeles provide shelter for thousands". CNN. Retrieved 2020-04-14.
  33. Eubanks (2019), pp. 93–94.
  34. Eubanks (2019), p. 116.
  35. Eubanks (2019), pp. 116–117.
  36. Eubanks (2019), p. 144.
  37. Eubanks (2019), p. 137.
  38. Eubanks (2019), p. 169.
  39. Eubanks (2019), p. 146.
  40. Eubanks (2019), p. 141.
  41. Eubanks (2019), p. 146.
  42. Eubanks (2019), p. 167.
  43. Eubanks (2019), p. 158.
  44. Srinivasan, Janaki (2018). "Privacy at the Margins | The Poverty of Privacy: Understanding Privacy Trade-Offs From Identity Infrastructure Users in India". International Journal of Communication. 12: 1242 – via USC Annenberg.
  45. Eubanks (2019), p. 217.
  46. Eubanks (2019), p. 182.
  47. Eubanks (2019), pp. 212–213.
  48. Van Parijs, Philippe (March 2004). "Basic Income: A Simple and Powerful Idea for the Twenty-First Century". Politics & Society. 32 (1): 4. doi:10.1177/0032329203261095. ISSN 0032-3292 – via SagePub.
  49. Calnitsky, David (February 2016). ""More normal than welfare": The Mincome Experiment, stigma, and community experience". Canadian Review of Sociology/Revue canadienne de sociologie. 53: 32 – via Wiley.com.
  50. Forget, Evelyn L. (December 2014). "Reconsidering a guaranteed annual income: lessons from MINCOME". ResearchGate. Retrieved 2019-03-22.
  51. Eubanks (2019), p. 153.
  52. Bevan, Jillian (2020-03-31). "Eubanks, Virginia, Automating Inequality". Canadian Journal of Sociology. 45.
  53. Betts-Green, Dawn (2018-03-11). "Book Review: Automating Inequality". The International Journal of Information, Diversity, & Inclusion. 2: 89 – via University of Toronto.
  54. Bevan, Jillian (2020-03-31). "Eubanks, Virginia, Automating Inequality". Canadian Journal of Sociology. 45: 93 – via University of Alberta.
  55. Featherstone, Liza (2018-05-04). "How Big Data Is 'Automating Inequality'". The New York Times. Retrieved 2020-04-03.
  56. Nam, Michael (2018-01-23). "'Automating Inequality' warns of a dystopian future punishing the poor — in the present: book review". New York Daily News. Retrieved 2020-04-03.

Further Reading

  • Noble, Safiya (2018) Algorithms of Oppression. New York University Press. ISBN 978147984994
  • O'Neil, Cathy (2016) Weapons of Math Destruction. Crown Books. ISBN 0553418815
  • Pasquale, Frank (2015). The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press. ISBN 978-0-674-36827-9