Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

Category: Mathematics
Author: Cathy O'Neil
Rating: 4.1
Hacker News mentions this year: 4
Reddit mentions this month: 2

Comments

by doodcool612   2021-12-10

There's a great book called Weapons of Math Destruction. If you're interested in these kinds of problems, this is a quick resource to get up to speed.

by race_bannon   2021-12-10

> make his claims but not provide source code showing how a bias could be hidden in an algorithm without it being immediately obvious to many coders at Google

Because with machine learning and AI, even the developers don't understand how the decisions are made.

You should read Weapons of Math Destruction by Cathy O'Neil, which goes into how biased training data, biased programmers, and so on can result in biased algorithms. It's pretty fascinating.
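To make that concrete, here is a minimal sketch (my own illustration with synthetic data and scikit-learn, not anything from the book): nothing in the code below mentions a protected attribute, yet the learned model still disadvantages one group, because the bias lives in the historical training data and a proxy feature rather than in the source code.

```python
# Minimal sketch: the code contains no explicit bias, but the model learns one.
# All data here is synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

group = rng.integers(0, 2, n)              # protected attribute; never shown to the model
zip_code = group + rng.normal(0, 0.3, n)   # "neutral" feature that happens to proxy for group
income = rng.normal(50, 10, n)

# Historical labels reflect past discrimination against group 1.
approved = (income + 10 * (1 - group) + rng.normal(0, 5, n)) > 55

X = np.column_stack([zip_code, income])    # the model only ever sees these two features
model = LogisticRegression().fit(X, approved)

pred = model.predict(X)
for g in (0, 1):
    print(f"group {g} approval rate: {pred[group == g].mean():.2f}")
# The disparity in the output comes entirely from the training data and the
# proxy feature, not from anything a code reviewer would flag as "biased code".
```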

by randcraw   2021-10-15
"Explainable" just means you didn't build your app/service using any technology that isn't interpretable. Her argument is that the strategy of reverse engineering a method to convert it from inexplicable to explicable is inherently less effective than maintaining explicability at all times in the app's genesis -- from the design phase through implementation.

But Rudin's Premise is more philosophical than practical. If the problem at hand is better solved by a black box (in terms of accuracy, precision, robustness, etc.), her premise says, simply: don't do it. Unfortunately, in the cutthroat world of capitalism, that strategy can't compete with the cutting edge.

Where Rudin's Premise is more suitable is in writing regulations to address AI applications where social unfairness goes unchecked (like COMPAS, which advises legal authorities on parole decisions without explaining its reasoning). There are many such (ab)uses of AI today in social services and policing that merit rethinking, since AI-based injustice is so often compounded by the proprietary lack of transparency in such apps.

Another excellent discussion of problems like these is Cathy O'Neil's book "Weapons of Math Destruction". Too bad she couldn't share the Squirrel prize. https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

by denzil_correa   2017-10-20
> In other words, by the time we notice something troubling, it could already be too late.

For me, this is the key motivating point: the horse may have left the barn by the time we act. People often dismiss this as exaggeration, but "Weapons of Math Destruction" is a nice read on the unintended side effects of this phenomenon [0].

[0] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

by jasode   2017-08-19
The "critical thinking" Rob Kitchin is talking about is analyzing algorithms' impact with a social lens. Because algorithms affect people's lives, we shouldn't be content with letting them be opaque black boxes.

It seems to overlap with the themes of Ed Finn's book "What Algorithms Want: Imagination in the Age of Computing".[1]

Both say that algorithms are intensely studied from a technical perspective, e.g. O(log n) is better than O(n^2), etc.

Their idea is that the algorithms themselves are creating their own "culture" or "reality" and this should be studied through the lens of "humanities" or "sociology" instead of just "mathematics".

E.g. a neural net or statistical model computes that Person A is a better credit risk than Person B. However, observers notice that Person B is always black and therefore claim that the algorithms are (re)creating racial inequality. Or algorithms that provide sentencing guidelines for convicted felons. Or algorithms that diagnose medical problems.
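As a rough sketch of the kind of outside audit those observers are doing (hypothetical numbers, not from any of these books): given only the system's decisions and each person's observed group, you can compare approval rates per group and compute the "four-fifths" disparate-impact ratio.

```python
# Hypothetical audit: all numbers are made up for illustration.
import numpy as np

# Decisions as observed from a deployed system (1 = approved) and the
# group each applicant belongs to, as noticed by outside observers.
decisions = np.array([1, 0, 1, 1, 1, 0, 1, 0, 1, 1])
group     = np.array(["A", "B", "A", "A", "B", "B", "A", "B", "A", "B"])

rates = {g: decisions[group == g].mean() for g in np.unique(group)}
print("approval rates by group:", rates)

ratio = min(rates.values()) / max(rates.values())
print(f"disparate impact ratio: {ratio:.2f} (below 0.8 is a common red flag)")
```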

Other writings with somewhat similar themes:

- Cathy O'Neil, "Weapons of Math Destruction - How Big Data Increases Inequality and Threatens Democracy"[2]

- Eli Pariser, "The Filter Bubble"[3]

There doesn't seem to be a universal term coined that generalizes the ideas in all 4 of those books, but I'm sure more and more writers will notice they are talking about similar ideas.

Side observation about language usage... What I notice in all 4 books is that the authors use the word "algorithms" as a catch-all term for "machine learning". They're not really concerned with building-block algorithms such as "quicksort" or the "discrete Fourier transform". What they're all talking about is that Facebook's machine learning is imposing X on us, or that Google's machine learning is making us think Y. For some reason, the word "algorithm" has gained more currency than "machine learning" in these pop science books.

[2] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

[3] https://www.amazon.com/Filter-Bubble-Personalized-Changing-T...

by WillPostForFood   2017-08-19
Calling BS on big data is really important, but this article is weak. The New Yorker should be doing better. Try Weapons of Math Destruction by Cathy O'Neil for a much more informed critique.

https://www.amazon.com/Weapons-Math-Destruction-Increases-In...