Analytics and GIS: Discriminating Discriminants
This is part four in a five-part series on Analytics and GIS. We’ve previously seen a position for a Road Safety Predictive Analyst, that roads can be scored, and that hazard can be modeled with existing tools. Today, we hit ethics.
Discrimination
You may have heard about Microsoft’s GPS patent, the so-called ghetto avoidance algorithm. It caused a bit of a furor. Concretely:
- Microsoft filed a patent for a GPS-guided walking app.
- The app includes an algorithm that lets the user avoid neighborhoods whose crime statistics exceed a threshold (a sketch of that kind of filter follows this list).
- Some people say that it’s racist and prejudicial.
- Other people say that it has nothing to do with race or prejudice.
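For concreteness, here’s a minimal sketch of the kind of filter described in the patent, assuming a street graph whose edges carry a hypothetical crime_rate attribute. It’s an illustration of the idea, not Microsoft’s actual algorithm.

```python
# A hypothetical crime-threshold routing filter -- an illustration of the
# patent's idea, not Microsoft's actual algorithm.
import networkx as nx

def safe_route(graph, origin, destination, crime_threshold=0.8):
    """Shortest walking path that excludes edges whose (hypothetical)
    crime_rate attribute exceeds the threshold."""
    allowed = [
        (u, v)
        for u, v, data in graph.edges(data=True)
        if data.get("crime_rate", 0.0) <= crime_threshold
    ]
    # Route on the filtered subgraph. If the exclusions disconnect the
    # origin from the destination, this raises NetworkXNoPath.
    subgraph = graph.edge_subgraph(allowed)
    return nx.shortest_path(subgraph, origin, destination, weight="length")
```

Everything contentious lives in two choices the code takes for granted: where crime_rate comes from, and who sets the threshold.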
Like People Clump Alike
People who are alike tend to live in similar places. So it’s no surprise that wealth, education, religion, ethnicity, and profession tend to clump together. And sometimes they clump in ways that planners don’t foresee or forecast.
There’s an effect in municipal politics whereby nice areas get nice things while poor areas don’t. It’s why some neighborhoods get traffic calming while others get one-way roads stabbed through their hearts. Everybody has somewhere to go, and they want to get there fast, but they don’t want that traffic going through their own neighborhood.
Indeed, we’ve had a recent outbreak of tension between commuters and residents in Toronto:
“When I went door to door in Ward 16, I had a woman who almost jumped through the screen at me. She told me … ‘The one thing that the city has done that has changed my life is the bike lanes on Jarvis. I bought this house, I have four children, and I can’t get home to them for dinner.’” – Source.
Speed, Safety, Efficiency, and Political Interests
In many ways, certain cities may explicitly choose to tolerate a higher incidence of car-on-pedestrian fatalities and serious injuries in depressed communities just so that others can get to their children sooner. Some people say that in some cities, like Berkeley, pervasive traffic calming has made the road network unusable.
In many ways, the derivation of road hazard rates may lay bare the reinforcing relationships among traffic, real estate prices, and political influence in a given area.
Discrimination by Accident
If a machine learning algorithm is left alone, with no subject matter expertise guiding it, it can easily generate discriminatory policies against neighborhoods that are unsafe from a road safety perspective and that happen to be concentrated along some socio-economic line. The machine isn’t a bigot. It’s the complete opposite: it can’t see those factors. It only sees the features it’s fed, and when those features are proxies for income or ethnicity, it reproduces the pattern without ever being shown it.
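Here’s a toy sketch of that accident, on synthetic data. The model is trained only on two invented road features, street lighting and cut-through traffic; it never sees income. Its hazard scores track income anyway, because the features themselves are confounded with it.

```python
# Discrimination by accident, on synthetic data: the model never sees
# income, but its features are confounded with income, so its scores
# track income anyway. All numbers here are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 1000

income = rng.normal(size=n)                               # never shown to the model
lighting = 0.7 * income + rng.normal(scale=0.5, size=n)   # richer areas are better lit
traffic = -0.6 * income + rng.normal(scale=0.5, size=n)   # poorer areas carry cut-through traffic
hazard = 0.5 * traffic - 0.4 * lighting + rng.normal(scale=0.3, size=n)

# Train on road features only -- income is not in X.
X = np.column_stack([lighting, traffic])
model = LinearRegression().fit(X, hazard)

# The scores correlate strongly (negatively) with the unseen variable.
print(np.corrcoef(model.predict(X), income)[0, 1])  # about -0.8
```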
If a subject matter expert knows about these reinforcing variables, attempts to control for them, and then fails, it could be argued that they’re a bigot. And good luck explaining something so nuanced in the newspapers. You just can’t do it.
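Continuing the toy sketch above (it reuses lighting, traffic, income, hazard, X, and model from that block), the standard move is to control for the socio-economic variable by adding it as a covariate. On paper, the control looks clean; in deployment, the score still runs through the proxy features, and the disparity survives.

```python
# The standard "control for it" move, continuing the sketch above:
# add income as a covariate and inspect its coefficient.
X_controlled = np.column_stack([lighting, traffic, income])
controlled = LinearRegression().fit(X_controlled, hazard)

# The coefficient on income comes out near zero: on paper, income
# "explains nothing" once the road features are in the model.
print(controlled.coef_[2])

# But any deployed score is built from the proxy features, so the
# disparity in who gets flagged survives the control.
print(np.corrcoef(model.predict(X), income)[0, 1])  # still about -0.8
```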
This is a major problem in the science of policy analytics.
It’s an important one to acknowledge.
This is part four in a five-part series on Analytics and GIS. Tomorrow, we’ll look at other applications.
***
I’m Christopher Berry.
I tweet about analytics @cjpberry
I write at christopherberry.ca