(Credit: Stamen Design)
It’s no secret that freeway proximity hurts property values, while well-maintained green space has the opposite effect. But a new tool that lets users move elements of a given cityscape around — basketball courts, solar panels, parking lots — to see how they affect a neighborhood’s predicted median income goes beyond those intuitive correlations for a more layered view that, correct or not, offers some interesting insights into both city-planning basics and artificial intelligence (AI).
The tool is called Penny, and it’s billed as “an AI to predict wealth from space.” Created by Stamen Design and Carnegie Mellon University, Penny uses high-resolution satellite imagery courtesy of GBDX (an analytics platform from DigitalGlobe) and neural networks trained on both census data and the imagery “to learn which features in the satellite images are correlated with household income,” according to a release.
Go to the Penny website and you can look down from above on either New York City or St. Louis. Move around to different neighborhoods to see how the AI predicts median income, and how that compares with census data. The two don’t always match. In New York, for example, one area northwest of Central Park and the American Museum of Natural History, classified as “high income” by census data, is labeled “medium-high” by the tool.
From there, you can drag and drop city features onto the area to see how they impact income. In the New York neighborhood, a freeway, of course, makes the tool predict that the area will fall to “medium-low.” A parking lot isn’t favorable either, decreasing the tool’s confidence in its “medium-high” ranking by 19 percentage points.
Some features are less intuitive. Add a tennis court, supposedly a symbol of wealth and leisure, and Penny’s confidence in the “medium-high” score drops by 11 points. Add some trees and confidence also decreases. Add solar panels and the tool’s confidence level stays the same.
And those effects aren’t consistent: the percentage-point changes depend on the area. Add a tennis court to a neighborhood classified as “medium-high” in St. Louis and the tool registers no change in confidence.
As a recent Wired article points out, that’s not necessarily because Penny has some hidden knowledge that we mere mortals fail to grasp. Sometimes Penny’s inconsistencies highlight the limits of what AI can do, but it’s hard to pinpoint exactly when that’s the case.
Dropping the Plaza Hotel into Harlem makes Penny even more sure that it’s a low-income area. Adding trees doesn’t help, either. Scenarios in which the AI defies intuition highlight both the power and the limitations of any system based on machine learning. “We don’t know whether it knows something that we haven’t noticed, or if it’s just plain wrong,” says Aman Tiwari, the Carnegie Mellon University computer scientist who trained the AI.
So which is it? Hard to say. “Sometimes an AI does amazing things, or locks onto some very intelligent solution to a problem, but that solution is inscrutable to us, so we don’t understand why it’s behaving in counterintuitive ways,” says Jeff Clune, a University of Wyoming computer scientist who studies the opaque inner workings of neural networks. “But it’s simultaneously true that these networks don’t know as much as we think they know, and they often fail in bizarre or baffling ways — which is to say they make predictions that are wildly inaccurate when it’s obvious they shouldn’t be doing so.”
In the end, Penny is as much about exploring AI as playing Urban Planner God.
“We’re hoping Penny provokes a lively conversation about how artificial intelligence is being used to make sense of our world,” Jordan Winkler, ecosystem product manager at DigitalGlobe, said in an announcement about the tool. “Sometimes Penny sees the world just like we do, and sometimes quite differently, in sometimes useful, and sometimes curious ways. It is as powerful as it is playful.”