Doubling Down on Algorithmic Bias and the Implications for Workers of Color
Here at Greenlining, we spend a lot of time thinking about the gig economy—ride services like Uber and Lyft, food delivery services like Caviar and DoorDash, home rental services like Airbnb, and the grocery delivery company Instacart. Most of us don’t often think about the gig economy as a racial justice issue, but we should: Thirty-one percent of Hispanic adults and 27 percent of African American adults earn money through the gig economy, compared to 21 percent of White adults.
For the most part, we’re not fans of the gig economy. Multiple studies show that the gig economy is bad for traffic, bad for the environment, bad for small businesses, and bad for workers. Really bad for workers. Lately, there has been a deluge of articles discussing how gig companies abuse workers by improperly classifying them as independent contractors. While this is certainly the highest-profile example of treating gig workers unfairly, it’s far from the only one. Gig companies have also been accused of stealing workers’ tips, blocking the creation of worker unions, and, as recently reported by Bloomberg, forcing workers to take jobs that aren’t worth it.
That Bloomberg article is worth a read. It describes how Instacart makes it very hard to decline a job and punishes workers who do. Instacart doesn’t give workers a “cancel” or “decline” option in the app; instead, the worker has to listen to four minutes of beeping before the app cancels the job. Additionally, the Instacart app tracks how often a worker declines a delivery, and if the worker declines too many, the app will stop offering that worker the more desirable jobs, or even cancel the worker’s shift entirely. It’s important to note that this entire process is automated: the app uses an algorithm to decide which jobs a worker gets and whether to cancel a worker’s shift. There’s no human review of those decisions, which means, for example, that a worker can get dinged for rejecting jobs because their car broke down.
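To make the mechanism concrete, here is a toy sketch of the decline-tracking workflow the Bloomberg article describes. Every name, number, and threshold below is hypothetical—this is not Instacart’s actual code, just an illustration of what an automated, no-human-review punishment loop looks like:

```python
# Toy model of the decline-tracking workflow described above.
# All names and thresholds are made up -- this is NOT Instacart's code.

from dataclasses import dataclass

DECLINE_LIMIT = 3  # hypothetical cutoff

@dataclass
class Worker:
    name: str
    declines: int = 0
    shift_cancelled: bool = False

def record_decline(worker: Worker) -> None:
    """Each decline is logged automatically; no human ever reviews the reason."""
    worker.declines += 1
    if worker.declines > DECLINE_LIMIT:
        worker.shift_cancelled = True  # punitive step, applied blindly

def offer_job(worker: Worker, desirable: bool) -> bool:
    """Desirable jobs stop flowing once the decline count passes the limit."""
    if worker.shift_cancelled:
        return False
    if desirable and worker.declines >= DECLINE_LIMIT:
        return False  # better jobs withheld
    return True

# A worker whose car broke down declines four jobs in a row...
w = Worker("courier")
for _ in range(4):
    record_decline(w)

# ...and is cut off, with no way to explain the circumstances.
print(offer_job(w, desirable=True))  # False
print(w.shift_cancelled)             # True
```

The point of the sketch is the last two lines: the system has no input for *why* a job was declined, so a breakdown and bad-faith cherry-picking are punished identically.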
Instacart’s gotten some heat for these practices, and the company says it’s taking steps to make things better, including, per a statement quoted in the Bloomberg article, factoring customer reviews into which workers get which jobs.
I suppose it’s nice that Instacart is thinking about ways to address the flaws in its algorithm. However, the idea of using customer reviews to decide which workers get more desirable jobs demonstrates that Instacart hasn’t thought about racial equity at all. The company is ignoring the fact that customers consistently discriminate against workers of color: taxicab customers, for example, tip Black drivers less than White drivers, and many consumers rate workers of color lower than White workers even when the workers provide identical levels of service. This is the bias blind spot at work: people don’t acknowledge that they, or their customers, could be biased. If Instacart makes this proposed change, it’s almost certain that the Instacart app will assign workers of color less desirable jobs and cancel their shifts more frequently, a fact that should make company lawyers very, very nervous.
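A tiny simulation shows why a review-based rule launders customer bias into job assignments. The ratings here are invented solely to illustrate the mechanism (they are not measured data): two groups of workers provide identical service, but one group is rated slightly lower by biased customers, and a rank-by-rating assignment rule then hands every desirable job to the other group:

```python
# Toy simulation: identical service, biased ratings, review-based assignment.
# All numbers are made up to illustrate the mechanism, not measured data.

def assign_desirable_jobs(workers, n_jobs):
    """Review-based rule: the highest-rated workers get the desirable jobs."""
    ranked = sorted(workers, key=lambda w: w["rating"], reverse=True)
    return {w["id"] for w in ranked[:n_jobs]}

# Two groups doing *identical* work, but group B is rated slightly
# lower by biased customers (the pattern the studies above describe).
workers = (
    [{"id": f"A{i}", "rating": 4.8} for i in range(5)]    # group A
    + [{"id": f"B{i}", "rating": 4.5} for i in range(5)]  # group B, same work
)

winners = assign_desirable_jobs(workers, n_jobs=5)
print(winners)  # the set contains only group A's ids
```

Note that the assignment rule itself is “neutral”; the disparity comes entirely from the biased inputs, which is exactly why a review-driven algorithm can produce discriminatory outcomes without any explicitly discriminatory code.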
This whole saga is grounded in a very simple fact about tech companies, one with huge implications for the gig economy: For all their talk of “judging purely on merit” and “data-driven decision-making,” they are remarkably bad at acknowledging implicit bias, the unconscious attitudes or stereotypes we hold about other people. Instacart’s current system reflects automation bias: the tendency to treat automated or computerized decisions as more reliable than decisions made by humans. By switching to a system based on customer reviews, the company would be falling prey to the bias blind spot: the failure to acknowledge that customers are tainted by bias, and that their reviews are therefore biased as well. Instacart’s solution, in a nutshell, is to fix problems with its app by adding more bias. Instacart, like virtually every other tech and gig economy company, should know better.