New study finds AV tech less likely to detect darker-skinned pedestrians

By Marybeth McGinnis

Clarification: This article should have made clear that the study “Predictive Inequity in Object Detection” examines camera-based machine-learning object detection systems, which are one component of full camera-lidar-radar based detection systems. In addition, the paper on which the article is based has not yet been peer-reviewed. At the same time, we would underline the statement in this piece that more relevant analyses could be conducted with access to the actual algorithms and training datasets, which AV developers have not made available for review.

A recent study found that leading automated detection systems are less accurate at detecting pedestrians with darker skin tones.

On average, the study found that detection was five percentage points less accurate for dark-skinned pedestrians than for light-skinned ones. This points toward the concept of predictive inequity: biases in automated technology models that lead to worse outcomes for people with darker skin. This is not the first study to raise serious questions about the inequitable and potentially dangerous impacts of AV and machine-learning technology.
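To make the reported disparity concrete, here is a minimal sketch of how a per-group detection-accuracy gap could be computed. This is not the study's code, and the counts below are hypothetical, chosen only to illustrate a five-point gap of the kind the paper describes.

```python
def detection_rate(detected: int, total: int) -> float:
    """Fraction of pedestrians in a group that the detector found."""
    return detected / total

# Hypothetical detection counts for two skin-tone groups
light_rate = detection_rate(190, 200)  # 95% of light-skinned pedestrians detected
dark_rate = detection_rate(180, 200)   # 90% of dark-skinned pedestrians detected

# The disparity, in percentage points, between the two groups
gap_points = (light_rate - dark_rate) * 100
print(f"accuracy gap: {gap_points:.1f} percentage points")
```

A gap computed this way is a simple group-fairness metric; the study's actual analysis is more involved, but the underlying comparison is of this form.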

One critique of the study could be that the researchers did not use the datasets of the companies that manufacture AVs. That, however, underscores the point: companies do not make their algorithms and training datasets publicly available, which makes equitable safety and independent oversight difficult to ensure.

These questions have never been more important. As AV technology continues to push forward, the country is in a moment of questioning what, and for whom, a “safe street” is. It is critical for transportation policymakers and engineers to consider the impacts on black and brown people when adopting and implementing new technologies. AV policy needs to ensure the safety of vulnerable road users, and to ensure that safety is equal regardless of a pedestrian’s skin tone.

Marybeth McGinnis is an outreach specialist at SSTI.