The Intelligence is Artificial. The Bias Isn’t.

In 2002, the Wilmington, Delaware, police department made national news when it adopted a new technique – “jump out squads.”  Officers would drive around the city in vans, jump out in high-crime areas, and take pictures of young people.  The point of these impromptu photo sessions was to build a database of supposed future criminals.

If this plan sounds offensive, imagine if it were aided by facial recognition technology or other forms of artificial intelligence. 

Now, seventeen years after the Wilmington Police used vans and Polaroids, police departments have artificial intelligence at their disposal.  They use AI in a variety of ways: crime forecasting – also known as predictive policing – has been deployed in New York, Los Angeles, and Chicago, and video and image analysis is used by many departments.  While AI might make law enforcement easier, the legal profession needs to keep a careful eye on these tools to make sure they don’t compound the disparities that already exist in criminal justice and other areas of the legal system.

AI and Bias – Or, How AI Misses the Picture

Facial recognition and other types of AI may seem innocuous.  After all, every human has the same basic body and face, so the technology should, in theory, work equally well for everyone.  But when AI systems are used to classify people of different races, trouble often follows: study after study has found that facial recognition misidentifies people with darker skin at markedly higher rates than white subjects.
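That kind of disparity is measurable.  The sketch below – using entirely hypothetical, made-up data and invented column names, not figures from any real study – shows the basic audit researchers run on these systems: compute the error rate separately for each demographic group and compare.

```python
# Minimal sketch of a per-group error-rate audit (hypothetical data).
import pandas as pd

# Each row is one test image: the subject's demographic group, whether the
# photo truly matched the watchlist identity (actual), and what the system said.
results = pd.DataFrame({
    "group":     ["A", "A", "A", "B", "B", "B", "B", "A"],
    "actual":    [1, 0, 1, 1, 0, 0, 1, 0],
    "predicted": [1, 0, 1, 0, 1, 1, 1, 0],
})

# False-match rate per group: how often the system reports a match when the
# true answer is "no match."  A large gap between groups is the disparity
# described above -- the system's mistakes fall unevenly on different people.
false_matches = results[results["actual"] == 0]
fmr_by_group = (false_matches["predicted"] == 1).groupby(false_matches["group"]).mean()
print(fmr_by_group)
```

In this toy example, group B is falsely “matched” far more often than group A – precisely the pattern that makes AI-assisted policing so fraught.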
