Last week, Claire wrote about how Fourth Amendment precedents and facial recognition technologies could allow law enforcement to use AVs and other camera-equipped transportation technologies as a means of surveillance. In that post she mentioned the case of Robert Julian-Borchak Williams, who last year was arrested by the Detroit Police Department based on faulty facial recognition evidence. The same day Claire’s post went up, law students from Michigan Law’s Civil Rights Litigation Initiative, along with the Michigan ACLU, sued the City of Detroit in federal court for false arrest and imprisonment in violation of Mr. Williams’ rights under the U.S. Constitution and the Constitution of the State of Michigan.
Given the growing use of facial recognition technology by law enforcement (including in the pursuit of the January 6th insurrectionists), cases of misidentification and wrongful arrest like Mr. Williams’ will no doubt continue to occur. Indeed, there is longstanding concern about facial recognition systems misidentifying people of color – due in large part to their designers’ failure to use diverse datasets (i.e., diverse faces) when training the systems to recognize faces. And even before the digital era, camera technology itself had built-in biases, as film was long calibrated to better capture white skin tones. As cameras become more ubiquitous in our vehicles (including cameras monitoring the driver), issues of facial recognition will continue to collide with the emerging transportation technologies we regularly discuss here.
With all of that in mind, let’s turn to a recent case in Massachusetts that gives us a good example of how vehicle camera data can be used in a criminal investigation. On December 28, 2020, Martin Luther King, Jr. Presbyterian Church, a predominantly Black church in Springfield, MA, was destroyed by arson. Last week, the U.S. Department of Justice brought charges against a 44-year-old Maine man, Dushko Vulchev, for the destruction of the church. Just how was the FBI able to identify Mr. Vulchev as a suspect, you ask? Thanks to video footage from a Tesla vehicle parked near the church on the night of the fire. When Mr. Vulchev damaged (and later stole) the Tesla’s tires, the vehicle used its onboard cameras to record him in clean, clear footage (you can see the photos in this Gizmodo post on the case). Tesla vehicles are equipped with a number of cameras and a feature called “Sentry Mode,” which remains active even when the vehicle is parked and otherwise inactive. If the vehicle is damaged, or a “severe threat” is detected, the car alarm will activate and the vehicle’s owner will be able to download video of the incident beginning 10 minutes before the threat was detected. In this case, that video footage was instrumental in identifying Mr. Vulchev and placing him near the church on the night of the fire.
While the FBI didn’t use facial recognition software in this case (as far as we know), it still illustrates how the quantity and quality of vehicle-generated material will continue to be of interest in future investigations. How long before law enforcement proactively seeks video footage from any vehicle near a crime scene, even if that vehicle was otherwise uninvolved? If more OEMs adopt Tesla’s camera-based security features, could we face a future where every car on the block becomes a potential “witness”? Further, what happens when the data those cameras produce is fed into faulty facial recognition software like the system that misidentified Mr. Williams? We live in an era of ever-more recording, and our vehicles may soon be just another device watching our every move, whether we are aware of it or not.