Problem Solving Initiative

With roughly one clip a month – most of them corporate fluff – Waymo’s YouTube channel is neither the most exciting nor the most informative. At least, those (like me) who keep looking for clues about Waymo’s progress should not expect much to come out of there.

That was until February 20th, when Waymo quietly published a 15-second clip of its car in action – the main screen showing a rendering of what the car “sees” and a corner thumbnail showing the dash-cam view. The key point: Waymo’s car apparently crosses an intersection with broken traffic lights, controlled by a police officer, without any hiccup. Amazing! Should we conclude that Level 5 is at our very doorstep?

The car and tech press was quick to spot this one, and reports were mostly praise. Yet Brad Templeton, in his piece for Forbes, points out a few things the clip does not say. First, Waymo operates in a geographically enclosed area, where the streets, sidewalks, and other hard infrastructure (lights, signs, and probably lane markings) are pre-mapped and already loaded into the system. In other words, Waymo’s car does not discover things as it cruises along the streets of Northern California. Moreover, the traffic lights here do not work, so technically this is just another four-way stop – albeit a busy one, with a police officer directing traffic in the middle. Finally, the car just goes straight, which is by far the easiest option (no left turn, for example).

Beyond that, what Waymo alleges and wants us to see is that the car “recognizes” the police officer – or at the very least, recognizes that something person-shaped is standing in the middle of the intersection making certain gestures at the car – and that the car’s sensors and Waymo’s algorithms are now capable of understanding the hand signals of law enforcement officers.

Now, less than a year ago, I heard the CEO of a major player in the industry assert that such a thing was impossible – in reference to CAVs being able to detect and correctly interpret the hand signals cyclists sometimes use. It seems that a few months later, we’re there. Or are we? One issue that flew more or less under the radar is how exactly the car recognizes the law enforcement officer here. Would a random passerby playing traffic cop have the same effect? If so, is that what we want?

As a member of the “Connected and Automated Vehicles: Preparing for a Mixed Fleet Future” Problem Solving Initiative class held at the University of Michigan Law School last semester, my team and I had the opportunity to think about just that – how to make sure that road interactions stay as close as possible to what they are today, and conversely, how to foreclose the awkward interactions or possible abuses that “new ways to communicate” would introduce. Should a simple hand motion be able to “command” a CAV? While such a question cuts across many domains, our perspective was mostly a legal one, and our conclusion was that any new signal that CAV technology enables (from the perspective of pedestrians and other road users) should be non-mandatory and limited to enabling mutual understanding of intentions, without affecting the behavior of the CAV. What we see in this video is the opposite: seemingly, the police officer is not equipped with special beacons broadcasting some form of “law enforcement” signal, and it is implied – although unconfirmed – that there is no human intervention. We are left awed, maybe. But reassured? Maybe not.

The takeaway may be just this: the issues raised by this video are real, and they are issues Waymo, and others, will at some point have to address publicly. Secrecy may be good for business, but only up to a point. Engagement by key industry players is of the utmost importance if we want to foster trust and avoid having CAV technology crash-land in our societies.

This fall, the University of Michigan Law School is offering its third Problem Solving Initiative (“PSI”) course concerning connected and automated vehicles. The first class, offered in the Winter 2017 semester, involved a team of fifteen graduate students from law, business, engineering, and public policy who accepted the challenge of coming up with commercial use cases for data generated by connected vehicles using dedicated short-range communication (“DSRC”) technology.

In the Fall of 2017, we offered our second PSI course on CAVs—this one to 23 graduate students. That course focused on the problem of Level 3 autonomy, as defined by the Society of Automotive Engineers (“SAE”). Level 3 autonomy, or conditional automation, means a vehicle drives itself within a specified operational design domain (“ODD”), with a human driver always on standby to take over the vehicle on short notice when the vehicle exits the ODD. As with the first course, our student teams spent the semester collecting information from industry, governmental, and academic experts and proposing a series of innovative solutions to various obstacles to the deployment of Level 3 systems.

This semester, our PSI course is entitled Connected and Automated Vehicles: Preparing for a Mixed Fleet Future. I will be co-teaching the course with Anuj Pradhan and Bryant Walker Smith. Our focus will be on the multiple potential problems created by unavoidable future interactions between automated vehicles and other road users, such as non-automated, human-driven vehicles, pedestrians, and bicyclists.

Although cars can be programmed to follow rules of the road, at its core, driving and roadway use are social activities. Roadway users rely heavily on social cues, expectations, and understandings to navigate shared transportation infrastructure. For example, although traffic circles are in principle governed by a simple rule of priority to vehicles already in the circle, their actual navigation tends to be governed by a complex set of social interactions involving perceptions of the intentions, speed, and aggressiveness of other vehicles. Similarly, while most states require bicyclists to obey stop signs and traffic lights, most cyclists do not; prudent drivers should not expect them to.

Can cars be programmed to behave “socially”? Should they be, or is the advent of robotic driving an opportunity to shift norms and expectations toward a greater degree of adherence to roadway rules? Will programming vehicles to be strictly rule-compliant make CAVs “roadway wimps,” always giving in to more aggressive roadway users? Would that kill the acceptance of CAVs from a business perspective? Is reform legislation required to permit CAVs to mimic human drivers?

More generally, is the advent of CAVs an opportunity to reshape the way that all roadway users access roadways? For example, could the introduction of automated vehicles be an opportunity to reduce urban speeds? Or to prohibit larger private vehicles from some streets (since people may no longer be dependent only on their individually owned car)? These questions are simply illustrative of the sorts of problems our class may choose to tackle. Working in interdisciplinary groups, our graduate students will attempt to identify and solve the key legal, regulatory, technological, business, and social problems created by the interaction between CAVs and other roadway users.

As always, our class will rely heavily on the expertise of folks from government, industry, and academia. We welcome any suggestions for topics we should consider or experts who might provide important insights as our students begin their discovery process next week.