Imagine you are driving on a suburban street when, all of a sudden, a ball bounces onto the road. You immediately slow down and proceed very cautiously, continuously checking for kids who might follow the ball onto the street.
In less than a second, your brain and body have performed quite a few tasks, which can be broken down into three simple steps:
1. Your eyes see the ball and send that signal to your brain.
2. Your brain takes this new information (‘ball on the street’) and concludes that a kid might follow.
3. Your brain sends electric signals to your right foot to step on the brake, to your hands to grip the steering wheel tighter, and to your internal systems to release a shot of adrenaline, preparing you for further action.
How would an autonomous vehicle behave in such a situation? At a high level, an autonomous car performs the same three steps, just somewhat differently.
A variety of sensors continuously monitor the vehicle’s surroundings. To be frank, in many regards autonomous vehicles are better at this step than humans: they do not blur their senses with drugs or alcohol, they have better night vision, and they are potentially less affected by reflections.
While humans have brains, autonomous cars have a central processing unit. Just like our brain, it integrates the information received from the various sensors and translates it into the action appropriate for the specific situation. Admittedly, compared to our individual (and especially our social) memory, the autonomous car still lacks experience to draw from, but that experience will grow over time. More concretely: for us humans it makes no difference whether the ball on the street is blue, green or white – a football, a tennis ball or a basketball. For an autonomous car, these are distinct situations, and learning to deal with this complexity will take time.
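The sense–process–act loop described above can be sketched in a few lines of code. This is a minimal illustration, not an actual autonomous-driving API; all class, function and label names here are assumptions made for the example.

```python
# Minimal sketch of the sense -> process -> act loop of an autonomous car.
# All names (Detection, decide, act, the "ball" label) are illustrative
# assumptions, not part of any real autonomous-driving framework.

from dataclasses import dataclass


@dataclass
class Detection:
    """One object reported by the perception (sensor) stage."""
    label: str        # e.g. "ball", "pedestrian"
    distance_m: float # distance ahead of the vehicle, in metres


def decide(detections):
    """Processing stage: map sensor input to an action."""
    for d in detections:
        # A ball on the road implies a child may follow: brake and stay alert.
        if d.label == "ball" and d.distance_m < 30.0:
            return "brake_and_monitor"
    return "continue"


def act(action):
    """Action stage: in a real vehicle this would command the actuators."""
    return f"executing: {action}"


# One pass through the loop for the ball scenario.
frame = [Detection("ball", 12.5)]
print(act(decide(frame)))  # executing: brake_and_monitor
```

The interesting part is the `decide` function: for a human, any ball triggers the same reflex, while a learned system first has to generalise across all the balls it has never seen.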
This is where it becomes very tricky! Not so much in the specific kid-and-ball example (braking is a very good option here), but in more general traffic situations. Autonomous cars are trained and programmed to stick to the traffic rules 100%, no exception. Which human driver does that? Nobody, 0%, no exception! To participate effectively in our complex car-based traffic system, we break small rules all the time. Take merging onto a busy road: by law, we are supposed to wait until we can enter freely, yet every human driver ‘creates’ their gap by slowly edging in until one kind person leaves some room. An autonomous car would simply sit there waiting.
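The merging problem boils down to gap acceptance: how small a gap between passing cars is a driver willing to take? The sketch below contrasts a strictly rule-abiding policy with a more assertive, human-like one. The thresholds and traffic values are illustrative assumptions, not measured data.

```python
# Hedged sketch of the merging example: a strict, rule-abiding policy waits
# for a fully safe gap, while an assertive human-like policy accepts a
# smaller one. All threshold and gap values are illustrative assumptions.

def merge_after(gaps_s, min_gap_s):
    """Return the index of the first accepted gap, or None if we never merge.

    gaps_s:    time gaps (seconds) between successive cars on the busy road.
    min_gap_s: the smallest gap this policy is willing to take.
    """
    for i, gap in enumerate(gaps_s):
        if gap >= min_gap_s:
            return i
    return None


# Dense traffic: no gap is ever "officially" large enough.
traffic = [2.1, 1.8, 2.4, 2.0, 2.6]

strict_av = merge_after(traffic, min_gap_s=5.0)  # rule-book threshold
human     = merge_after(traffic, min_gap_s=2.3)  # assertive, human-like

print(strict_av)  # None -> the autonomous car keeps waiting
print(human)      # 2    -> a human driver takes the 2.4 s gap
```

In dense traffic, the strict policy never merges at all, which is exactly the deadlock described above; lowering the threshold is precisely the kind of ‘appropriate rule-bending’ an autonomous car would have to learn.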
To conclude this little, and in parts simplified, example: in any situation with mixed traffic, meaning human drivers interacting with autonomous vehicles, our autonomous friends will have a hard time unless they learn to ‘appropriately break the law’ when necessary to keep traffic flowing. However, allowing that to happen, and thus making autonomous vehicles truly human, opens up an entirely new set of ethical discussions.
Most fundamentally, mixed traffic is still a key challenge; we are just starting to understand how big it is.
Getting to the core of this issue is an important task for autonomous vehicle developers, insurance providers, functional safety experts and the public transportation departments of larger cities.
Want to find out more? Get in touch and let’s discuss.