Say you’re driving down a two-way street and there’s a lorry unloading a delivery in the opposite lane. The oncoming traffic needs to pull out into your lane to overtake.
What do you do?
Many of us just drive on as we have right of way. But eventually one of us feels charitable and slows down to allow the oncoming car to overtake, giving permission with a quick flash of headlights or a beckoning wave.
But what if the car waiting patiently behind the parked lorry is a driverless or autonomous vehicle (AV)?
Will this robot car be able to understand what you mean when you flash your lights or frantically wave your hands?
Its sensors could decide that it’s only safe to overtake when there’s no oncoming traffic at all. On a busy road at school home time, this may be never, leading to increasingly exasperated passengers and increasingly angry drivers queuing behind.
And how will a robot car nudging out from a T-junction into oncoming traffic be able to make the necessary eye contact with a human driver?
These safety-first robot cars could become victims of their own politeness and end up being bullied and ignored by aggressive, impatient humans.
This, at any rate, is one of the conclusions to be drawn from research carried out by Dr Chris Tennant of the psychological and behavioural science department at the London School of Economics. His Europe-wide survey, commissioned by tyre-maker Goodyear, finds that nearly two-thirds of drivers think machines won’t have enough common sense to interact with human drivers.
And more than two-fifths think a robot car would remain stuck behind our hypothetical parked lorry for a long time.
Robot v. human: Driving isn’t just about technology and engineering, it’s about human interactions and psychology. “The road is a social space,” as Carlos Cipolitti, general director of the Goodyear Innovation Centre in Luxembourg, puts it.
And it is this social aspect that makes many people sceptical about driverless cars. “If you view the road as a social space, you will consciously negotiate your journey with other drivers. People who like that negotiation process appear to feel less comfortable engaging with AVs than with human drivers,” says Dr Tennant in his report.
Of course, humans are always sceptical about new technologies of which they have little experience. That scepticism usually diminishes with usage, however. And even many sceptics accept that emotionless AVs could cause fewer accidents than we humans, with our propensity to road rage, tiredness and lack of concentration.
A statistic often trotted out is that human error is responsible for more than 90% of accidents.
But 70% of the 12,000 people Dr Tennant and his team interviewed agreed that “as a point of principle, humans should be in control of their vehicles”.
And an even greater proportion – 80% – thought an autonomous vehicle should always have a steering wheel.
Learning to drive: AV pioneer Google – which aims to develop cars without steering wheels – reckons it can meet most of these real-world challenges. It has already filed patent applications for technology that it claims will be able to identify aggressive or reckless driving and respond to it, and to recognise and react to the flashing lights of police cars and other emergency vehicles.
In time then, it may well be able to programme its cars to recognise the different meanings of headlight flashes, and interpret the intentions of human drivers by their behaviour.
In the latest Google self-driving car project monthly report, head honcho Dmitri Dolgov says: “Over the last year, we’ve learned that being a good driver is more than just knowing how to safely navigate around people, [it’s also about] knowing how to interact with them.”
These interactions are “a delicate social dance”, he writes, claiming that Google cars can now “often mimic these social behaviours and communicate our intentions to other drivers, while reading many cues that tell us if we’re able to pass, cut in or merge.”
Google’s test cars have now racked up more than two million fully-autonomous miles of driving on public roads in California, Arizona, Texas and Washington, reporting a handful of minor accidents to the Californian authorities.
Interestingly, quite a few of these accidents have involved human-driven vehicles going into the back of the Google cars, suggesting perhaps that the ultra-cautious robots, with safety as their first priority, are more timid in their approach than we’re used to.
Mr Dolgov admits that the self-driving software is not yet ready for commercial release.