Self-driving cars - too much or too little trust?
Great to have a whole session almost devoted to trust in governance, this time for automated vehicles!
Here is the link to the whole 3-day European conference on 20-22 April on Connected and Automated Driving. Ours was the most popular session supposedly, but they probably say that to everyone!
Here is the video of the session:
As well as me, the panel also featured:
Jean-François Sencerin, France Autonomous Vehicles Program Director | SAM Project Leader, PFA (download presentation)
Kristina Lindfors, Director General of the Urban Transport Administration, city of Gothenburg
Dimitris Milakis, Head of the research group Automated driving and new mobility concepts, Institute of Transport Research, German Aerospace Centre (DLR)
Below are rough answers to the questions, plus a pointer to the remainder of the session, which focused on responses to a Slido questionnaire on concerns about the tech and the involvement of citizens.
How can CCAM be deployed for social good?
Social good is an interesting entry point. For me, CCAM for social good is CCAM which takes its place in a mobility system which has community and sustainability as its key focus. It’s worth remembering that what we are looking at with CCAM is a perpetuation of perhaps four incumbent systems, none of which is focused in a way that prioritises the social good.
Car companies, who want to sell as many individual cars as possible. Of course, the social good aim is to prevent the huge numbers of deaths on the road, which is important and laudable. But is their business model of individual car ownership in fact at odds with the broader social good, both socially and environmentally?
Tech companies, looking for a new home for their AI whether it is the optimal tool for the job or not. We have self-driving cars because, in theory, AI enables them to happen. But we need to think not just about their safety and effectiveness (the one thing we know about IT is that quite often it seems to randomly stop working), but also about areas like embedded facial recognition technology and cybersecurity, particularly with the connectedness focus. Cyber-security folks are having heart attacks at the potential for bringing cities to a halt by hacking this infrastructure. There is a social good aspect of this too.
Even Horizon Europe is an incumbent system, where many feel that tech solutions get an outsize share of the money and attention in place of other systemic, behavioural or social solutions.
The 4th incumbent is us! Feeling we have the right to have our own car in our own drive and to get as quickly as possible to where we want, regardless of the health, community or sustainability implications of that. We see this where I live in South London at the moment, with what appear to be unfortunately ill-conceived, unsystematic restrictions on car usage, raising millions in fines but causing outrage among car-owning residents. The overwhelming view is ‘we want our cars back’.
It’s worth remembering that if you ask people why they want a self-driving car, they want one so that they can (a) look at their phones and (b) sleep. What is exciting is that we already have a technology which delivers that incredibly effectively and much more sustainably. It’s called a bus. Or a tram, or a train. Whether these are autonomous or not is perhaps a minor issue; the real issue is that we need solutions which enable our communities to bring people together, not keep them apart, and which are sustainable and safe as well as convenient for us all.
So if you take a systemic view, with community and sustainability as the focus, the place for CCAM as it is popularly imagined, the individual self-driving car, may not be quite as pervasive as many hope.
What is needed to support trustworthy CCAM systems and services?
What is needed is visible, effective governance. Regulation that works. This has been the focus of my last three years’ work: the idea that the governance, not just the technology itself, should aim to be trustworthy and strive to earn public trust.
At least 35%, and sometimes many more, of those surveyed say they would use a fully self-driving vehicle (without a driver or steering wheel) once one was available to them. That is showing trust in governance: ‘they wouldn’t allow them if they weren’t safe, so if it’s allowed I can trust it’.
Our Centre for Data Ethics & Innovation Covid-19 Repository & Public Attitudes 2020 Review explains that “trust in the rules and regulations governing technology is the single biggest predictor of whether someone believes that digital technology has a role to play in the COVID-19 response. This trust in governance was substantially more predictive than attitudinal variables such as people's level of concern about the pandemic, or belief that the technology would be effective; and demographic variables such as age and education.”
As we have seen with Covid, trust in the approvals process is as important as trust in the vaccines themselves. And CCAM will be just the same. Get your regulation in early. Make the intent of governance totally focused on the public good, not simply on smoothing the path of markets for the car and AI companies. Make it competent, able to deliver on what it says; there are lots of cross-regulatory issues which are not easily resolved. Involve citizens as part of the decision-making process and make it a much more open and fair process.
I did a seminar on trust in governance at the Governance of Emerging Tech Conference at Arizona State Uni in Phoenix, where governance has basically been suspended in favour of innovation. No-one seemed to care, even citizens; the uproar was mainly heard elsewhere. As someone said to me, not entirely tongue in cheek, ‘You have to remember, this is the Wild West, Hilary, after all.’ Luckily, this is not the Wild West, it’s Europe!
Our research shows that, fundamentally, citizens trust governance that works: where they can see that laws are sound and enforced and that companies (not individuals) are effectively punished for wrongdoing. If cars which don’t deliver on what they promise are allowed on the road, trust will be wrecked.
Regulators need to move, as PA Consulting said in their excellent report ‘Rethinking Regulation’, from being ‘Watchdogs of Industry’ to ‘Guardians of the Public’. Regulation should be firmly focused on the public good, regulators should be communicative, visible and accessible, and swift, serious penalties for companies who breach the rules should be clear for all to see.
What are the main challenges in ensuring public acceptance and trust in CCAM?
I have been having interesting conversations recently, in relation to all sorts of digital technologies, suggesting that far from there being a problem with public acceptability and trust, there is far too much trust! So, to rather contradict myself: the assumption with self-driving cars is always that we in Europe are all too nervous and will be very resistant, so manufacturers and regulators need to try really hard to earn trust in the tech and the governance. Much as I would like it to be like that, I wonder. I suspect many of us may be more like the American Tesla drivers than many people think. As with other areas of tech, we will swallow the hype, mentally gloss over any risks and buy them because basically they are cool and frankly, WE WANT ONE! Let's hope, if that is the case, that BMW, Toyota and other more responsible car makers, and of course regulators, will save us from ourselves and take the trustworthiness points seriously!
In the rest of the session we also discuss the role of citizens and the co-creation of governance and mobility more generally, in response to Slido questions from the audience.