Professor Gabriele Jacobs, Professor of Organisational Behaviour and Culture, is examining ways in which we can use ‘smart technology’, such as CCTV monitoring, wireless hotspots and facial recognition, for the benefit of public safety, while respecting human rights and public values.
Artificial Intelligence (AI) plays an important part in your work as a social scientist.
“True. AI can help us make better decisions when dealing with complex problems. Take climate change, for example. Extreme weather events not only cause inconvenience, but often great human suffering as well. AI allows us to combine and integrate large amounts of data from different sources. As a result, we may become aware of patterns in climate change that we would otherwise miss, and be able to predict and prevent dangerous situations. In the social field, for example, AI is used for crowd management in the Netherlands as a means of preventing accidents.”
Can you tell us something about your involvement in the AI-MAPS consortium?
“I am the chair of AI-MAPS, where I mostly look at public safety. In AI-MAPS (Artificial Intelligence for Multi-Agency Public Safety Issues) 26 organisations work together on AI and public safety. Partners are the Leiden-Delft-Erasmus Universities, but also TNO, the police and companies such as Deloitte and Nokia. By combining our knowledge and expertise we want to develop methods to help prevent societal unrest or handle it better.
Social unrest is often accompanied by a lack of trust. If you no longer trust the government or your neighbours, you’re more likely to withdraw from society or fight against it. The use of technology can reinforce that feeling. That’s why it’s important that every layer of society can be part of the discussion about how technology is used: public bodies, parties from the private sector, scholars and, above all, citizens. Citizens in particular should be involved in decisions about their living environment. In other words, what we should not do is invent, implement and use technologies and only then ask citizens if they like what we’ve done.
When it comes to technology in public spaces, citizens are often regarded as consumers. However, most of them are oblivious of all the technology used in public spaces: sensors, cameras trained on you, facial recognition, motion mapping… This may well be perfectly watertight in legal terms, but try looking at it from a social and moral perspective! As citizens, we can no longer shrug our shoulders and think there’s nothing we can do about it anyway. Public safety is something that affects us all and that we all share responsibility for.”
So what you’re saying is that new technology isn’t always welcome?
“Looking at public safety, I feel technology is sometimes deployed far too early. Some streets have sensors installed to detect shouting. Is that really a safety problem? And do we really need to put up cameras or other equipment in areas where young people congregate? Don’t they have an equal right to be on the streets? If the aim is to give a neighbourhood a pleasant living environment, young people have the same right to that environment as anyone else.
In too many cases, technology is seen as a silver bullet without the social impact being properly analysed. With AI-MAPS, we ask ourselves when we genuinely need AI and when it’s better to solve a problem based on purely human intelligence. Although Artificial Intelligence has a seat at the table, it can never be allowed to make the decisions.
Take the child allowance affair, where naive faith in computers caused major personal tragedies – just try winning back the trust of the victims. People with a higher level of education usually know their way around society. Those who fare less well in society often seem, or actually are, less engaged. Is that because of a lack of interest? Of course not! Many people have no idea what’s going on, or don’t have their questions and concerns taken seriously. And let’s be honest, this is obviously quite a complex issue.”
If you want to read the whole interview, please visit the website of Erasmus University Rotterdam.