Public safety is vital for the functioning of societies. It is an important condition for health and wellbeing, personal development, creativity and prosperity. In today's digital world, sensing technology and Artificial Intelligence (AI) are increasingly implemented in public systems to enhance public safety. But as these technologies continue to advance into our systems, communities and lives, their speed of development surpasses our ability to control their consequences.
Recent examples that demonstrate the need for thorough consideration before implementing AI include privacy breaches by surveillance systems, data loss, remote warfare and digital espionage. To strike the needed balance between public safety, personal privacy and ethical concerns, The Hague in the Netherlands has provided a central stage for the subject of human-centred AI. In recent years, working groups and real-life experimentation sites have been established and conferences have been organised, inviting international leaders to discuss the central question: how can we make sure that AI is used only in the service of safe, just and inclusive societies?
“In essence, it was only natural that the Dutch government raised concerns about the ethical side of technology and feels the urgency to take that discussion to the international arena,” says Joris den Bruinen, chair of the working group Security, Peace & Justice of the Netherlands AI Coalition (NLAIC) and director of Security Delta (HSD).
In the global context, where the US leaves AI applications almost entirely to the commercial market while in China the government controls virtually all AI applications and uses them to concentrate and enforce power, Europe takes a more moderate approach. The Netherlands’ focus on human-centred AI, especially in the security and defence industries, fits well within this context.
To advance the development of human-centred AI, in early 2022 the Dutch government announced an investment of nearly €11 million in five ELSA Labs: projects focusing on applications of human-centred AI. ELSA is short for Ethical, Legal and Social Aspects, and the labs are based on a scientific concept that involves collaboration between government representatives, academic researchers and experts from the business community, as well as civil society organisations and citizens.
To keep social interests at the centre of their efforts, the concepts developed in the ELSA Labs are put to the test in public environments and involve citizen stakeholders. This pragmatic approach is also why the participation of the different parties is necessary: academic researchers make sure the most advanced knowledge, technology and processes are applied; the government can make public space and permits available; and companies can turn the concepts into commercially viable and scalable services.
Joris: “With the ELSA Labs we have a unique Dutch concept that is already gaining international traction. By applying our double-loop learning as well as security and privacy by design principles, committing to high ethical standards, and working with open and transparent procedures and communication, we make sure that we benefit both socially and economically from the right use of AI.”
Living Lab Scheveningen, which works according to the ELSA principles and is indirectly affiliated with the ELSA Labs, is an example of such a public environment. In this boulevard area, a digital crowd management tool was studied in 2022. The tool is a 3D dashboard displaying a digital map with current crowd congestion, and it also predicts where crowding tends to happen. This enables the municipality, the police, emergency services and event organisers to anticipate developments better and faster.
The 3D dashboard works by combining a number of anonymous data sources. For example, the system counts the number of people on the boulevard and draws on information from public transport, parking data, weather forecasts and traffic counts. Here too, ethical and legal concerns are weighed against the safety benefits of surveillance technology. On top of that, transparency policies are developed to ensure accountability and responsibility for technology applied in public spaces.
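To make the idea of fusing anonymous data sources concrete, the sketch below shows one simple way such a congestion score could be computed. Everything here is illustrative: the field names, weights, weather adjustment and capacity figure are assumptions for the example, not details of the actual Scheveningen system.

```python
from dataclasses import dataclass

@dataclass
class ZoneReading:
    """Anonymous, aggregate inputs for one boulevard zone (no personal data)."""
    people_count: int      # people counted in the zone
    transit_arrivals: int  # passengers recently arriving by public transport
    parked_cars: int       # occupied parking spots nearby
    rain_expected: bool    # from the weather forecast

def estimate_crowding(reading: ZoneReading, capacity: int) -> float:
    """Return a 0..1 congestion score for the zone.

    The weights are hypothetical: transit arrivals and parked cars are
    treated as people likely to enter the zone soon.
    """
    expected = (
        reading.people_count
        + 0.6 * reading.transit_arrivals
        + 1.5 * reading.parked_cars
    )
    if reading.rain_expected:
        expected *= 0.7  # assumed: fewer visitors in bad weather
    return min(expected / capacity, 1.0)

reading = ZoneReading(
    people_count=800, transit_arrivals=300, parked_cars=120, rain_expected=False
)
score = estimate_crowding(reading, capacity=2000)
# A score approaching 1.0 could then trigger an alert to the municipality,
# police and emergency services via the dashboard.
```

A real system would of course use calibrated models and historical data rather than fixed weights, but the principle is the same: several coarse, privacy-preserving signals are merged into one actionable indicator.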
The findings of this study are currently being evaluated and will result in a scalable tool and hands-on guidelines for use by all Dutch municipalities and emergency services. “We see the urgency of these tools in disasters such as, most recently, the one at the Halloween celebration in Seoul, South Korea. Unfortunately, these disasters still happen too often. We are therefore prioritising the development of this tool and guidelines, to help prevent such disasters while ensuring secure data storage, retention, sharing and application, as well as privacy.”