“The temptation of an automated surveillance society should make our fellow citizens react”

MPs must examine, in early March, a bill relating to the 2024 Olympic and Paralympic Games which, among other measures, would authorize, on an experimental basis, the use of automated video surveillance to secure sporting, recreational, or cultural events.

Let us pass over the fact, already widely commented on, that the text will apply well before the Games – as soon as the law is adopted, notably during the Rugby World Cup – and well after them, since the experiment will end almost a year later, on June 30, 2025…

Let us also pass over the experimental framing so often favored by the government in security matters, which consists of displaying a posture of caution toward a new technology while providing for no independent evaluation… Once the device has been taken up by the public authorities, it will be difficult to go back. Beyond its mere continuation, we can bet that it will sooner or later be extended to all public space.

Finally, let us pass over the fact that the text leaves the scope of these automated surveillance systems vague, since it is a decree that will specify the types of events they will be called upon to detect. This doubt is reinforced by the impact study accompanying the bill, which suggests that, beyond the identification of abandoned objects and crowd movements, “abnormal events” and “situations presuming the commission of offenses” will also be targeted.

In doing so – by processing behavioral characteristics of individuals who are identifiable after the fact, in the event of an arrest, for example – these systems would, according to some lawyers, verge on processing biometric data, which the bill nevertheless claims not to do. As a reminder, European regulations provide a very strict supervision regime for the processing of such sensitive data.

Discriminatory biases

Let us instead dwell for a moment on one of the arguments put forward by the government to reassure legislators and the public: the software equipping the cameras will merely “flag” suspicious behavior to the agents in charge of viewing the footage. It would therefore be only a tool intended to assist those agents, who are currently unable to carry out such detection themselves, faced with the flood of images reaching the viewing centers.

We are thus sold the efficiency and neutrality of a computer tool, while human intervention at the end of the chain is emphasized: it is not the machine that will ultimately decide what attention and follow-up a report receives. The reality of the process, from the tool’s development to its use, is nevertheless enough to arouse legitimate concern.

