Critics of the surveillance plans say they infringe on privacy rights ©Getty Images

The French National Assembly has controversially approved the use of artificial intelligence (AI) video surveillance for security purposes at the Paris 2024 Olympics and Paralympics.

The provision, the most contentious in the Olympic and Paralympic Games Bill and the subject of concern from civil rights groups, allows AI software to be used on an experimental basis to analyse images captured by surveillance cameras at "sporting, recreational or cultural events".

The Government claims this can detect risky situations including abandoned luggage and unusual crowd movements at Paris 2024.

Sacha Houlié, chair of the French Parliament's Committee on Legal Affairs, claimed that the technology could have helped prevent incidents including the 2016 terrorist attack in Nice and the chaos at last year's UEFA Champions League final at the Stade de France, a Paris 2024 venue.

France's National Commission on Informatics and Liberty is backing the Bill on the condition that no biometric data is processed.

The article passed through the National Assembly by 59 votes to 14 in the 577-seat chamber, having already cleared a preliminary vote in the Senate. A joint committee of the two chambers is now expected to seek compromise on any differences in the text agreed during the debates.

It could still be challenged before the Constitutional Council, France's highest constitutional court.

Supporters of the AI provisions claim the technology could help to avoid chaotic scenes such as those at last year's UEFA Champions League final in Paris ©Getty Images

Under the law, this year's Rugby World Cup in France, running from September 8 to October 28, could serve as a rehearsal for the AI technology, with the experimental period continuing until the end of 2024.

It would make France the first country in the European Union to legalise AI-powered surveillance.

Groups including Amnesty International have argued the technology "sets a dangerous precedent for human rights", and fear it risks transforming Paris 2024 "into a massive assault on the right to privacy".

The EU is holding ongoing discussions over its own AI Act, governing anyone who provides a product or service using AI.

Critics fear the security measures could outlast the Olympic and Paralympic Games, pointing to precedents at London 2012 and the 2018 FIFA World Cup in Russia.

Security has become a key priority for Paris 2024 organisers given plans for 600,000 people to attend the Olympics Opening Ceremony along the River Seine, particularly after the widely criticised management of the Champions League final.

An additional €25 million (£22 million/$27 million) for security was approved in the Paris 2024 budget at the end of last year, with a further €10 million (£8.8 million/$10.8 million) allotted for cybersecurity.