Police use controversial AI tool that looks at people’s sex lives and beliefs

An internal police memo obtained by The i Paper and Liberty Investigates confirms an intention to apply the “Nectar” intelligence system “nationally”. The system is currently deployed as a pilot by the Bedfordshire force after being developed with Silicon Valley data analysis group Palantir Technologies.

The 34-page briefing, which deals with data protection issues related to Nectar and Bedfordshire Police, makes clear the ambition by senior officers for the system to be used across policing, including in the fight against serious organised crime by regional units.

The crime fighting tool is the latest manifestation of the Government’s push to harness artificial intelligence to improve the performance of cash-strapped public services – from the health service to defence – with the help of private sector players such as Palantir.

The system, which is already in use by the Bedfordshire force with a similar system understood to be under development by Leicestershire Police, could be used to target “persons suspected of having committed or being about to commit a criminal offence”.

The police memo states that Nectar will “require and be used to access” 11 different types of “special category information” held on an unspecified number of individuals. This information includes “race”, “political opinions”, “sex life”, “religion”, “philosophical beliefs”, “trade union membership” and “health”.

Palantir and Bedfordshire Police have insisted the AI software will only access existing information held by law enforcement agencies and that no police data will be routinely accessible to non-police staff.

“A single source of truth”

Sensitive data such as details of a person’s sexual activity or philosophical beliefs can be gathered by police forces for a number of reasons, including seeking intelligence on potential offences or offenders. Such data can be gleaned from so-called human sources, including informants or witnesses, as well as from sources ranging from online forums and dating website profiles to financial records. Much of this information can only be obtained with permission, such as a court warrant, and is then held securely on classified systems.

Senior officers believe that the AI software – billed as a “real-time data-sharing network” – will dramatically speed up the identification of suspects and criminal networks as well as offering an enhanced ability to crack down on domestic abusers and protect minors at risk of harm.

However, senior MPs and privacy campaigners have expressed concern over what they say are its far-reaching implications for the ability of police to sift vast amounts of data about individuals and the way in which such information is used and safeguarded.

One senior MP told The i Paper: “There is a real problem with technology being applied to policing without the necessary statutory underpinning and police simply appropriating the powers they want. There are lots and lots of reasons to be concerned by this [Nectar] software and it should be scrutinised by Parliament.”

Campaigners said the sheer breadth of information available to Nectar raised serious questions about the ability to safeguard privacy.

Who are Palantir Technologies?

If proof were needed of Palantir’s status as a key mover and shaker in Britain’s attempts to grapple with the AI revolution, it was to be found in the Prime Minister’s itinerary on his trip to Washington in February. 

After finishing his crucial first meeting with President Trump in the Oval Office, Keir Starmer – accompanied by his national security adviser Jonathan Powell and Lord Mandelson, the UK’s ambassador to the US – drove to the offices of the Silicon Valley data analysis giant to take tea with the company’s chief executive, Alex Karp. 

Starmer, who has made clear his enthusiasm for “rewiring” the British state by harnessing the benefits of artificial intelligence, was given a briefing on Palantir’s existing UK public sector contracts, including a £330m five-year deal to provide the NHS with a huge new data platform, alongside work with the Ministry of Defence and, increasingly, police forces. 

The meeting was evidence of the remarkable momentum of an unconventional company which has been no stranger to controversy since it began in 2003 as the brainchild of Peter Thiel, a co-founder of PayPal and supporter of Donald Trump in 2016, who saw potential in the emerging field of analysing large databases. 

The then start-up, named after the “seeing stone” in JRR Tolkien’s The Lord of the Rings, was backed to the tune of $2m by the venture capital arm of the CIA and has its roots in close collaboration with the US defence and intelligence sectors. One of its early projects was working with the US military analysing data on the placing of roadside bombs in Afghanistan to predict where subsequent attacks might occur.

Joe Lonsdale, another of the company’s founders, has said its software has allowed counter-terrorism experts and special forces “to neutralise thousands of adversaries (including infamous ones) and prevent dozens of attacks on the United States”. 

Since then, the company has rapidly expanded, gathering a snowballing roster of public and private sector clients for its AI-enhanced software tools, which have helped boost its value to some $320bn (£235bn), making it worth more than Disney or Coca-Cola. Its annual rate of revenue growth currently stands at nearly 40 per cent.

It is a journey which has brought criticism, including protests at Palantir’s deal with the US Immigration and Customs Enforcement (ICE) agency to develop software used to track and identify undocumented migrants – a key focus for the Trump administration as it seeks to dramatically ramp up deportations.

The company has also faced controversy over its work with police forces in the United States. Civil rights groups have raised concerns that its data tools have been used by some police departments for so-called “predictive policing” to flag individuals or neighbourhoods where offences are more likely to happen. Palantir has said its company policy is not to support or allow the use of its software for predictive policing or racial profiling.  

Nonetheless, the company has developed a reputation for being unapologetic about its work and world view. Thiel and his fellow executives have said they see the company’s purpose as helping to defend the West and Western values.  

Speaking earlier this year, Karp said one of the founding aims of Palantir was to create an “impactful company that could power the West to its obvious innate superiority”. 

Cahal Milmo

Labour MP Chi Onwurah, chair of the House of Commons technology select committee, said: “For the digital transformation of government to be successful, people must be able to have confidence in public sector technology. Improving the access and use of data can make public services more effective, but this must be accompanied by the appropriate safeguards and transparency.”

TUC assistant general secretary Kate Bell told The i Paper: “There is a long history of trade unionists being targeted simply for defending members’ interests. It is vital that any processing of trade union information by police forces and others is done in accordance with data protection law.”

However, senior officers also argue that they are in a technological race with criminals who have already proved themselves adept at harnessing digital tools, including AI, to commit offences at industrial scale, from online fraud to grooming and blackmail.


Bedfordshire Police said it considered the privacy of personal information to be “absolutely paramount” and that Nectar had been designed with “robust security measures”. The force said the new system was an “explorative exercise” and that its experience so far suggested it could result in faster response times and “more successful resolutions of cases”.

Palantir said that the Nectar system had identified dozens of additional children at risk of abuse within days of being put into operation and was also being used to enact Clare’s Law – the system giving women the right to know if their partner has an abusive past.

The company, which has built Nectar using its own data analysis software called Foundry, said it wanted to emphasise that its system did not provide any information not already held by police. The spokesperson added: “It simply organises that data in a way that enables faster, better decision making.”

A Home Office source said it would remain a matter for “operationally independent” forces to decide how to deploy AI systems. The National Police Chiefs’ Council did not respond to a request to comment.

Smartphone downloads and “association charts”: How Nectar works

The precise workings of Nectar remain under wraps for fear of giving criminals insights into police capabilities. But it is understood that the system creates a “dashboard” of data for officers working on a particular incident or case by simultaneously extracting information from multiple sources.

Depending on need, investigators are provided with a live summary of evidence such as location data or messages from a seized phone, number plates, 999 calls or intelligence files with the aim of identifying suspects, building a picture of criminal associates or alerting vulnerable individuals.

Tasks that would previously have taken days, such as crunching data from a mobile phone, take hours, and building “association charts” – a task previously associated with pieces of string linking together images on investigation room whiteboards – now happens 75 per cent quicker, it is claimed.

While sensitive personal data such as political beliefs or sexual orientation is available to the system, it is understood that a “use case” or justification must be provided for each search, and an audit trail tracking use is built into the system.

Those involved with the project insist it is already bringing tangible benefits in terms of solving or averting crimes.

