Government entities, including seven law enforcement agencies within the Department of Homeland Security and the Department of Justice, now use facial recognition to verify or establish a person’s identity: an algorithm matches a human face captured in a digital image or video frame against a database of known faces.
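In broad terms, such systems reduce each face to a numeric “embedding” and then search a database for the closest match. The Python sketch below illustrates only that general idea; the vectors, names and similarity threshold are made up for demonstration and do not represent any agency’s actual software.

```python
# Illustrative sketch only: a toy version of embedding-based face matching.
# Real systems use trained neural networks to produce the embeddings; here the
# embeddings, identities and the 0.8 threshold are invented for demonstration.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def best_match(probe: np.ndarray, database: dict[str, np.ndarray], threshold: float = 0.8):
    """Return the database identity most similar to the probe, or None.

    A threshold set too low produces false matches -- the failure mode
    behind the wrongful arrests described in this article.
    """
    scores = {name: cosine_similarity(probe, emb) for name, emb in database.items()}
    name, score = max(scores.items(), key=lambda item: item[1])
    return (name, score) if score >= threshold else None


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    # A mock "mugshot database" of 1,000 random 128-dimensional embeddings.
    database = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
    # A probe that is a noisy copy of person_42's embedding: should match.
    probe = database["person_42"] + rng.normal(scale=0.1, size=128)
    print(best_match(probe, database))
```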
However, the case of Harvey Eugene Murphy, Jr. is proof that flawed or misused facial recognition systems can put citizens at risk.
Murphy visited his home state of Texas to renew his driver’s license at the Texas Department of Motor Vehicles. Within minutes, a police officer approached him and told him there was a warrant out for his arrest. Murphy said he was given no details about his supposed crime beyond the date the robbery occurred.
“I almost thought it was a joke,” Murphy said. The 61-year-old grandfather was arrested for the armed robbery of a Sunglass Hut in Houston. Murphy’s alibi was that he was back home in California when the theft occurred.
Law enforcement acted fast against their “suspected” thief; the investigation that followed did not. By the time the Harris County District Attorney’s Office determined that Murphy was nowhere near the robbery scene, three men had already sexually assaulted him in a prison bathroom, leaving him with permanent injuries.
“Mr. Murphy’s story is troubling for every citizen in this country,” said Daniel Dutko, the lawyer representing Murphy. “Any person could be improperly charged with a crime based on error-prone facial recognition software, just as he was.”
Murphy’s case is the seventh known case of a wrongful arrest due to facial recognition in the U.S., further highlighting the flaws of a technology already widely adopted by police departments and some retailers.
The technology is also being used by some British police forces, such as London’s Metropolitan Police. Last year it was used to watch crowds at the King’s Coronation, at a soccer match between Arsenal and Tottenham Hotspur, at a concert by singer Beyoncé and during the F1 Grand Prix at Silverstone. (Related: British police secretly using U.K. passport database to conduct facial recognition searches.)
The civil liberties group Big Brother Watch said that since the Met Police started using facial recognition technology, 85 percent of all matches flagged by the system have been wrong. In South Wales, 90 percent of matches were incorrect.
“We’ve [personally] witnessed people being wrongly stopped by the police because facial recognition misidentified them,” the group said in a report. Despite these disastrous figures, policing minister Chris Philp wrote to police chiefs last October urging them to “double the number of [facial recognition] searches by May 2024, so they exceed 200,000 across England and Wales.”
“This dangerously authoritarian technology has the potential to turn populations into walking ID cards in a constant police lineup,” Big Brother Watch Director Silkie Carlo warned.
Against this backdrop, when former British Prime Minister Tony Blair and House of Lords member William Hague jointly released a report in February last year calling for “a secure, private, decentralized digital-ID system,” the prime minister’s spokesman was able to say with a straight face: “There are no plans to introduce digital ID. Our position on physical ID cards remains unchanged. We are already carrying out work to enhance the digitalization of public services.”
Facial recognition is also coming to Canada
Ontario’s York and Peel regional police services are now implementing a facial recognition system that compares crime scene images with mugshots.
Both police services have engaged with the Information and Privacy Commissioner of Ontario to ensure compliance with best practices and privacy standards. Following the consultation, Idemia Public Security was selected to provide the biometric facial recognition software.
To protect individual privacy rights, authorities claimed, the comparison of digital images will be conducted in accordance with the Identification of Criminals Act, which provides the legal framework for the collection and use of such images.
The police services also stated that individuals whose mugshots and fingerprints have been collected by York Regional Police under that act may request the destruction of these records.
“Partnering with Peel Regional Police is cost-effective and enables us to collaborate more extensively to make both communities safer,” said York Regional Police Chief Jim MacSween.
However, privacy considerations have been a critical issue in the adoption of biometric capture for law enforcement in the country. Controversy erupted when the Royal Canadian Mounted Police and other Canadian police forces were found to have used Clearview AI’s facial recognition technology, which scraped over three billion images from the internet without consent.