Facial Recognition Misidentifies Members Of Congress As Criminals
Date: July 28, 2018 | Author: Nwo Report
In a test performed by the ACLU, the software misidentified a number of members of Congress as convicted criminals, but Amazon said the group was using the system improperly.
by Austin Lewis
Facial recognition is one of the newest tools that police departments around the globe are adopting, and Amazon, as with so many other things, has built and begun selling software that offers just such capabilities. However, that software appears to be misidentifying a significant number of people.
According to the American Civil Liberties Union, the software matched several members of Congress to the wrong identities, confusing them with criminals. The test showed an error rate of around five percent, and that was with a relatively small dataset, far smaller than the databases many cities could be expected to use. The software, already in use by police officers and law enforcement agencies around the country, misidentified 28 legislators, a troubling result given how “flawed, biased, and dangerous” the software could be.
Representatives John Lewis and Bobby Rush, of Georgia and Illinois, respectively, are both Democratic politicians, former civil rights ‘leaders,’ and current members of the Congressional Black Caucus.
When the ACLU ran their pictures, along with those of every other member of Congress, through Amazon’s facial recognition software, it flagged 28 of the people making laws in America as convicted criminals.
The test, which was performed against a sample database of 25,000 publicly available mugshots, produced a whopping five percent false-match rate among legislators: 28 wrong matches out of the 535 members of Congress.
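For a rough sense of where that five percent figure comes from, the arithmetic is simply the false matches divided by the number of lawmakers scanned; a minimal sketch, assuming all 535 current members of the House and Senate were included, as the ACLU described:

```python
# Back-of-the-envelope check on the reported "five percent" figure.
members_scanned = 535   # 435 House members + 100 Senators (assumed full Congress)
false_matches = 28      # legislators wrongly matched to mugshots

error_rate = false_matches / members_scanned
print(f"False-match rate among legislators: {error_rate:.1%}")  # -> 5.2%
```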
Furthermore, according to the ACLU, the test disproportionately misidentified African-American and Latino members of Congress, matching them to people in the sample population of mugshots.
Jacob Snow, who serves as a technology and civil liberties lawyer with the ACLU, said that this suggests that the technology, at least as it currently exists, is “flawed, biased, and dangerous.”
This afternoon, three of the misidentified lawmakers (Senator Edward J. Markey and Representatives Mark DeSaulnier and Luis Gutierrez) sent a letter to Jeff Bezos, CEO of Amazon, saying they had serious questions about whether it was appropriate for his company to be selling the technology to law enforcement at this time, given its poor performance in this test.
The letter also asked how Amazon had tested the facial recognition technology for accuracy and bias, and requested a list of the government agencies currently using it.
They even asked for information about every law enforcement and intelligence agency Amazon had approached about purchasing the software.
Two other congressmen who had been wrongly matched with mugshots by the software, Representatives Lewis and Jimmy Gomez, wrote a letter, since obtained by BuzzFeed, requesting an immediate meeting with Bezos to discuss addressing “the defects” in the facial recognition software the company was selling.
A spokeswoman for Amazon Web Services, Nina Lindsey, released a statement, in which she said that Amazon’s customers had been using the software for such noble purposes as preventing human trafficking and reuniting missing children with their families.
She also said that the ACLU had used the software, called Amazon Rekognition, differently in its test than Amazon recommends for law enforcement customers performing facial identification.
According to Lindsey, law enforcement agencies do not typically use the software on its own to make ‘fully autonomous’ decisions about someone’s identity.
Rather, she said, the software is meant to help “narrow the field” so that officers at the various law enforcement agencies can review and consider the “options” using their own judgment, rather than relying on the program alone.
The Amazon spokeswoman also noted that when the ACLU ran its experiment, it used the system’s default setting for identifying matches.
That meant that the ‘confidence threshold’ for the program was set at 80 percent.
So, when the program scanned the mugshot photos for people who ‘matched’ the legislators, it counted any face with a ‘similarity score’ of 80 percent or higher as a match.
Amazon, as a business, uses that same percentage when matching an employee’s face with the ID on their work badges.
However, Amazon recommends that law enforcement agencies using the Amazon Rekognition software set a higher ‘confidence threshold’ of 95 percent to reduce the likelihood of accidental matches.
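To make the threshold mechanics concrete, here is a minimal sketch of how a caller might set that parameter when searching a face collection with the AWS boto3 SDK; the collection name, file name, and region below are illustrative assumptions, not details from the ACLU test:

```python
import boto3

# Hypothetical client and face collection; region and names are illustrative.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("probe_photo.jpg", "rb") as f:
    probe_bytes = f.read()

# FaceMatchThreshold defaults to 80 when omitted -- the setting the ACLU test used.
# Amazon's guidance for law enforcement is to require higher similarity, e.g. 95.
response = rekognition.search_faces_by_image(
    CollectionId="mugshots",       # previously indexed mugshot collection (assumed)
    Image={"Bytes": probe_bytes},
    FaceMatchThreshold=95,         # only return faces scoring 95% similarity or above
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], round(match["Similarity"], 1))
```

Raising the threshold trades recall for precision: fewer candidate ‘matches’ come back, which is why Amazon frames 95 percent as a way to cut down on accidental matches in law enforcement use.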
Facial recognition software has been a source of considerable consternation for civil liberties groups and technology experts in recent years.
Proponents say that it can be useful for identifying criminals, such as when police used it to identify the shooter at the Capital Gazette in Maryland.
However, some groups concerned with civil liberties believe that it could be abused and inhibit the ability of people to live their lives ‘anonymously.’
Just this month, Microsoft said that the technology was too risky, and called on Congress to regulate it and provide some amount of government oversight.
Since late May, Amazon has come under increasing pressure for selling its Rekognition software to law enforcement organizations. Two dozen civil rights groups, led by the ACLU, wrote a letter to Bezos demanding that the company stop selling the technology to police.
They warned that the technology could be used to track protesters, illegal aliens, or other members of the public.
According to Matt Wood, general manager of artificial intelligence at Amazon Web Services, there have been no reports of the technology being misused.
The Congressional Black Caucus also wrote a letter to Amazon suggesting that the company spend more time and effort on “properly calibrating” its recognition software to account for racial bias.
Whatever demands people place on Amazon, the reality is that technology moves ever forward, rarely backward. While the software may improve as the market and competition force it to, fear of its abuse will remain a concern.
Thanks to: https://nworeport.me