Facial recognition hits and misses
Jan. 10th, 2023 03:26 pm

Kelly Conlon is an attorney in New Jersey. She accompanied her daughter's Girl Scout troop to a Rockettes Christmas show, where the venue's facial recognition system identified her as working for a law firm involved in litigation against a restaurant owned by Madison Square Garden, the Rockettes' parent company, and she was kicked out of the facility - even though Conlon herself is not involved in any litigation against the restaurant or MSG.
Security intercepted her in the lobby and told her she had to leave the venue.
Continuing in the article, "Instead of attending the festive show with her daughter, Conlon waited outside. NBC reported that others who have been blacklisted have sued MSG over the policy, viewing it as MSG’s way of punishing law firms that go after the titan of entertainment. One firm so far has fought and won in court, becoming the only exception to the policy, but MSG is still appealing that decision."
https://arstechnica.com/tech-policy/2022/12/facial-recognition-flags-girl-scout-mom-as-security-risk-at-rockettes-show/
In a more interesting story, Randal Reid, a BLACK man - you can guess where this is going - in DeKalb County, GEORGIA, was pulled over and arrested on an outstanding warrant for grand theft of over $10,000 worth of Louis Vuitton and Chanel purses, bought with stolen credit cards in - and you will be ever so surprised - Louisiana, after being identified by a facial recognition system.
Differences between Reid and the suspect: about 40 lbs of weight, a significant height difference, the suspect has flabby arms and Reid does not, and Reid has a mole on his face that the suspect does not. No one bothered taking Reid's vital statistics. Reid was held in jail for a week before being released.
Reid has never been to Louisiana.
How many millions of dollars is this going to cost the police departments of Georgia and Louisiana, or more accurately, the taxpayers thereof, because of this idiocy?
https://arstechnica.com/tech-policy/2023/01/facial-recognition-error-led-to-wrongful-arrest-of-black-man-report-says/
The system used by Louisiana law enforcement, Clearview, harvested literally billions of photos off of social media without permission and is notorious for its terrible rate of false positives, especially when matching against minorities, the young, and women.
But hey! It's a tool that law enforcement can use to arrest black people! Let's keep on using it to close cases!
no subject
Date: 2023-01-11 12:58 am (UTC)

It's sad to say you expect that crap from Georgia and the police, but WTF with MSG. :o :o :o
Not that I think I have anything to worry about, but I would think long and hard about attending an event there. :o :o :o
Hugs, Jon
no subject
Date: 2023-01-11 04:11 am (UTC)

Lots, I would hope, but "not a cent" is what I expect.
Well ...
Date: 2023-01-11 10:58 am (UTC)

Re: Well ...
Date: 2023-01-11 12:40 pm (UTC)

I wonder, would it be possible to poison the data set used by the AI tools? If they are scraping Facebook images, what if people tagged photos of politicians with the names of people banned by MSG, or tagged photos of random people with randomly-selected incorrect names?
Clearview, you say?
Date: 2023-01-11 02:36 pm (UTC)

https://www.cbc.ca/news/politics/rcmp-clearview-ai-1.6060228
no subject
Date: 2023-01-11 05:16 pm (UTC)

no subject
Date: 2023-01-11 09:28 pm (UTC)

Re: Well ...
Date: 2023-01-11 09:45 pm (UTC)

It's not easy to poison a non-public, well-established data set. So many well-intentioned people, not knowing that they are the product, blithely put up photos on Facebook every day, tagging everybody in every photo. They don't realize that even if someone doesn't have a FB profile, FB creates a profile stub for them - and if they ever do create a profile and link with "Bob", and Bob had tagged them in photos, POOF, all those photos are linked in. FB is the surveillance state's best friend. Since MSG is selectively tagging people as hostile to MSG, there's no real way to get in short of a true hack attack against MSG or the vendor, and I guarantee Clearview has very good cybersecurity. Not impenetrable, but very good.
Re: Clearview, you say?
Date: 2023-01-11 10:17 pm (UTC)

Yep. No surprise there.
Re: Well ...
Date: 2023-01-12 12:15 pm (UTC)

Nobody's money is infinite. It depends whether the pressure exceeds affordability or not. People are happy to do horrible things as long as they can afford it.
But quite a lot of people care more about money than family or alliances. Useful to remember.
A crucial point is choice of victims. In one regard, AI brutalizes disadvantaged groups, like black people, more than advantaged groups. But not all victims are equally disadvantaged. Hit one rich black person and the company could get sued into oblivion. If this sort of thing happens often enough, the technology will be deemed unfeasible. And that's mostly down to luck.
>>I wonder, would it be possible to poison the data set used by the AI tools? <<
Yes. Data can be corrupted, individually or en masse.
>>If they are scraping Facebook images, what if people tagged photos of politicians with the names of people banned by MSG, or tagged photos of random people with randomly-selected incorrect names?<<
Go for it.
Of course, facial recognition has other vulnerabilities. One is pattern recognition -- change your appearance enough to break up the outline of the face, and the software can't even lock on. Another is reflection. Things like mirrored lenses or reflective fabric can foil face recognition as well as cameras.
no subject
Date: 2023-01-15 06:57 am (UTC)

no subject
Date: 2023-01-15 09:46 am (UTC)

That's the size of it.