thewayne: (Default)
[personal profile] thewayne
Kelly Conlon is an attorney in New Jersey. She accompanied her daughter's Girl Scout troop to a Rockettes Christmas show, where the venue's facial recognition system identified her as working for a law firm involved in litigation against a restaurant owned by Madison Square Garden, which also owns the Rockettes. They kicked her out of the facility, even though Conlon herself is not involved in the litigation against the restaurant or MSG.

Security intercepted her in the lobby and told her she had to leave the venue.

Continuing in the article, "Instead of attending the festive show with her daughter, Conlon waited outside. NBC reported that others who have been blacklisted have sued MSG over the policy, viewing it as MSG’s way of punishing law firms that go after the titan of entertainment. One firm so far has fought and won in court, becoming the only exception to the policy, but MSG is still appealing that decision."

https://arstechnica.com/tech-policy/2022/12/facial-recognition-flags-girl-scout-mom-as-security-risk-at-rockettes-show/


In a more interesting story, Randal Reid, a BLACK man - you can guess where this is going - was pulled over in DeKalb County, GEORGIA, and arrested on an outstanding warrant for grand theft: over $10,000 worth of Louis Vuitton and Chanel purses bought with stolen credit cards in - and you will be ever so surprised - Louisiana. He had been identified by a facial recognition system.

Differences between Reid and the actual suspect: about 40 lbs in weight, a significant height difference, the suspect has flabby arms and Reid does not, and Reid has a mole on his face that the suspect lacks. No one bothered to take Reid's vital statistics. He was held in jail for a week before being released.

Reid has never been to Louisiana.

How many millions of dollars is this going to cost the police departments of Georgia and Louisiana, or more accurately, the taxpayers thereof, because of this idiocy?

https://arstechnica.com/tech-policy/2023/01/facial-recognition-error-led-to-wrongful-arrest-of-black-man-report-says/


The system used by Louisiana law enforcement, Clearview, harvested literally billions of photos from social media without permission and is notorious for terrible false-positive rates, especially when matching minorities, the young, and women.

But hey! It's a tool that law enforcement can use to arrest black people! Let's keep on using it to close cases!

Date: 2023-01-11 12:58 am (UTC)
disneydream06: (Disney Shocked)
From: [personal profile] disneydream06
OMg!!!!!!!!!!!!! And WTF!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

It's sad to say you expect that crap from Georgia and the police, but WTF with MSG. :o :o :o
Not that I think I have anything to worry about, but I would think long and hard about attending an event there. :o :o :o
Hugs, Jon

Date: 2023-01-11 04:11 am (UTC)
kathmandu: Close-up of pussywillow catkins. (Default)
From: [personal profile] kathmandu
"How many millions of dollars is this going to cost the police departments of Georgia and Louisiana, or more accurately, the tax payers thereof, because of this idiocy?"

Lots, I would hope, but "not a cent" is what I expect.

Well ...

Date: 2023-01-11 10:58 am (UTC)
ysabetwordsmith: Cartoon of me in Wordsmith persona (Default)
From: [personal profile] ysabetwordsmith
They will give up on facial recognition if and only if it becomes an embarrassing game of whack-a-mole that causes them to hemorrhage money. But even broke people can help with the "embarrassing" part by boosting the signal, and the more they do that, the more money it costs the police as they have to pay a PR firm or lawyers to do damage control.

Re: Well ...

Date: 2023-01-11 12:40 pm (UTC)
armiphlage: Ukraine (Default)
From: [personal profile] armiphlage
Hemorrhaging money seems like it would stop such lazy tools from being used, but (based on past evidence), if investors in the facial recognition companies are friends, family, or donors of key politicians, we'll still have to suffer.

I wonder, would it be possible to poison the data set used by the AI tools? If they are scraping Facebook images, what if people tagged photos of politicians with the names of people banned by MSG, or tagged photos of random people with randomly-selected incorrect names?
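
A toy sketch of the idea (purely illustrative, in Python; the numbers, the synthetic "embeddings", and the nearest-neighbour matcher are all my own invention, nothing to do with how Clearview actually works): randomly mis-tag a fraction of a scraped photo collection and watch a naive matcher's identification accuracy fall.

import numpy as np

rng = np.random.default_rng(0)
n_people, photos_each, dim = 50, 20, 64

# Pretend each person has a "true" face embedding; their scraped photos are noisy copies,
# each carrying the name it was tagged with.
true_faces = rng.normal(size=(n_people, dim))
photos = np.repeat(true_faces, photos_each, axis=0) + 0.3 * rng.normal(size=(n_people * photos_each, dim))
tags = np.repeat(np.arange(n_people), photos_each)

def accuracy(tags_used):
    # Probe with a fresh noisy photo of each person and report the name
    # attached to the nearest tagged photo in the gallery.
    probes = true_faces + 0.3 * rng.normal(size=true_faces.shape)
    hits = 0
    for person, probe in enumerate(probes):
        nearest = np.argmin(np.linalg.norm(photos - probe, axis=1))
        hits += int(tags_used[nearest] == person)
    return hits / n_people

for poison_frac in (0.0, 0.2, 0.5, 0.8):
    poisoned = tags.copy()
    flip = rng.random(len(tags)) < poison_frac
    poisoned[flip] = rng.integers(0, n_people, flip.sum())  # random wrong names
    print(f"{poison_frac:.0%} of tags wrong -> {accuracy(poisoned):.0%} of probes correctly identified")

On these made-up assumptions, the more tags are wrong, the more often the closest match carries the wrong name; whether that would meaningfully dent a real system trained on billions of images is a separate question.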

Clearview, you say?

Date: 2023-01-11 02:36 pm (UTC)
dewline: Text: Searching and Researching (researching)
From: [personal profile] dewline
From June 2021, filed by Catharine Tunney with CBC News Ottawa:

https://www.cbc.ca/news/politics/rcmp-clearview-ai-1.6060228

Date: 2023-01-11 05:16 pm (UTC)
kaishin108: girl sitting by magicrubbish dw (Default)
From: [personal profile] kaishin108
What a disgusting tool. It probably isn't even programmed right to recognize different people of color :(

Date: 2023-01-11 09:28 pm (UTC)
greghousesgf: (Ewww!)
From: [personal profile] greghousesgf
oh, god.

Re: Well ...

Date: 2023-01-12 12:15 pm (UTC)
ysabetwordsmith: Cartoon of me in Wordsmith persona (Default)
From: [personal profile] ysabetwordsmith
>>Hemorrhaging money seems like it would stop such lazy tools from being used, but (based on past evidence), if investors in the facial recognition companies are friends, family, or donors of key politicians, we'll still have to suffer.<<

Nobody's money is infinite. It depends whether the pressure exceeds affordability or not. People are happy to do horrible things as long as they can afford it.

But quite a lot of people care more about money than family or alliances. Useful to remember.

A crucial point is choice of victims. In one regard, AI brutalizes disadvantaged groups, like black people, more than advantaged groups. But not all victims are equally disadvantaged. Hit one rich black person and the company could get sued into oblivion. If this sort of thing happens a lot, the technology will be deemed unfeasible. And that's mostly down to luck.

>>I wonder, would it be possible to poison the data set used by the AI tools? <<

Yes. Data can be corrupted, individually or en masse.

>>If they are scraping Facebook images, what if people tagged photos of politicians with the names of people banned by MSG, or tagged photos of random people with randomly-selected incorrect names?<<

Go for it.

Of course, facial recognition has other vulnerabilities. One is pattern recognition -- change appearance to break up the face part, and the software can't even lock on. Another is reflection. Things like mirrored lenses or reflective fabric can foil face recognition as well as cameras.

Date: 2023-01-15 06:57 am (UTC)
silveradept: A kodama with a trombone. The trombone is playing music, even though it is held in a rest position (Default)
From: [personal profile] silveradept
These tools are accomplishing their true goals, which are not about law enforcement or even waving one's hand in the direction of something legal, but about surveillance and state-sponsored terrorism, currently state-sponsored stochastic terrorism, but I'm sure it will get a lot more targeted as we go along…
