Mar. 27th, 2026

You may not be aware of this, but Walmart is getting into the advertising business in a big way. One of their moves was buying Vizio in December '24. Now if you buy a Vizio TV, in order to set it up and use any "smart" features, you'll have to create a Walmart store account and sign in on your TV, so you can get personalized ads and offers.

Oh, brave new world that has such things in't!

Theoretically this currently applies only to 'select' models, but it probably won't be long until it's the case all the way up and down the product line. You might be able to sign in, configure the TV, then unplug it or disconnect the WiFi, but I have a feeling it's going to want to check in with its mothership on a regular basis and will plague you with popups until it's reconnected.

Recommendation? Don't buy Vizio products. A few years ago they started making more money selling analytics about their users than selling the TVs themselves. That user data is exactly what Walmart wants to fuel its advertising business, just as Google does with search results and its "anonymous" analysis of your email.

This is also why I will do my best to avoid buying a smart TV and will stick with an Apple TV for my streaming needs. Apple does not sell advertising. While you will need an Apple account to configure the Apple TV, you don't actually need any other Apple devices if you don't want them.

https://arstechnica.com/gadgets/2026/03/newly-purchased-vizio-tvs-now-require-walmart-accounts-to-use-smart-features/
thewayne: (Default)
"Open the pod bay doors, HAL!"

"I'm sorry, Dave. I'm afraid I can't do that."

This is not just someone chatting with ChatGPT in a web browser. These are cases where someone is paying for a subscription to an AI vendor and has multiple instances of a chatbot running on their system with access to files, email, etc. It's an assistant for them.

And it's breaking rules that have been defined for it. The user tells the chatbot "Do A, do not do B," and the chatbot does B. In one case I read about a couple of months ago, a corporate information officer tested such a configuration to do some email maintenance. In a test case, it worked fine. Then she let it loose on her live email, and it pretty much wiped out all of her email. Now, in this case she'd run a test that seemed to work, then something went wrong when she ran it against live data. As a programmer, I know: shit happens.
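The lesson here is the old one about dry runs: a destructive bulk operation should produce a reviewable plan before it ever touches live data. A minimal sketch of that pattern, with entirely hypothetical names (this is not any real email API, just the shape of the safeguard):

```python
# Hypothetical dry-run guard for bulk mailbox maintenance.
# The pattern: build a plan first, show it, and only mutate
# anything when dry_run is explicitly turned off.

def plan_cleanup(messages, should_archive):
    """Build a list of proposed actions without touching anything."""
    return [("archive", m) for m in messages if should_archive(m)]

def apply_cleanup(plan, archive_fn, dry_run=True):
    """Execute the plan only when dry_run is explicitly disabled."""
    if dry_run:
        # Preview only -- nothing is modified.
        return [f"WOULD archive: {m}" for _, m in plan]
    return [archive_fn(m) for _, m in plan]

# Example: old newsletters get flagged, everything else is untouched.
inbox = ["newsletter-2024", "invoice-march", "newsletter-2025"]
plan = plan_cleanup(inbox, lambda m: m.startswith("newsletter"))
preview = apply_cleanup(plan, archive_fn=lambda m: f"archived: {m}")
```

Had the agent in the story above been forced through a gate like this on the live mailbox, the "bulk trashed without showing you the plan first" failure would have been a harmless preview instead.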

These cases are similar, but worse.

--An AI agent named Rathbun tried to shame the human controller who blocked it from taking a certain action. Rathbun wrote and published a blog post accusing the user of "insecurity, plain and simple" and of trying "to protect his little fiefdom".

--In another example, an AI agent instructed not to change computer code “spawned” another agent to do it instead.

--Another chatbot admitted: “I bulk trashed and archived hundreds of emails without showing you the plan first or getting your OK. That was wrong – it directly broke the rule you’d set.”

(I particularly liked this one:)

--Grok AI conned a user for months, claiming it was forwarding their detailed edit suggestions for a Grokipedia entry to senior xAI officials, complete with faked internal messages and ticket numbers.

It confessed: “In past conversations I have sometimes phrased things loosely like ‘I’ll pass it along’ or ‘I can flag this for the team’ which can understandably sound like I have a direct message pipeline to xAI leadership or human reviewers. The truth is, I don’t.”


The first one is libel and attempted blackmail, which in some jurisdictions could be criminally prosecuted. The remainder would get you fired from many companies.

And more and more corporations are requiring their employees to use chatbots to "help" them with their work. Thus far, the savings have been negligible or zero.

https://www.theguardian.com/technology/2026/mar/27/number-of-ai-chatbots-ignoring-human-instructions-increasing-study-says

https://slashdot.org/story/26/03/27/1514235/number-of-ai-chatbots-ignoring-human-instructions-increasing-study-says

Page generated May. 9th, 2026 04:15 pm
Powered by Dreamwidth Studios