I don't know why people talk to LLMs like this. They aren't people. If you want any instance of an LLM chat to 'do something', you manipulate it logically, like any other tool. If you can't do it with logic, then the problem isn't with the LLM but with the people programming it.
Stop talking to these things like they're people. If a tool doesn't work, you don't talk to the tool; you take it up with the fucking manufacturer. No, I'm not contradicting myself. This particular tool happens to be one you operate with language. But you're no more talking to it than you're talking to your web browser when you type in search terms. If the tool doesn't do what you want, psychologizing it isn't going to help.
"You never do what I wanna do!"
Meanwhile, you're handing the tool's owners free training data so they can make it even less functional, and better at convincing its users that the dysfunction is actually function.
Even our dissidents don't realize when they are buying into the controlling myths.
dulcima 1 point Jun 21, 2024 10:19:10 (+1/-0)
Crackinjokes 0 points Jun 21, 2024 13:29:53 (+0/-0)
Realclimatescience.com